WO2017164107A1 - Device and method for setting a parameter - Google Patents

Device and method for setting a parameter

Info

Publication number
WO2017164107A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
format
changing
display
parameter
Prior art date
Application number
PCT/JP2017/010880
Other languages
English (en)
Japanese (ja)
Inventor
鈴木 真人
Original Assignee
ヤマハ株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from JP2016062826A external-priority patent/JP2017174361A/ja
Priority claimed from JP2016062827A external-priority patent/JP2017174362A/ja
Application filed by ヤマハ株式会社
Publication of WO2017164107A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04R LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
    • H04R3/00 Circuits for transducers, loudspeakers or microphones

Definitions

  • The present invention relates to an apparatus and method for setting parameters in various electronic devices, such as an audio mixer, using virtual operation objects displayed on a display attached to the electronic device.
  • An audio mixer console has a plurality of channel strips, and each channel strip has a number of controls, including a fader for volume control and a knob for gain adjustment.
  • In a conventional mixer, basically one type of parameter is assigned to each operator as its control target. The operation interface therefore has a complicated configuration in which a large number of knobs, faders, and the like are arranged, and it is difficult for a user unfamiliar with the operation of the mixer to determine which parameter is assigned to which of the many operators.
  • Patent Document 1 describes adjusting a parameter value using a virtual operator image displayed on a touch panel as an example of an operation interface of a mixer.
  • When the virtual operator image displayed on the touch panel is used as an operation interface, it is more difficult to operate by feel than with a physical operator, so erroneous operations are likely to occur. Because such an operation interface adjusts a parameter value by touching the surface of a flat touch panel (a touch operation), it easily causes erroneous operations, such as unintentionally changing a value.
  • In order to prevent erroneous operation and improve operability, Patent Document 1 describes selecting the virtual operator image by a first touch operation on the image, and adjusting the value of the parameter corresponding to the selected image by performing another, second touch operation while the first touch operation is continued.
  • In the digital mixer disclosed in Patent Document 1 and the like, the parameters assigned to a physical operation element or a virtual operation element image can be changed. The number of physical operation elements or virtual operation element images can therefore be reduced, and the configuration of the operation interface can be simplified. It is desirable that the operation for changing the parameter assigned to a physical or virtual operation element be intuitive and easy to understand, even for a user unfamiliar with the operation of the mixer.
  • The present invention has been made in view of the above points, and an object thereof is to improve the user's operational feeling with respect to a displayed virtual operation object. Another object of the present invention is to control parameter values using virtual operation objects in an easy-to-understand manner.
  • The method according to the present invention includes: displaying a virtual operation object on a display; detecting user operations related to the object; changing the display format of the object to a first format in response to a first operation of the user; changing the display format of the object from the first format to a second format in response to a second operation of the user after the change to the first format; and changing the value of the parameter corresponding to the object in response to a third operation of the user after the change to the second format.
  • According to this method, the display format of the object is changed to the first format in response to the user's first operation related to the virtual operation object; the display format is then changed from the first format to the second format in response to the user's second operation; and the value of the parameter corresponding to the object is changed in response to the user's third operation after the change to the second format.
  • The method may further comprise determining the form of the third operation performed by the user after the change to the second format, and assigning to the object a parameter type corresponding to the determined form of the third operation. In that case, changing the value of the parameter corresponding to the object comprises changing the value of the parameter of the type assigned to the object.
  • With this configuration, compared with a configuration in which a virtual operation object is displayed individually for each parameter type, the configuration of the operation interface using virtual operation objects can be simplified.
  • According to another aspect, the method according to the present invention displays a virtual operation object, detects a user operation related to the object, determines the form of the operation performed on the object, and assigns to the object a parameter type corresponding to the determined form. Here too, compared with a configuration in which a virtual operation object is displayed for each parameter type, the configuration of the operation interface can be simplified, and the parameter type assigned to the virtual operation object can be changed by a simple and easy-to-understand method that distinguishes between different forms of operation.
  • The present invention can be implemented not only as a method invention but also as a setting device invention including components configured to realize the function of each step constituting the method. Furthermore, the present invention can also be implemented as a non-transitory computer-readable storage medium storing instructions executable by one or more processors to perform the above method.
  • By changing the display format from the first format to the second format, the "reaction" of the virtual operation object to the difference between the user's first and second operations is expressed visually. The user can therefore be given the feeling of operating a physical object that responds to the operation, and the operational feeling of parameter-value operations using virtual operation objects can be improved.
  • Since the parameter value can be adjusted only by the third operation performed after the change to the second format, the user is prompted to pay attention to and confirm which virtual operation object is being operated, and erroneous operation can be prevented.
  • FIGS. 8A to 8F are diagrams explaining a change in the display format of a virtual operation object according to another example.
  • A flowchart showing another example of the display control process for a virtual operation object.
  • FIG. 9 is a block diagram explaining a conceptual configuration example of the setting device according to the second aspect of the present invention.
  • (a) to (d) are diagrams explaining examples of forms of operation of the virtual operation object in the setting device according to the second aspect.
  • A flowchart showing a display control processing example for the virtual operation object in the setting device according to the second aspect.
  • A flowchart showing another example of the display control process for the virtual operation object in the setting device according to the second aspect.
  • (a) to (d) are diagrams explaining other examples of forms of operation of the virtual operation object in the setting device according to the second aspect.
  • FIG. 1 is a block diagram explaining an overall configuration example of a setting device according to an embodiment of the first aspect of the present invention.
  • The setting device 10 includes: a display unit (display) 11 that displays a virtual operation object; a detection unit 12 that detects user operations related to the object; a first display control unit 13 that changes the display format of the object to a first format in response to the user's first operation, based on detection by the detection unit 12; a second display control unit 14 that changes the display format of the object to a second format in response to the user's second operation after the change to the first format; and a value changing unit 15 that changes the value of the parameter corresponding to the object in response to the user's third operation after the change to the second format. A rough sketch of how these units cooperate is given below.
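  • As a rough, hypothetical illustration (not part of the publication), the cooperation of units 13, 14, and 15 can be modeled as a small state machine; all names and event shapes below are assumptions.

```python
from dataclasses import dataclass
from enum import Enum, auto

class Phase(Enum):
    IDLE = auto()      # no object selected
    SELECTED = auto()  # first format shown (object reacting to selection)
    GRASPED = auto()   # second format shown; parameter now adjustable

@dataclass
class TouchEvent:
    kind: str           # "first_operation", "second_operation",
                        # "third_operation", or "release"
    delta: float = 0.0  # operation amount of the third operation

class VirtualObjectController:
    """Hypothetical controller mirroring units 13, 14, and 15 of FIG. 1."""
    def __init__(self, value=0.0):
        self.phase = Phase.IDLE
        self.value = value  # the parameter corresponding to the object

    def on_event(self, ev):
        if self.phase is Phase.IDLE and ev.kind == "first_operation":
            self.phase = Phase.SELECTED   # unit 13: change to first format
        elif self.phase is Phase.SELECTED and ev.kind == "second_operation":
            self.phase = Phase.GRASPED    # unit 14: change to second format
        elif self.phase is Phase.GRASPED and ev.kind == "third_operation":
            self.value += ev.delta        # unit 15: change parameter value
        elif ev.kind == "release":
            self.phase = Phase.IDLE

# Example: select, grasp, then drag to raise the value.
ctl = VirtualObjectController()
for ev in [TouchEvent("first_operation"), TouchEvent("second_operation"),
           TouchEvent("third_operation", delta=0.1)]:
    ctl.on_event(ev)
print(ctl.phase, ctl.value)  # Phase.GRASPED 0.1
```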
  • In one embodiment, the setting device 10 is incorporated in an acoustic signal processing device 100, as shown in FIG. 2.
  • The setting device 10 may also be incorporated in, for example, a general-purpose personal computer or a tablet-type terminal device. In that case, a processor provided in the device (electronic device) incorporating the setting device 10 may be configured to execute a program realizing the functions of the setting device 10.
  • Alternatively, the setting device 10 may be composed of a dedicated hardware device (such as an integrated circuit) configured to execute its functions and operations, and such a dedicated hardware device may be incorporated in an electronic device such as the acoustic signal processing device 100.
  • FIG. 2 is a block diagram illustrating an example of an electrical hardware configuration of the acoustic signal processing device 100 in which the setting device 10 illustrated in FIG. 1 is incorporated.
  • The acoustic signal processing apparatus 100 is, for example, a mixing apparatus that performs mixing of the acoustic signals of a plurality of channels, volume level adjustment, effect addition, and the like. It may be configured as a dedicated hardware device, or as a computer device capable of executing an acoustic signal processing program, such as a general-purpose personal computer or a tablet terminal device.
  • The acoustic signal processing device 100 includes a CPU (central processing unit) 1, a memory 2, a touch panel display 3, an audio interface (I/F) 4, and a signal processing device 5, connected via a communication bus 6. The CPU 1 controls the operation of the acoustic signal processing apparatus 100 by executing various programs stored in the memory 2. The memory 2 includes a ROM (read-only memory), a RAM (random-access memory), a hard disk drive (HDD), a solid-state drive (SSD), and the like, and stores the various programs and various data, including the parameter values used for the signal processing of the signal processing device (DSP) 5.
  • The touch panel display 3 (hereinafter simply "touch panel") includes a display mechanism that performs various displays under the control of the CPU 1, and a detection mechanism (a touch detection surface on the display) that detects touch operations on the screen. The display mechanism is, for example, an LCD or an organic EL panel. The detection mechanism is configured to individually detect and recognize two or more simultaneous touches on the screen (multi-touch). The touch panel 3 thus realizes the functions of both the display unit (display) 11 and the detection unit 12 illustrated in FIG. 1.
  • The audio I/F 4 includes an AD converter, a DA converter, an audio input interface, and an audio output interface. The acoustic signal processing apparatus 100 receives an acoustic signal from an input device (not shown) via the audio I/F 4 and outputs the processed acoustic signal to an output device (not shown).
  • The signal processing device 5 is composed of, for example, a DSP (digital signal processor), or is realized virtually by software executed by the CPU 1 using the memory 2. By executing a signal processing program, the signal processing device 5 performs signal processing on the input acoustic signal, for example mixing, volume level adjustment, and equalizing. This signal processing is controlled based on the values of the various parameters stored in the memory 2; a minimal sketch of the idea is given below.
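  • The following sketch is an assumption for illustration only, not the publication's implementation: it shows how a parameter value held in memory might drive one part of the signal processing, here a volume level applied to a block of samples.

```python
def apply_volume(samples, level):
    """Scale one block of samples by a volume-level parameter (0.0 to 1.0)."""
    return [s * level for s in samples]

params = {"volume_level": 0.8}    # parameter values held in memory 2
block = [0.1, -0.2, 0.3]          # one block of an input acoustic signal
print(apply_volume(block, params["volume_level"]))  # ~[0.08, -0.16, 0.24]
```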
  • The acoustic signal processing apparatus 100 may include other components not shown, such as a network interface for connecting to a computer network.
  • FIGS. 3A to 3F show an example of virtual operation objects displayed on the touch panel 3, and how the display format of a virtual operation object is changed over time by the processing of FIG. 4 described later. In FIGS. 3(a) to 3(f), the screen 30 of the touch panel 3 displays operator images ("fader images") 31a, 31b, 31c, and 31d, simulating fader operators, as a plurality of virtual operation objects. Each virtual operation object 31 is configured so that an image 32 representing its knob portion can be moved along a predetermined slide direction (the vertical direction in FIG. 3). When a plurality of components such as virtual operation objects need not be distinguished from one another, only the number (such as "31") without an alphabetic suffix is used as the reference symbol.
  • FIG. 4 is a flowchart illustrating a processing example executed by the CPU 1 when an operation for selecting a virtual operation object 31 is performed.
  • The CPU 1 detects one or more finger contact points through the detection mechanism of the touch panel 3 (step S1) and selects one virtual operation object 31 based on the detected contact point(s) (step S2). Through steps S1 and S2, it is determined that a touch operation (first operation) for selecting one desired object 31 has been performed. In FIG. 3, reference numerals 33a, 33b, and 33c indicate the contact points of three fingers on the touch panel 3. The touch operation (first operation) for selecting a virtual operation object 31 consists of a grasping gesture: to select the desired knob portion 32 of one object 31 displayed on the touch panel 3, the user touches the vicinity of the displayed knob portion 32 with one or more fingers, as if to grip it. FIG. 3B shows how the object 31b is selected. The CPU 1 detects the contact points 33a, 33b, 33c existing within a predetermined distance of a certain knob portion 32b as an operation attempting to grip that knob portion, identifies the center position 34 of the contact points 33a, 33b, 33c, and selects the one object 31b whose knob portion is closest to that center position (step S2). In other words, the CPU 1 may be configured to detect a plurality of mutually close contact points as an operation to grip one knob portion 32 and to select the knob portion 32b closest to the center position 34 of those contact points. The plurality of contact points need not be detected simultaneously: in steps S1 and S2, if a plurality of contact points are detected within a predetermined time and all of them exist within the predetermined distance of a certain knob portion 32, it may be determined that the operation of grasping that knob portion 32 (the selection operation, that is, the first operation) has been performed. As another example, if only one contact point is detected within the predetermined time in step S1 and that contact point exists within the predetermined distance of a certain knob portion 32, it may likewise be determined that the grasping operation (first operation) has been performed. A sketch of this selection logic is given below.
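  • A minimal sketch of the selection logic of steps S1 and S2, with hypothetical names; the grab radius and the centroid rule are assumptions consistent with, but not mandated by, the description above.

```python
import math

GRAB_RADIUS = 60.0  # px: the "predetermined distance" from a knob (assumed)

def centroid(points):
    xs, ys = zip(*points)
    return (sum(xs) / len(xs), sum(ys) / len(ys))

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def select_knob(contact_points, knob_positions):
    """Return the index of the knob selected by a grasping gesture, or None."""
    c = centroid(contact_points)
    # Steps S1/S2: only knobs with every contact point inside the grab
    # radius count as "being gripped".
    candidates = [i for i, k in enumerate(knob_positions)
                  if all(dist(p, k) <= GRAB_RADIUS for p in contact_points)]
    if not candidates:
        return None
    # Of those, pick the knob closest to the centroid of the contact points.
    return min(candidates, key=lambda i: dist(knob_positions[i], c))

knobs = [(100, 300), (200, 300), (300, 300)]    # knob portions 32a, 32b, 32c
touches = [(195, 280), (210, 310), (190, 320)]  # contact points 33a, 33b, 33c
print(select_knob(touches, knobs))  # -> 1, the knob nearest the centroid
```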
  • In step S3, the CPU 1 changes the display format of the selected virtual operation object 31b to the first format. For example, the CPU 1 changes the display of the knob portion 32b of the object 31b from the still image shown before the selection operation to a moving image in which the knob portion 32b appears to vibrate. A known technique may be applied to display such a moving image. Note that the change to a moving image is not limited to replacing the still image with a special moving image; the moving image may also be produced by moving or displacing the same image used for the still image.
  • FIG. 3C shows the state in which the display format of the knob portion 32 has been changed to the first format in step S3 (the contact points 33a, 33b, and 33c are not shown). The image of the knob portion 32b vibrates vertically and/or horizontally with a constant or random amplitude and speed. The amplitude, speed, and direction of the vibration may be set as appropriate, as long as the user can visually recognize the vibration; one possible implementation is sketched below. In this way, the reaction of the object 31b to the gesture of gripping the knob portion 32b is expressed visually. At this point, the object 31b has not yet accepted an operation for moving the knob portion 32b in the vertical direction, that is, an operation for adjusting the parameter value. The CPU 1 executing step S3 realizes the function of the first display control unit 13 in FIG. 1.
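  • One possible way to render the vibrating first format, sketched under the assumption of a frame-based UI loop; the amplitude and the use of random jitter are illustrative choices, not taken from the publication.

```python
import random

AMPLITUDE = 3  # px; small enough to read as a tremble (assumed value)

def vibrate_offset(amplitude=AMPLITUDE):
    """Random per-frame offset applied to the knob's resting position."""
    return (random.randint(-amplitude, amplitude),
            random.randint(-amplitude, amplitude))

def frame_position(rest_x, rest_y, vibrating):
    """Where to draw the knob image this frame (first format = vibrating)."""
    dx, dy = vibrate_offset() if vibrating else (0, 0)
    return rest_x + dx, rest_y + dy

print(frame_position(200, 300, vibrating=True))  # e.g. (202, 298)
```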
  • The intervals between the detected contact points 33a, 33b, 33c gradually narrow as the user closes the fingers to grasp the knob portion 32. In step S4, the CPU 1 determines whether the user has performed a predetermined second operation on the object. In this embodiment, the second operation consists of narrowing the intervals between the fingers touching the touch detection surface, as if the selected object were being grasped by the user's fingers. Specifically, the CPU 1 detects changes in the positions of the contact points 33a, 33b, and 33c and determines whether the distances between them (in other words, between the fingers) have become smaller than a predetermined interval, thereby determining whether an operation (second operation) expressing that the user's fingers have grasped the knob portion 32 has been performed. The predetermined interval may be set to a width corresponding to the image width of the knob portion 32; step S4 then lets the CPU 1 determine the moment the user firmly grasps the knob portion 32, that is, the moment the fingers reach the edges of the image of the knob portion 32, as sketched below. While the finger interval is larger than the predetermined interval (No in step S4), the CPU 1 repeats step S4, and during that time the knob portion 32b continues to vibrate.
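  • A sketch of the step-S4 test; treating the maximum pairwise distance between contact points as the "finger interval", and tying the threshold to the knob image width, are assumptions suggested, but not mandated, by the text.

```python
import math

KNOB_WIDTH = 40.0  # px; "predetermined interval" tied to the knob image width

def max_pairwise_distance(points):
    return max(math.dist(a, b)
               for i, a in enumerate(points) for b in points[i + 1:])

def is_grasped(contact_points, threshold=KNOB_WIDTH):
    """Step S4: True once the fingers have closed to within the knob width."""
    return (len(contact_points) >= 2
            and max_pairwise_distance(contact_points) <= threshold)

print(is_grasped([(195, 280), (210, 310), (190, 320)]))  # False: still open
print(is_grasped([(198, 295), (205, 300), (200, 305)]))  # True: closed
```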
  • When the finger interval becomes smaller than the predetermined interval (Yes in step S4), the CPU 1 changes the display format of the knob portion 32b to the second format (step S5). In this embodiment, the second-format display performed in step S5 consists of stopping the vibration of the knob portion 32b, that is, returning it to a still image, and changing its display color. The still image of the knob portion 32b may be the same before and after the display format change, or may differ. The new display color is, for example, a color different from both the display color before selection and the display color set in step S3 (the display color at the time of selection).
  • FIG. 3D shows the state in which the mutual distances between the contact points 33a, 33b, and 33c have become smaller than the predetermined interval. In this state, the selected virtual operation object 31b accepts changes to the position of the knob portion 32b, that is, the parameter value can now be adjusted. The CPU 1 executing step S5 constitutes the second display control unit 14 in FIG. 1.
  • In step S6, the CPU 1 determines whether the user has performed a predetermined third operation on the object. The third operation consists of moving the fingers holding the knob portion 32b in a predetermined direction, thereby moving the operation position of the knob portion 32b: for example, the user moves the fingers (contact points 33a, 33b, 33c) performing the grasping gesture upward or downward along the slide direction (hereinafter, "movement of the touch operation"). The CPU 1 detects the movement of the contact points 33a, 33b, 33c; in this embodiment, step S6 detects the movement of the center position 34 of the contact points. The movement of the touch operation may be an operation moving all of the contact points 33a, 33b, and 33c, or any one or more of them. In this example, the CPU 1 uses only the vertical component of the movement of the touch operation.
  • In step S7, the CPU 1 updates the value of the corresponding parameter in the memory 2 according to the detected amount and direction of movement of the touch operation, and updates the display position of the knob portion 32b. The parameter is, for example, the volume level of an acoustic signal. When the user moves the grasping fingers upward, the knob portion 32b moves upward as shown in FIG. 3E, and the DSP 5 performs volume level adjustment based on the updated volume level value. The CPU 1 executing step S7 constitutes the value changing unit 15 in FIG. 1.
  • The CPU 1 loops through steps S5 to S7 until all fingers performing the grasping operation (that is, contact points 33a, 33b, and 33c) are released from the touch panel 3 (No in step S8); the knob portion 32b is moved and the corresponding parameter value is adjusted according to the movement of the touch operation. The user can continue changing the parameter value as long as at least one of the grasping fingers remains in contact with the touch panel 3. A sketch of this loop is given below.
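  • A sketch of the value update performed in the S5 to S7 loop, assuming the vertical movement of the contact-point centroid is mapped linearly onto a normalized volume level; the scale factor and the clamping are illustrative assumptions.

```python
PIXELS_PER_FULL_SCALE = 400.0  # assumed fader travel, in pixels

def update_value(value, prev_cy, new_cy):
    """Map upward centroid movement (decreasing y) to a rising value."""
    delta = (prev_cy - new_cy) / PIXELS_PER_FULL_SCALE
    return min(1.0, max(0.0, value + delta)), new_cy

volume, cy = 0.5, 300.0
for frame_cy in (290.0, 270.0, 250.0):  # centroid rising over three frames
    volume, cy = update_value(volume, cy, frame_cy)
print(round(volume, 3))  # 0.625
```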
  • When all the fingers are released (Yes in step S8), the CPU 1 ends the process of editing the parameter value of the virtual operation object 31b (step S9). At the end of editing, the CPU 1 returns the display color of the knob portion 32b from the second-format display color to the unselected display color, as shown in FIG. 3F.
  • As described above, the object is first changed to the first format (vibration) and then to the second format (vibration stopped), so the reaction of the object 31, namely vibration followed by its stopping, can be expressed visually. As a result, the user feels as if the virtual operation object 31 first responded to the grasping gesture and then, once firmly grasped, accepted the operation. Operating a virtual operation object 31 displayed on the touch panel 3 can therefore give the user the feeling of operating a physical object that responds to the user's own actions, and the operational feeling of the virtual operation object 31 is improved.
  • Moreover, since the knob portion 32 first vibrates in response to the gripping gesture at the time of selection and the parameter value becomes adjustable only after the vibration stops in response to the subsequent grasping, the user is prompted to pay attention to and confirm which object is being operated, and erroneous operation is prevented.
  • FIGS. 5A to 5F show display examples of virtual operation objects according to another embodiment. Here, the screen 50 of the touch panel 3 is a parametric equalizer operation screen, and the virtual operation object consists of a combination of an object representing the characteristic curve 51, which adjusts the frequency characteristic of the acoustic signal, and objects representing a plurality of control points 52a, 52b, 52c on that curve. The virtual operation objects directly operated by the user are the control points 52a, 52b, and 52c; the object of the characteristic curve 51 is operated indirectly, following the operation of the control points.
  • On the screen 50, the horizontal axis indicates the frequency band and the vertical axis indicates the gain value of the volume level. The center frequencies of three frequency bands are set according to the left-right positions (positions along the horizontal axis) of the individual control points 52a, 52b, and 52c on the characteristic curve 51, and the gain values of the corresponding frequency bands are set according to their vertical positions (positions along the vertical axis); a sketch of this mapping is given below.
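  • A sketch of this position-to-parameter mapping under common equalizer conventions (logarithmic frequency axis, linear dB gain axis); all ranges below are assumptions for illustration, not values taken from the publication.

```python
X_MIN, X_MAX = 0.0, 800.0     # px, horizontal extent of screen 50 (assumed)
F_MIN, F_MAX = 20.0, 20000.0  # Hz, audible band (assumed)
Y_MIN, Y_MAX = 0.0, 400.0     # px, top to bottom (assumed)
G_MAX = 15.0                  # +/- dB gain range (assumed)

def point_to_band(x, y):
    """Map a control point's position to (center frequency, gain)."""
    fx = (x - X_MIN) / (X_MAX - X_MIN)
    freq = F_MIN * (F_MAX / F_MIN) ** fx                        # log axis
    gain = G_MAX * (1.0 - 2.0 * (y - Y_MIN) / (Y_MAX - Y_MIN))  # linear dB
    return freq, gain

freq, gain = point_to_band(400.0, 100.0)
print(round(freq), gain)  # 632 7.5 (mid frequency, +7.5 dB)
```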
  • FIG. 6 is a flowchart illustrating an example of the display change process executed by the CPU 1 in response to an operation for selecting a control point.
  • Steps S10 to S14 are substantially the same as steps S1 to S5 of FIG. 4. That is, the CPU 1 detects one or more contact points 53a, 53b, 53c related to the grasping operation (first operation) (step S10), selects, for example as shown in FIG. 5B, the control point 52b closest to the center position 54 of the contact points 53a, 53b, 53c (step S11), and changes the display format of the selected control point 52b to the first format, that is, to a vibrating moving image with a changed display color (step S12).
  • In step S13, the CPU 1 determines whether the user has performed a predetermined second operation on the object, for example whether the intervals between the contact points 53a, 53b, and 53c have become smaller than the predetermined interval. If so, the CPU 1 changes the display format of the control point 52b to the second format, that is, stops the vibrating moving image of the control point 52b and changes its display color (step S14).
  • After the change to the second format, the control point 52 (object) can in principle move both vertically and horizontally according to the user's subsequent touch operation (that is, the third operation). In this embodiment, through the processing of steps S15 to S17, the CPU 1 determines the parameter type to be adjusted according to the initial direction of movement of the touch operation, and adjusts the value of that parameter. The initial movement is the first movement of the user's touch operation on the control point 52 (object) after step S14 (after the change to the second format); until this first movement is detected, the CPU 1 repeats steps S14 and S15 via the No branch of step S15.
  • In step S16, the CPU 1 determines the direction of the detected initial movement (for example, the initial movement direction of the center position 54 of the contact points 53a, 53b, and 53c). The initial movement direction is, for example, either vertical or horizontal, and the CPU 1 classifies the movement of the touch operation as one or the other. The operation amount detected from the movement (initial movement) of the touch operation consists, for example, of a movement direction and a movement amount (movement distance).
  • In step S17, the CPU 1 fixes the movement direction of the control point 52b to the determined initial movement direction and determines the parameter type to be controlled according to that direction. As described above, the left-right position of a control point 52 is associated with the center frequency and its vertical position with the gain value; accordingly, when the initial movement direction is horizontal, the parameter type to be controlled is determined to be the center frequency, and when it is vertical, the gain value.
  • Also in step S17, the CPU 1 updates the value of the determined parameter in the memory 2 according to the movement amount of the touch operation, and updates the display so that the control point 52b moves; the shape of the characteristic curve changes as the control point moves. For example, when the initial movement of the control point 52b is vertical, the user can move the control point 52b vertically (downward in the figure) to adjust the gain value of the frequency band corresponding to the control point 52b. Even if the user's touch operation later moves diagonally, the display and the parameter value are updated only according to the vertical component, the initial movement direction. That is, the parameter type to be adjusted is fixed by the initial movement direction, as sketched below.
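  • A sketch of the initial-direction lock of steps S15 to S17; the classification rule and all names are hypothetical.

```python
def classify_direction(dx, dy):
    return "horizontal" if abs(dx) > abs(dy) else "vertical"

PARAM_FOR_DIRECTION = {"horizontal": "center_frequency", "vertical": "gain"}

class ControlPointDrag:
    def __init__(self):
        self.locked = None  # direction fixed by the initial movement
        self.param = None

    def on_move(self, dx, dy):
        if self.locked is None:  # this is the initial movement (step S16)
            self.locked = classify_direction(dx, dy)
            self.param = PARAM_FOR_DIRECTION[self.locked]  # step S17
        # Only the component along the locked direction takes effect.
        amount = dx if self.locked == "horizontal" else dy
        return self.param, amount

drag = ControlPointDrag()
print(drag.on_move(2, -14))  # ('gain', -14): vertical wins, gain is locked
print(drag.on_move(9, -3))   # ('gain', -3): horizontal component ignored
```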
  • In response to fixing the movement direction of the control point 52 in step S17, the CPU 1 may display a guide 70 indicating the movement direction on the touch panel 3, as shown in FIG. 7. The fixed movement direction is thereby shown clearly to the user, who can grasp from it which type of parameter is being controlled and in which direction the control point can be moved.
  • The CPU 1 loops through steps S17 and S18, accepting adjustments of the parameter value corresponding to the control point 52b, until all fingers performing the grasping operation (contact points 53a, 53b, and 53c) are released from the touch panel 3 (No in step S18). When all the fingers are released (Yes in step S18), the CPU 1 ends the process of editing the parameter value of the selected virtual operation object (step S19). At the end of the editing process, the CPU 1 may return the display color of the control point 52b to the display color before selection, as shown in FIG. 5F.
  • The CPU 1 executing step S17 realizes a function corresponding to the value changing unit 15 in FIG. 1. In relation to this, in steps S15 to S17, the form of the user's touch operation following the second operation, that is, of the third operation, is determined, and a parameter type corresponding to the determined form is assigned to the object. The functions performed by the processing of the flowchart of FIG. 6 can therefore be represented by a functional block diagram similar to FIG. 1, to which a determination unit 113a and an allocation unit 114a are added. The determination unit 113a detects the operation performed on the object by the user after the second display control unit 14 has changed the object's display format to the second format, treats it as the third operation, and determines which of a plurality of forms predetermined for the third operation the operation performed this time takes. The function of the determination unit 113a corresponds to the processing of steps S15 and S16 in FIG. 6. The plurality of forms predetermined for the third operation are, for example, initial touch-operation movement directions of either up-down or left-right. The allocation unit 114a assigns to the object the parameter type corresponding to the determined form of the third operation, and the value changing unit 15 changes the value of the parameter type assigned to the object according to the operation amount of the third operation; these functions correspond to the processing of step S17 in FIG. 6.
  • In modified embodiments, the virtual operation objects 31 and 51 may have any shape. Further, the "first operation" and "second operation" may be detected based on the contact of a single finger. In that case, the knob portion 32 of a virtual operation object 31 or the control point 52 of a virtual operation object 51 close to the one detected contact point is selected, and the operation of touching the virtual operation object 31 with one finger corresponds to the "first operation".
  • The "second operation" determined in steps S4 and S13 may then be, for example, that the contact area (or pressure) of the finger changes by at least a predetermined threshold from the contact area (or pressure) of the first operation. The determination performed in step S4 or S13 may be a determination of whether the area of the one finger touching the virtual operation object 31 is smaller than a predetermined area, and the finger contact area may be estimated from the pressure of the finger against the touch panel 3. The "second operation" may also include moving the one finger touching the virtual operation object 31; in that case, steps S4 and S13 may be configured to branch to Yes when, for example, the finger moves in a direction approaching the virtual operation object 31.
  • The selection operation is not limited to an operation of selecting one of a plurality of virtual operation objects 31, 51. For example, when only one virtual operation object 31, 51 is displayed on the touch panel 3, starting a touch operation on that object is also a selection operation.
  • In the embodiments, the display format of the virtual operation objects 31 and 51 is changed to the first format in steps S3 and S12 in response to the selection operation (grasping gesture); however, this is not limiting, and the change may be performed whenever a predetermined user operation (first operation) related to the virtual operation objects 31 and 51 is detected.
  • Changing the display format of the virtual operation objects 31 and 51 to the first format in steps S3 and S12 may use other types of moving images, for example rotating the image, moving it randomly, moving it so as to avoid the touch operation (contact point), or blinking it. The change to the first format may also include, for example, changing the display color of the image of the knob portion 32, the control point 52, or the like, changing the display size, changing the image shape of the virtual operation objects 31 and 51, or changing to another display format, and these variations may be combined as appropriate.
  • Changing the display format of the virtual operation objects 31 and 51 to the second format in steps S5 and S14 is "stopping the vibration" in the embodiment, but the change is not limited to returning to the state before the first change. Changing to the second format may include, for example, changing the speed or amplitude of the vibration, changing to a moving image of a kind different from the first format (for example, changing vibration to blinking), changing the display color, changing the display size, changing the image shape of the virtual operation objects 31, 51, blinking the display, or changing to another display format.
  • Steps S5 and S14 may also be configured to change the display gradually from the first format to the second format in response to detection of the second operation after the change to the first format; for example, the vibration may slow gradually as the interval between the fingers narrows, finally stopping.
  • Steps S4 and S13 are not limited to determining whether the interval between fingers is smaller than a predetermined interval; they may branch to Yes (execute the change to the second format in steps S5 and S14) in response to any predetermined user operation (second operation) related to the virtual operation object 31, 51 after the change to the first format. Such an operation may include, for example, rotating one or more fingers touching the touch detection surface, or widening the interval between a plurality of fingers touching the touch detection surface.
  • Steps S4 and S13 may also be configured to branch to Yes (execute the change to the second format in steps S5 and S14) when the touch operation detected in steps S1 and S10 continues for a predetermined time or longer after the change to the first format; or to detect the pressure of the touch operation and branch to Yes according to that pressure; or to detect the movement speed of the touch operation and branch to Yes according to that speed; or to branch to Yes according to another physical quantity related to the touch operation, or to an arbitrary combination of these physical quantities.
  • In one embodiment, steps S8 and S18 branch to Yes (execute the editing end of steps S9 and S19) when one of the plurality of fingers performing the grasping operation leaves the touch panel 3. In another embodiment, steps S8 and S18 branch to Yes when another finger touches the touch panel 3 in addition to the fingers performing the grasping operation. In yet another embodiment, steps S8 and S18 may branch to Yes upon detecting a predetermined operation, for example rotating the grasping fingers (moving the contact points in an arbitrary rotational direction).
  • The device for inputting operations related to the virtual operation objects 31 and 51 may also be a motion capture device configured to detect the movement of the user's hand. Specific methods for detecting hand movement include irradiating the hand with infrared light or the like, and attaching a device capable of detecting acceleration to the body, including the hand. In such a configuration, the virtual operation objects 31 and 51 may be displayed, for example, on a head-mounted display.
  • Because the present invention can effectively give the user the feeling of operating a physical object that responds to the operation, it is well suited to value-changing operations in an augmented reality environment using, for example, a motion capture device and a head-mounted display.
  • The setting device 10 is not limited to adjusting parameter values related to acoustic signal processing; it may be used to adjust any value by operating a virtual operation object, for example the brightness of lighting.
  • According to the second aspect, the apparatus or method of the present invention displays a virtual operation object, detects a user operation related to the object, determines the form of the detected operation, and assigns to the virtual operation object a parameter type corresponding to the determined form.
  • FIG. 9 is a block diagram illustrating an example of the overall configuration of the setting apparatus 101 according to the second aspect.
  • The setting apparatus 101 includes a display unit 111 that displays a virtual operation object, a detection unit 112 that detects operations related to the object, a determination unit 113 that determines the form of the detected operation, and an allocation unit 114 that assigns to the virtual operation object a parameter type corresponding to the determined form.
  • As in the first aspect, the setting device 101 is incorporated in an acoustic signal processing device 100 having a hardware configuration as shown in FIG. 2.
  • The function of the determination unit 113 can be realized by the CPU 1 of the acoustic signal processing device 100 collating the operation detected by the detection unit 112 against operation forms stored in a storage medium, such as the volatile memory, HDD, or SSD included in the memory 2 of the device. The invention is not limited to this, however; a separate processor performing the collation operation of the determination unit 113 may be provided.
  • Likewise, the function of the allocation unit 114 may be realized by the CPU 1, or a separate processor performing the allocation operation may be provided. Note that the determination unit 113 and the allocation unit 114 execute substantially the same or similar functions as the determination unit 113a and the allocation unit 114a described above.
  • In this aspect as well, the touch panel 3 (FIG. 2) is configured to realize the functions of both the display unit (display) 111 and the detection unit 112 shown in FIG. 9.
  • The setting device 101 may also be incorporated in, for example, a general-purpose personal computer or a tablet-type terminal device; in that case, a processor provided in the device (electronic device) incorporating the setting device 101 may be configured to execute a program realizing its functions. Alternatively, the setting device 101 may be configured as a dedicated hardware device (such as an integrated circuit) capable of executing its functions and operations, and such a dedicated hardware device may be incorporated in an electronic device such as the acoustic signal processing device 100.
  • FIGS. 10A to 10D show an example of the virtual operation object displayed on the touch panel 3 (FIG. 2); the parameter assigned to the virtual operation object is changed by the processing of FIG. 11 described later. In FIGS. 10(a) to 10(d), an operator image for controlling a compressor is displayed on the screen 130 of the touch panel 3 as a virtual operation object 131.
  • The compressor is a kind of effect processing module, having the effect of compressing volume differences in the acoustic signal. The effect of the compressor is determined, for example, by a combination of three types of parameters: ratio, threshold level, and attack time.
  • The virtual operation object 131 consists of a single image representing a knob portion. The virtual operation object 131 is associated with the three parameter types ratio, threshold, and attack, and any one of the three is assigned to the virtual operation object 131 by the processing of FIG. 11. A graph 132 indicating the characteristics of the compressor is displayed to the left of the virtual operation object 131; the horizontal axis of the graph 132 indicates the ratio, and the vertical axis indicates the threshold level.
  • FIG. 11 is a flowchart showing an example of the processing executed by the CPU 1 (FIG. 2) when a touch operation is performed on the virtual operation object 131. Here, a touch operation is an operation bringing the user's finger into contact with the display location of the virtual operation object 131 (the knob image) on the touch panel 3 (FIG. 2), and moving the touching finger on the touch panel 3 (moving the contact point) is called "movement of the touch operation". Reference numerals 133a, 133b, and 133c indicate the contact points of the touch operation.
  • In step S21, the CPU 1 (FIG. 2) detects the contact point(s) of the touch operation. The CPU 1 may detect a contact point existing within a predetermined distance of the virtual operation object 131 as a touch operation related to the virtual operation object 131.
  • In step S22, the CPU 1 (FIG. 2) determines whether the detected contact points 133a, 133b, and 133c have made an initial movement. The initial movement is the first movement of the touch operation after the contact points 133a, 133b, and 133c are detected; that is, in step S22 the CPU 1 detects the movement of the touch operation. The physical quantities detected from the movement are, for example, a movement direction and a movement amount (movement distance). Until there is an initial movement (No in step S22), the CPU 1 loops on step S22. When there is an initial movement (Yes in step S22), the CPU 1 determines the form of the operation in step S23. Here, the form of the operation is the direction of the initial movement of the touch operation, which is, for example, one of three kinds: rotation, up-down, or left-right; the CPU 1 determines which of the three the movement corresponds to.
  • The processing of steps S22 and S23 by the CPU 1 corresponds to the operation of the determination unit 113. The plurality of operation forms (the three kinds of initial movement direction) are stored in a storage medium such as the volatile memory, HDD, or SSD included in the memory 2 (FIG. 2).
  • In steps S24 to S26, the CPU 1 (FIG. 2) changes the parameter type assigned to the virtual operation object 131 according to the determined operation form, that is, the initial movement direction of the touch operation: when the initial movement is rotation, a ratio is assigned to the virtual operation object 131 (step S24); when it is vertical, a threshold is assigned (step S25); and when it is horizontal, an attack time is assigned (step S26).
  • The CPU 1 may display an indication of the initial movement direction in each of steps S24 to S26. The processing of steps S24 to S26 by the CPU 1 corresponds to the operation of the allocation unit 114. The association between operation forms (the three kinds of initial movement direction) and parameter types is stored in a storage medium such as the volatile memory, HDD, or SSD included in the memory 2 (FIG. 2), and the user may be allowed to set this association manually; a sketch is given below.
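  • A minimal sketch of the direction-to-parameter assignment of steps S23 to S26, holding the association in an editable table as the text suggests; all names are hypothetical.

```python
# Association held in memory 2; the text notes it may be user-editable.
assignment = {
    "rotation": "ratio",         # step S24
    "vertical": "threshold",     # step S25
    "horizontal": "attack_time"  # step S26
}

def assign_parameter(initial_direction):
    """Return the parameter type to bind to virtual operation object 131."""
    return assignment[initial_direction]

print(assign_parameter("rotation"))  # ratio
print(assign_parameter("vertical"))  # threshold
```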
  • When the initial movement direction is rotation, a guide 134a indicating the rotation direction is displayed as shown in FIG. 10B, and the CPU 1 sets the ratio value according to the rotational movement amount of the touch operation and updates the display position of the virtual operation object 131 (step S27). The rotation of the touch operation may be centered on the approximate center position of the contact points 133a, 133b, and 133c, or on any one of them. For example, when the user rotates the touch operation to the right, as shown in FIG. 10B, the virtual operation object 131 (the circular knob image) rotates to the right and the ratio value increases; the rotational position of the virtual operation object 131 is indicated by a mark 135. The CPU 1 also updates the display of the graph 132 indicating the characteristics of the compressor according to the changed ratio value.
  • When the initial movement direction is up-down, a guide 134b indicating the vertical direction is displayed, and the CPU 1 changes the threshold value according to the movement amount of the touch operation and updates the display position of the virtual operation object 131 (step S27). For example, when the user moves the touch operation downward, as shown in FIG. 10C, the virtual operation object 131 moves downward and the threshold value decreases. The CPU 1 also updates the display of the graph 132 indicating the characteristics of the compressor according to the changed threshold value.
  • When the initial movement direction is left-right, a guide 134c indicating the horizontal direction is displayed as shown in FIG. 10D, and the CPU 1 changes the attack time according to the movement amount of the touch operation and updates the display position of the virtual operation object 131 (step S27). For example, when the user moves the touch operation to the right, as shown in FIG. 10D, the virtual operation object 131 moves to the right and the attack time value increases. The CPU 1 also updates the display of the graph 132 indicating the characteristics of the compressor according to the changed attack time.
  • The CPU 1 loops through steps S27 and S28, and each time a movement of the touch operation is detected, changes the value of the parameter assigned to the virtual operation object 131 in step S24, S25, or S26 according to the movement amount (step S27). The CPU 1 uses only the movement component in the relevant direction: for example, when the threshold is being adjusted and the touch operation moves toward the upper right, the threshold value is changed only according to the upward component of the movement. Steps S27 and S28 loop until all of the contact points 133a, 133b, and 133c are released from the touch panel 3, so the user can continue changing the parameter value as long as at least one of the touching fingers remains on the touch panel 3 (FIG. 2). When all fingers are released (Yes in step S28), the CPU 1 ends the process of editing the parameter value of the virtual operation object 131 (step S29). At any time after step S21, the CPU 1 may end the processing of FIG. 11 when all the fingers performing the touch operation are released from the touch panel 3.
  • In this way, one virtual operation object 131 can be associated with three parameter types (ratio, threshold, and attack time); that is, three types of parameters can be adjusted with one virtual operation object 131. The screen display can therefore be simplified compared with a configuration displaying a separate virtual operation object for each parameter type, providing an operation interface that is easy for the user to handle. Moreover, the parameter type assigned to the virtual operation object 131 is changed or determined simply by the form of the touch operation used to change the parameter value (the initial movement direction); since no separate operation is needed to change the parameter type, doing so is easy. Furthermore, because the parameter type assigned to the virtual operation object 131 is fixed according to the initial movement direction (the loop of steps S27 and S28), the user cannot erroneously operate other types of parameters.
  • FIG. 12 is a flowchart illustrating a processing example responsive to a touch operation according to another embodiment.
  • First, the CPU 1 detects the contact point(s) of one or more fingers performing a touch operation using the detection mechanism of the touch panel 3 (step S30). In step S31, the CPU 1 determines the "operation form" based on the number of detected contact points (fingers): for example, whether the number of fingers performing the touch operation is one, two, or three. When there are three or more contact points, the CPU 1 may regard the number of fingers as three.
  • In steps S32 to S34, the CPU 1 changes the parameter type assigned to the virtual operation object 131 according to the determined operation form, that is, the number of fingers (contact points) performing the touch operation: a threshold when there is one contact point (step S32), a ratio when there are two (step S33), and an attack time when there are three (step S34), as sketched below. A movement direction of the touch operation may be defined for each parameter type assigned to the virtual operation object 131, and the CPU 1 may display a guide indicating the movement direction corresponding to the parameter type.
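  • A sketch of the finger-count assignment of steps S31 to S34; the clamp to three fingers follows the text, while the names are hypothetical.

```python
BY_FINGER_COUNT = {1: "threshold", 2: "ratio", 3: "attack_time"}

def assign_by_fingers(contact_points):
    """Steps S31-S34: pick a parameter type from the number of contacts."""
    n = min(len(contact_points), 3)  # three or more fingers count as three
    return BY_FINGER_COUNT.get(n)    # None when there is no touch at all

print(assign_by_fingers([(10, 20)]))                        # threshold
print(assign_by_fingers([(1, 1), (2, 2), (3, 3), (4, 4)]))  # attack_time
```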
  • FIGS. 13A to 13D are display examples for the case where the parameter assigned to the virtual operation object 131 is changed according to the number of contact points; FIG. 13A shows the state in which no touch operation is performed. Components common to FIG. 10 are given the same reference symbols.
• when the touch operation is performed with one finger, a guide 134b indicating the vertical direction is displayed.
• the CPU 1 changes the threshold value according to the movement amount of the touch operation and updates the display position of the virtual operation object 131 (step S35). For example, when the user moves the touch operation downward, as shown in FIG. 13B, the virtual operation object 131 moves downward and the threshold value decreases.
  • the CPU 1 updates the display of the graph 132 indicating the characteristics of the compressor according to the changed threshold value.
• when the touch operation is performed with two fingers, a guide 134a indicating the rotation direction is displayed as shown in FIG. 13C.
• the CPU 1 changes the ratio value according to the rotation amount of the touch operation and updates the display position of the virtual operation object 131 (step S35). For example, when the user rotates the touch operation to the right, as shown in FIG. 13C, the virtual operation object 131 rotates to the right and the ratio value increases.
• the CPU 1 updates the display of the graph 132 indicating the characteristics of the compressor according to the changed ratio value.
• when the touch operation is performed with three fingers, a guide 134c indicating the left-right direction is displayed as shown in FIG. 13D.
• the CPU 1 changes the attack time value according to the movement amount of the touch operation and updates the display position of the virtual operation object 131 (step S35). For example, when the user moves the touch operation to the right, as shown in FIG. 13D, the virtual operation object 131 moves to the right and the attack time value increases. Further, the CPU 1 updates the display of the graph 132 indicating the characteristics of the compressor according to the changed attack time.
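Putting the three cases of step S35 together, here is a minimal sketch; the CompressorParams type, the scaling factors, and the screen-coordinate convention (y grows downward) are assumptions for illustration, not from the patent:

```python
# Hypothetical sketch of step S35: update the assigned parameter's value;
# display and graph updates are left to the caller.
from dataclasses import dataclass

@dataclass
class CompressorParams:
    threshold_db: float = -20.0
    ratio: float = 2.0
    attack_ms: float = 10.0

def update_value(p: CompressorParams, param: str,
                 dx: float, dy: float, rotation: float) -> None:
    if param == "threshold":
        p.threshold_db -= dy * 0.1              # downward drag (dy > 0) lowers the threshold (FIG. 13B)
    elif param == "ratio":
        p.ratio = max(1.0, p.ratio + rotation)  # rightward rotation raises the ratio (FIG. 13C)
    else:
        p.attack_ms = max(0.0, p.attack_ms + dx * 0.5)  # rightward drag lengthens attack (FIG. 13D)

params = CompressorParams()
update_value(params, "ratio", dx=0.0, dy=0.0, rotation=0.5)
print(params.ratio)  # -> 2.5
```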
• when the touch operation ends, the CPU 1 terminates the parameter editing process (step S37). With this embodiment as well, the screen display can be simplified and an operation interface that is easy for the user to handle can be provided, compared with a configuration in which a separate virtual operation object is displayed for each parameter type. Further, the parameter type assigned to the virtual operation object 131 can be changed or determined simply by varying the form of the touch operation (the number of fingers). Therefore, parameter value control using the virtual operation object can be performed simply and easily.
• FIG. 14 shows an example of an equalizer operation screen displayed on the screen 70 of the touch panel 3 as an example of a virtual operation object according to another embodiment, which is substantially the same as FIG. 7. As in FIG. 7, the virtual operation object includes a characteristic curve 71 for adjusting the frequency characteristic of the acoustic signal and a plurality of control points 72a, 72b, 72c.
• the CPU 1 determines the type of touch operation in steps S21 to S26 or steps S30 to S34, and according to the determined type, assigns any one of three parameter types (center frequency, gain, and Q) to the virtual operation object 71.
• as an example, the CPU 1 changes the parameter type assigned to the virtual operation object 71 according to the number of fingers in the touch operation: "center frequency" when one finger performs the touch operation, "Q" when there are two fingers, and "gain" when there are three fingers.
  • FIG. 14 shows an example in which the control point 72b is operated with three fingers. Further, the CPU 1 may display a guide 74 indicating the moving direction (vertical direction) of the touch operation for operating the gain value.
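The same finger-count dispatch carries over to the equalizer screen; a minimal sketch under an assumed name (assign_eq_parameter is illustrative):

```python
# Hypothetical sketch: one finger -> center frequency, two -> Q, three -> gain
# for the EQ band whose control point (72a, 72b, or 72c) is being touched.
def assign_eq_parameter(num_fingers: int) -> str:
    return {1: "center_frequency", 2: "q"}.get(min(num_fingers, 3), "gain")

print(assign_eq_parameter(3))  # -> "gain", as in the FIG. 14 example
```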
  • the virtual operation object 131 may have any shape.
• the operation amount detected from the movement of the touch operation is not limited to the movement direction and the movement distance; any physical quantity such as the movement speed, acceleration, duration, or pressure of the touch operation, or any combination of these physical quantities, may be used.
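For instance, several such quantities could be folded into a single operation amount; a minimal sketch with made-up weights (not specified by the patent):

```python
# Hypothetical sketch: combine distance, speed, and pressure into one
# operation amount, so a fast or firm drag changes the value more.
def operation_amount(distance: float, speed: float, pressure: float) -> float:
    return distance * (1.0 + 0.5 * speed) * (0.5 + pressure)

print(operation_amount(distance=10.0, speed=2.0, pressure=0.5))  # -> 20.0
```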
• steps S28 and S36 may be configured to further determine the presence or absence of a predetermined operation, such as slightly vibrating the touch operation (shaking the contact point slightly), and to return to step S23 or S31 when the predetermined operation is performed.
• in this case, the user can re-change (re-determine) the parameter assigned to the virtual operation object 131 without releasing all contact points from the touch panel 3 and starting a new touch operation. For example, the user can change the ratio value by rotating the touch operation as the initial movement, perform the predetermined operation, and then change the threshold value by moving the touch operation up and down.
• alternatively, steps S28 and S36 may be configured to determine whether the form of the touch operation has changed and to return to step S23 or S31 when it has. In this case as well, the user can re-change (re-determine) the parameter assigned to the virtual operation object 131 without releasing all contact points from the touch panel 3 and starting a new touch operation. For example, after changing the ratio value by rotating the touch operation as the initial movement, the user can change the threshold value by moving the touch operation in the vertical direction.
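A minimal sketch of this mid-gesture re-determination (hypothetical names; the finger-count mapping repeats the earlier sketch):

```python
# Hypothetical sketch: if the number of fingers changes while at least one
# stays on the panel, re-run the S31-S34 assignment without ending the gesture.
def assign_parameter_by_contacts(num_contacts: int) -> str:
    return {1: "threshold", 2: "ratio"}.get(min(num_contacts, 3), "attack_time")

def maybe_reassign(current_param: str, prev_fingers: int, fingers: int) -> str:
    if fingers > 0 and fingers != prev_fingers:
        return assign_parameter_by_contacts(fingers)
    return current_param

print(maybe_reassign("ratio", prev_fingers=2, fingers=1))  # -> "threshold"
```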
• steps S28 and S36 may instead be configured to branch to Yes when any one contact point leaves the touch panel 3, terminating the parameter editing process (steps S29 and S37).
• in steps S27 and S35, the value of the parameter assigned to the virtual operation object 131 may be changed according to the movement amount alone, regardless of the movement direction of the touch operation. For example, whereas the configuration described above changes the ratio value only in accordance with the amount of movement in the rotational direction, in this modification the ratio value is changed according to the amount of movement in any direction.
  • a plurality of virtual operation objects 131 may be displayed on the touch panel 3.
• in this case, one virtual operation object 131 is selected by the touch operation detected in steps S21 and S30.
• as an example, the CPU 1 selects the virtual operation object 131 closest to the center position of the one or more contact points.
• the form of the operation is then determined for the selected virtual operation object 131 (steps S23 and S31).
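A minimal sketch of the nearest-object selection (the types and names here are hypothetical):

```python
# Hypothetical sketch: pick the displayed object nearest the centroid of
# the contact points when several virtual operation objects are shown.
from typing import List, Tuple

Point = Tuple[float, float]

def centroid(points: List[Point]) -> Point:
    xs, ys = zip(*points)
    return sum(xs) / len(xs), sum(ys) / len(ys)

def select_object(objects: List[Tuple[str, Point]], contacts: List[Point]) -> str:
    cx, cy = centroid(contacts)
    return min(objects, key=lambda o: (o[1][0] - cx) ** 2 + (o[1][1] - cy) ** 2)[0]

objs = [("object_a", (0.0, 0.0)), ("object_b", (100.0, 100.0))]
print(select_object(objs, [(90.0, 95.0), (110.0, 99.0)]))  # -> "object_b"
```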
• steps S23 and S31 may be configured to determine the movement of the center of gravity of a plurality of contact points and to change the parameter type assigned to the virtual operation object 131 according to that movement.
• the center of gravity can be moved, for example, by moving only some of the plurality of contact points.
• for example, a threshold is assigned to the virtual operation object 131 when the center of gravity moves upward or downward from its initial position by a predetermined distance or more, an attack time is assigned when the center of gravity moves left or right from its initial position by a predetermined distance or more, and a ratio is assigned otherwise.
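A minimal sketch of this centroid-based determination (the 20-pixel threshold and the default-to-ratio behavior are assumptions):

```python
# Hypothetical sketch: change the parameter type from the displacement of the
# contact-point centroid relative to its initial position.
def classify_centroid_motion(x0: float, y0: float, x1: float, y1: float,
                             min_dist: float = 20.0) -> str:
    dx, dy = x1 - x0, y1 - y0
    if abs(dy) >= min_dist and abs(dy) >= abs(dx):
        return "threshold"    # centroid moved mainly up or down
    if abs(dx) >= min_dist:
        return "attack_time"  # centroid moved mainly left or right
    return "ratio"            # centroid stayed near its initial position

print(classify_centroid_motion(0, 0, 5, 30))  # -> "threshold"
```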
• the device that inputs operations on the virtual operation object 131 may instead be a motion capture device configured to detect movement of the user's hand. In this case, the determination of the operation form in steps S23 and S31 is based on the hand movement detected by the motion capture device.
• the virtual operation object 131 may also be displayed by, for example, a head-mounted display. The present invention makes it possible to control parameter values simply and easily using a virtual operation object, and is therefore well suited, for example, to value-changing operations in an augmented reality environment using a motion capture device and a head-mounted display.
• the setting device 101 is not limited to adjusting parameter values related to acoustic signal processing; it may perform any type of value adjustment, such as adjusting the brightness of illumination, as long as the value is adjusted by operating a virtual operation object.
• the functions executed by the determination unit 113a and the assignment unit 114a in FIG. 8, which shows a modification of the setting device 10 according to the first aspect, may be replaced by the functions executed by the determination unit 113 and the assignment unit 114 illustrated in FIG. 9.
  • the processes of steps S15, S16, and S17 in FIG. 6 may be replaced with the same processes as steps S31, S32, S33, and S34 shown in FIG.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • General Physics & Mathematics (AREA)
  • Acoustics & Sound (AREA)
  • Signal Processing (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Provided is a setting device 10 that comprises: a display device 11 for displaying virtual operation objects; a detection unit 12 for detecting an operation performed by a user on a virtual operation object; a first display control unit 13 for changing the display format of the object to a first format in response to a first operation by the user; a second display control unit 14 for changing the display format of the object to a second format in response to a second operation performed by the user after the change to the first format; and a value changing unit 15 for changing the value of a parameter corresponding to the virtual operation object in response to a third operation performed by the user after the change to the second format.
PCT/JP2017/010880 2016-03-25 2017-03-17 Device and method for setting a parameter WO2017164107A1 (fr)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2016-062826 2016-03-25
JP2016062826A JP2017174361A (ja) 2016-03-25 Setting device and method
JP2016062827A JP2017174362A (ja) 2016-03-25 Setting device and method
JP2016-062827 2016-03-25

Publications (1)

Publication Number Publication Date
WO2017164107A1 (fr)

Family

ID=59900366

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/010880 WO2017164107A1 (fr) 2016-03-25 2017-03-17 Device and method for setting a parameter

Country Status (1)

Country Link
WO (1) WO2017164107A1 (fr)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004070492A * 2002-08-02 2004-03-04 Display device equipped with a touch panel and information processing method
JP2005108211A * 2003-09-16 2005-04-21 Gesture recognition method and touch system incorporating the same
JP2007267135A * 2006-03-29 2007-10-11 Parameter editing device and signal processing device
JP2013011983A * 2011-06-28 2013-01-17 Electronic device, control method, and control program

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020003645A1 * 2018-06-26 2020-01-02 株式会社日立製作所 Simulation system and control method therefor
JP2020003565A * 2018-06-26 2020-01-09 株式会社日立製作所 Simulation system and control method therefor

Similar Documents

Publication Publication Date Title
EP2557492B1 (fr) Input and output method in a touch screen terminal and apparatus therefor
EP2682853B1 (fr) Mobile device and operation method control available for using the touch-and-drop function
JP5771936B2 (ja) Parameter adjustment device and audio mixing console
US20140223299A1 (en) Gesture-based user interface method and apparatus
US9857948B2 (en) Method, apparatus and computer-readable storage means for adjusting at least one parameter
EP2196890A2 (fr) Method for providing a graphical user interface and electronic device using the same
KR101930225B1 (ko) Method and apparatus for controlling the operation mode of a touch screen
EP2180400A2 (fr) Digital image processing apparatus, digital image processing method and program
US20150261432A1 (en) Display control apparatus and method
JP2015002840A (ja) Electronic game machine, electronic game processing method, and electronic game program
CN105824493A (zh) Mobile terminal control method and mobile terminal
WO2017164107A1 (fr) Device and method for setting a parameter
US20150143295A1 (en) Method, apparatus, and computer-readable recording medium for displaying and executing functions of portable device
JP5682285B2 (ja) Parameter setting program and electronic music apparatus
WO2014034549A1 (fr) Information processing device, information processing method, program, and information storage medium
JP2017174361A (ja) Setting device and method
JP5246974B2 (ja) Input device for electronic equipment, input operation processing method, and input control program
JP2017174362A (ja) Setting device and method
JP2007188289A (ja) Multitask processing terminal device
JP2017174363A (ja) Setting device and method
TWI403932B (zh) Method for operating a touch screen, method for defining a touch gesture on the touch screen, and electronic device thereof
CN109804342B (zh) Method for adjusting the display and operation of a graphical user interface
JP2011107911A (ja) Program, information processing device, and information processing system
JP2015002980A (ja) Electronic game machine, electronic game processing method, and electronic game program
JP2024511304A (ja) State-based action button

Legal Events

Date Code Title Description
NENP Non-entry into the national phase

Ref country code: DE

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17770149

Country of ref document: EP

Kind code of ref document: A1

122 Ep: pct application non-entry in european phase

Ref document number: 17770149

Country of ref document: EP

Kind code of ref document: A1