WO2016104266A1 - Electronic device and method for controlling electronic device - Google Patents
- Publication number
- WO2016104266A1 (PCT/JP2015/085153)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- unit
- electric field
- mode
- control
- gesture
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/044—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by capacitive means
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04R—LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
- H04R3/00—Circuits for transducers, loudspeakers or microphones
Definitions
- This technology relates to an electronic device and a control method of the electronic device, and relates to an electronic device having a user interface function by a gesture operation.
- Patent Document 1 describes an electronic device using user interaction and a control method thereof in order to realize a more intuitive operation.
- the purpose of the present technology is to enable the user to operate the electronic device with a gesture operation.
- the concept of this technology is an electronic apparatus including: a sensor unit that detects a user's gesture operation; and a control unit that performs mode switching control and function control in each mode based on the detection output of the sensor unit.
- the user's gesture motion is detected by the sensor unit.
- the gesture operations include a gesture operation on a plane that touches the operation area surface and a gesture operation in the space above that does not touch the operation area surface.
- the control unit performs mode switching control and function control in each mode based on the detection output of the sensor unit. For example, level control related to a predetermined function may be included as the function control in each mode, and the up/down direction of the gesture movement may be correlated with raising and lowering the level. By taking this correlation, intuitive operation of the electronic device by gesture operation becomes possible.
- the gesture operation may be a clockwise and counterclockwise rotation operation, or a flick operation in one direction and the other direction.
- the level control related to a predetermined function may be playback volume control, effect control on the playback sound, lighting light amount control, or song feed and return control.
- the gesture operation may include a tap operation
- the control unit may control the mode switching based on the detection output of the tap operation.
- mode switching is performed by a tap operation, so that it is possible to cope with an increase in the number of modes to be switched.
- the mode may include at least one of a sound effect mode, a lighting mode, and a sampler mode in addition to the traveling system mode.
- the mode switching and the function in each mode are controlled according to the gesture operation of the user. Therefore, the user can favorably operate the electronic device by the gesture operation.
- an operation button for turning on/off the control based on the detection output of the sensor unit in the control unit may be further provided.
- the sensor unit may include a sensor substrate, disposed immediately below the operation area surface on the upper surface of the housing, having an electric field generation unit and an electric field reception unit, and a signal processing unit that detects a gesture operation from the electric field reception signal obtained by the electric field reception unit of the sensor substrate. A light-emitting unit that indicates the operation area surface by light emission when the operation button is turned on may further be provided. By providing such a light-emitting unit, the user can recognize the operation area surface of the gesture operation and can perform the gesture operation appropriately.
- the sensor substrate has a circular shape
- the sensor substrate may have the electric field generating unit on its back surface and electric field receiving units at outer peripheral positions at the top, bottom, left, and right of its front surface, with through holes penetrating the centers of the receiving units; a light emitting element substrate having light emitting elements at positions corresponding to the through holes may be provided as the light emitting part.
- the sensor unit may include a sensor substrate, disposed immediately below the operation area surface on the upper surface of the housing, having an electric field generation unit and an electric field reception unit, and a signal processing unit that detects a gesture motion from the electric field reception signal obtained by the electric field reception unit; the operation area surface may be inclined at a predetermined angle in a predetermined direction from the horizontal plane.
- the sensor unit may include a sensor substrate having an electric field generation unit and an electric field reception unit, a signal processing unit that detects a gesture operation from the electric field reception signal obtained by the electric field reception unit of the sensor substrate, and a transmission unit that transmits the detection output of the signal processing unit as a remote control signal; the sensor unit may be detachably disposed on the upper surface of the housing.
- the user can favorably operate the electronic device by the gesture operation.
- the effects described in the present specification are merely examples and are not limited, and may have additional effects.
- FIG. 10 is a diagram for explaining a gesture command table (POP advertisement) leaned against the rear side of the upper surface of the housing 101, together with an image diagram of an example of the command table.
- FIG. 1 is a perspective view showing an appearance of a one-box type audio apparatus 10 as an embodiment.
- FIG. 2 is a top view of the audio device 10.
- FIG. 3 is a right side view of the audio device 10.
- This audio device 10 has a substantially rectangular parallelepiped casing 101.
- a speaker unit 102 such as a tweeter, a mid range, or a woofer is disposed on the front surface of the housing 101.
- a CD (Compact Disc) insertion slot 103 is provided on the upper front side of the housing 101.
- a light emitting unit 104 including a multi-color LED for floor lighting is disposed on the lower front side of the housing 101.
- various operation keys for the user to perform operations are disposed on the top of the housing 101.
- a circular gesture operation area surface 106 for a user to perform a gesture operation is provided at the center of the upper surface of the housing 101.
- a sensor substrate (not shown in FIG. 1) that constitutes a sensor unit for detecting a gesture operation is disposed below the gesture operation area surface 106.
- the gesture operation area surface 106 is inclined by a predetermined angle θ from a horizontal plane in a predetermined direction, in this embodiment, in the front-rear direction.
- hatched area 106a schematically shows a range in which the user can perform a gesture operation.
- the gesture operation includes a gesture operation on a plane that touches the gesture operation area surface 106 and a gesture operation on a space that does not touch the gesture operation area surface 106.
- a tap operation is used as a gesture operation on a plane
- a flick operation and a rotation operation are used as a gesture operation on a space.
- the user can operate in multiple modes with gesture operation.
- the user can perform a function operation in each mode by a gesture operation. Details of each mode will be described later.
- the plurality of modes are not limited to these four modes.
- at least one of a sound effect mode, a lighting mode, and a sampler mode may be included.
- mode switching is also performed by a gesture operation.
- a tap operation is used as the gesture operation for mode switching.
- mode switching is performed by a tap operation, so that it is possible to cope with an increase in the number of modes to be switched.
- a display unit 107 for displaying a current mode in which an operation can be performed by a gesture operation is provided at the rear of the gesture operation area surface 106 on the upper surface of the housing 101.
- the display unit 107 displays the current mode among a plurality of modes that the user can operate with the gesture operation.
- the display unit 107 lists the names of the selectable modes, and the mode name portion corresponding to the current mode is illuminated from below.
- a configuration in which a mode name corresponding to the current mode is displayed using a liquid crystal display element or the like is also conceivable.
- a display unit 108 is provided on the rear end side of the upper surface of the housing 101.
- the display unit 108 is configured using, for example, a liquid crystal display element. For example, when the user performs a function operation in each mode by a gesture operation, the display unit 108 displays what function operation is performed as necessary. For example, in the sound effect mode, whether the effect function is on, and what kind of effect is displayed when it is on, are displayed.
- a light emitting unit 109 including a multi-color LED for wall / ceiling lighting is disposed on the rear end side of the upper surface of the housing 101 .
- a dial operation unit 110 for selecting one light emitting pattern from a plurality of light emitting patterns is provided.
- FIG. 5 shows an example of light emission from the light emitting unit 104 for floor lighting and light emission from the light emitting unit 109 for wall / ceiling lighting.
- the light emission unit 109 emits light with a point light emission pattern, and light emission with a rod-like pattern similar to that of the light emission unit 104 is also possible.
- the dots and bars are shown in black and white, but in actuality, they are color patterns of various colors.
- FIG. 6A shows a top view of the sensor substrate 111 disposed below the gesture operation area surface 106.
- FIG. 6B is a cross-sectional view taken along line A-A′, and
- FIG. 6C is a bottom view of the sensor substrate 111.
- the sensor substrate 111 is formed in a circular shape. In this case, it is possible to efficiently correspond to the circular gesture operation area surface 106 described above, and it is possible to suppress the generation of useless portions and reduce costs.
- the sensor substrate 111 has an electric field generating unit 113 on the back surface of a circular substrate body 112 and an electric field receiving unit 114 on the front surface.
- the electric field generator 113 is composed of patterned electrodes, and an electric field (quasi-static near electric field) is generated by applying an AC voltage to the electrodes.
- the electric field receiver 114 is an electrode that receives the electric field generated by the electric field generator 113.
- four arc-shaped electric field receiving units 114 are provided at outer peripheral positions in the vertical and horizontal directions: the west electric field receiving unit 114w on the left, the south electric field receiving unit 114s at the bottom, the east electric field receiving unit 114e on the right, and the north electric field receiving unit 114n at the top.
- FIG. 6D schematically shows how the electric field 115 generated by the electric field generator 113 is received by the electric field receiver 114.
- FIG. 7 shows the configuration of the sensor unit.
- the sensor unit includes the above-described sensor substrate 111 and a DSP (digital signal processor) 116 that detects a gesture operation from an electric field reception signal obtained by each electric field reception unit 114 of the sensor substrate 111.
- the electric field is distorted in accordance with the user's gesture operation, and accordingly, the electric field reception signal obtained by each electric field receiving unit 114 changes.
- the DSP 116 analyzes the electric field reception signal obtained by each electric field reception unit 114 and detects what gesture operation has been performed. This detection output is sent to the CPU 117. By adopting a configuration in which the gesture operation is detected by the DSP 116 as described above, it is possible to avoid a large processing load on the CPU 117.
- FIG. 8A schematically shows the electric field distortion that occurs when the user taps the west area of the sensor substrate 111.
- the electric field generated there is blocked by the hand and distorted, and the DSP 116 detects a tap operation on the west area of the sensor substrate 111.
- FIG. 8B schematically shows electric field distortion when the user performs a flicking operation in the right direction. By this flicking operation, the electric field at the location where the user's hand passes is sequentially blocked by the hand and distorted, and the DSP 116 detects that the flicking operation is in the right direction.
- a tap operation is used as a gesture operation on a plane, and a flick operation and a rotation operation are used as a gesture operation on a space.
- as the tap operations, the tap operation on the west area shown in FIG. 9A and the tap operation on the east area shown in FIG. 9B are used.
- the flick operations used are the rightward flick shown in FIG. 10A, the leftward flick shown in FIG. 10B, the upward flick shown in FIG. 10C, and the downward flick shown in FIG. 10D.
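A flick's direction could, for instance, be inferred from the temporal order in which the four receivers see their field dip (a rightward flick passes the west receiver before the east one). This is an illustrative sketch under that assumption, not the DSP 116's documented algorithm:

```python
# Hypothetical sketch: infer flick direction from the time at which each
# receiver observed its deepest field dip as the hand passed over it.

def classify_flick(dip_times):
    """dip_times: dict mapping 'w'/'s'/'e'/'n' to the time (s) of the
    deepest dip on that channel. Returns 'right'/'left'/'up'/'down'."""
    dx = dip_times['e'] - dip_times['w']   # west before east => positive
    dy = dip_times['n'] - dip_times['s']   # south before north => positive
    if abs(dx) >= abs(dy):                 # dominant axis wins
        return 'right' if dx > 0 else 'left'
    return 'up' if dy > 0 else 'down'

# Hand sweeps left-to-right: west dips first, east dips last.
print(classify_flick({'w': 0.00, 'n': 0.05, 's': 0.05, 'e': 0.10}))  # -> right
```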
- the sensor substrate 111 has a through hole 118 that penetrates the center of the electric field receiving unit 114 in the vertical and horizontal directions.
- the through hole 118 has an arc shape similar to the shape of the electric field receiving unit 114.
- the through hole 118 is provided to allow light from the light emitting element disposed on the upper surface of the light emitting element substrate 119 below the sensor substrate 111, in this embodiment the LED 120, to pass upward.
- the light emitting element substrate 119 can be disposed below the sensor substrate 111. Accordingly, the sensor substrate 111 can be disposed closer to the gesture operation area surface 106, and the gesture operable range on the space extending in the vertical direction from the gesture operation area surface 106 can be extended by that amount.
- the LED 120 of the light emitting element substrate 119 emits light weakly, and the position of the gesture operation area surface 106 is indicated by light emission.
- the LED 120 of the light emitting element substrate 119 performs light emission corresponding to the content of the gesture operation, that is, reaction light emission.
- FIG. 12 shows reaction light emission in the tap operation.
- the illustrated example is a tap operation on the west area.
- the light emission of the LED 120 corresponding to the west electric field receiving unit 114w becomes strong, indicating to the user that the tap operation on the west area has been accepted.
- although the detailed description is omitted, the same applies to the tap operation on the east area.
- FIG. 13 shows reaction light emission in the flick operation.
- the illustrated example is a flicking operation in the right direction.
- the portion that emits light strongly moves along with the movement of the user's hand: when the hand is on the left side, the LED 120 corresponding to the west electric field receiving unit 114w emits light strongly, and when the hand moves to the center, the LEDs 120 corresponding to the north electric field receiving unit 114n and the south electric field receiving unit 114s emit light strongly.
- when the hand reaches the right side, the light emission of the LED 120 corresponding to the east electric field receiving unit 114e becomes strong. This indicates to the user that the rightward flick operation has been accepted.
- although the detailed description is omitted, the same applies to flick operations in the other directions.
- FIG. 14 shows reaction light emission in the rotation operation.
- the illustrated example is a clockwise rotation operation.
- the portion that emits light strongly moves along with the movement of the user's hand: when the hand is on the left side, the LED 120 corresponding to the west electric field receiving unit 114w emits strongly; when the hand moves to the top, the LED 120 corresponding to the north electric field receiving unit 114n emits strongly; when the hand is on the right side, the LED 120 corresponding to the east electric field receiving unit 114e emits strongly; and when the hand moves to the bottom, the LED 120 corresponding to the south electric field receiving unit 114s emits strongly.
- although the detailed description is omitted, the same applies to the counterclockwise rotation operation.
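The rotation direction could likewise be recovered from the cyclic order in which the receivers are activated (clockwise passes west, north, east, south in turn). A minimal sketch, assuming that ring ordering and hypothetical names:

```python
# Hypothetical sketch: classify rotation direction from the sequence of
# receiver activations. Each clockwise step advances +1 around the ring.

RING = ['w', 'n', 'e', 's']  # clockwise order around the sensor substrate

def classify_rotation(sequence):
    """sequence: receiver keys in activation order.
    Returns 'clockwise', 'counterclockwise', or None if ambiguous."""
    steps = 0
    for a, b in zip(sequence, sequence[1:]):
        d = (RING.index(b) - RING.index(a)) % 4
        if d == 1:
            steps += 1      # one step clockwise
        elif d == 3:
            steps -= 1      # one step counterclockwise
    if steps > 0:
        return 'clockwise'
    if steps < 0:
        return 'counterclockwise'
    return None

print(classify_rotation(['w', 'n', 'e', 's', 'w']))  # -> clockwise
```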
- the light emitting operation in the light emitting element substrate 119 is controlled by the CPU 117 as shown in FIG.
- the CPU 117 controls the reaction light emission operation based on the detection output of the gesture operation sent from the DSP 116.
- in addition to the reaction to the gesture operation by light emission described above, in this embodiment a reaction by sound generation is also performed.
- this reaction by sound is performed for gesture operations that do not change the output sound; for gesture operations that do change the output sound, the reaction is by light emission only.
- the reaction by sound generation is performed by the CPU 117 generating a beep sound signal from a beep sound generation unit and supplying it to a predetermined speaker unit 102.
- the CPU 117 reads out a beep sound source from a memory unit (not shown in FIG. 15) and generates a beep sound signal.
- FIG. 16 shows an example of gesture operation allocation for each mode.
- the traveling system mode is a mode that enables basic operations of the audio device 10 (song playback, stop, volume increase / decrease, etc.).
- the sound effect mode is a mode that enables switching of effects such as delay or echo applied to the music being played, and adjustment of the intensity of the effect.
- the lighting mode is a mode that makes it possible to control the light emission patterns and light amounts of the light emitting units 104 and 109.
- the sampler mode is a mode that enables sound synthesis of a predetermined sound source.
- the mode is switched by a tap operation.
- the transition to the sampler mode is performed by the west area tap operation (see FIG. 9A), and the transition to the sound effect mode by the east area tap operation (see FIG. 9B).
- the transition to the traveling system mode is performed by the west area tap operation, and the transition to the lighting mode is performed by the east area tap operation.
- the transition to the sound effect mode is performed by the west area tap operation
- the transition to the sampler mode is performed by the east area tap operation.
- the transition to the lighting mode is performed by the west area tap operation
- the transition to the traveling system mode is performed by the east area tap operation.
- the music return operation is performed by the flicking operation in the left direction (see FIG. 10B), and the music feeding operation is performed by the flicking operation in the right direction (see FIG. 10A).
- a playback operation is performed by an upward flick operation (see FIG. 10C), and a stop operation is performed by a downward flick operation (see FIG. 10D).
- the volume up operation is performed by the clockwise rotation operation (see FIG. 9C), and the volume down operation by the counterclockwise rotation operation (see FIG. 9D).
- effect 1 is set to the in-use state by the leftward flick operation, effect 2 by the rightward flick operation, effect 3 by the upward flick operation, and effect 4 by the downward flick operation.
- the effects are turned off by a tap operation.
- the effect intensity is increased by a clockwise rotation operation and decreased by a counterclockwise rotation operation.
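The per-mode gesture assignment of FIG. 16 can be pictured as a dispatch table keyed by the current mode; the mode and handler names below are illustrative labels, not identifiers from the patent:

```python
# Hypothetical sketch of a per-mode gesture dispatch table (FIG. 16 style).
# Flicks and rotations invoke the function bound in the current mode.

GESTURE_MAP = {
    'traveling': {
        'flick_left': 'song_return', 'flick_right': 'song_feed',
        'flick_up': 'play', 'flick_down': 'stop',
        'rotate_cw': 'volume_up', 'rotate_ccw': 'volume_down',
    },
    'sound_effect': {
        'flick_left': 'effect_1_on', 'flick_right': 'effect_2_on',
        'flick_up': 'effect_3_on', 'flick_down': 'effect_4_on',
        'rotate_cw': 'effect_up', 'rotate_ccw': 'effect_down',
    },
}

def dispatch(mode, gesture):
    """Look up the function assigned to a gesture in the current mode."""
    return GESTURE_MAP.get(mode, {}).get(gesture)

print(dispatch('traveling', 'rotate_cw'))  # -> volume_up
```

The same gesture thus maps to different functions depending on the mode, which is why mode switching by tap keeps the flick/rotation vocabulary small.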
- FIG. 17 shows an example of a sound effect.
- the isolator is an effect that performs sound processing such as strengthening or weakening the sound in a specific band such as high range / mid range / low range.
- Flanger is an effect that creates a swooshing sound by interfering the normally played sound with a slightly delayed copy of it.
- Wah is an effect that creates a "wah-wah"-like sound by applying a time-varying filter to the sound.
- Pan is an effect that distributes sound data into L / R components and performs sound processing to manipulate the sound image.
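As a minimal sketch of the pan idea, distributing a mono stream into L/R components, assuming a standard constant-power pan law (the patent does not specify which law is used):

```python
# Illustrative pan effect: split mono samples into L/R with constant-power
# gains so the perceived loudness stays steady as the image moves.
import math

def pan(samples, position):
    """samples: list of mono float samples.
    position: -1.0 = full left, 0.0 = centre, +1.0 = full right.
    Returns parallel (left, right) sample lists."""
    angle = (position + 1.0) * math.pi / 4.0           # 0 .. pi/2
    gain_l, gain_r = math.cos(angle), math.sin(angle)  # constant power
    return [s * gain_l for s in samples], [s * gain_r for s in samples]

# Centre position: both channels carry ~0.707 of the input level.
left, right = pan([1.0, 0.5], 0.0)
```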
- the light emission example shown in FIG. 5 shows lighting in all directions. By the leftward flick operation, for example, the lighting changes from all directions to only the right direction, then only the middle direction, then only the left direction, and finally returns to all directions. By the rightward flick operation, the lighting is moved from left to right. Instead of switching between lit and unlit, the direction of strong light emission, or the emission direction of a specific color, may be moved along with the flick operation.
- from the state in which both the light emitting unit 104 for floor lighting and the light emitting unit 109 for wall/ceiling lighting emit light, the flick operations switch the light emission, finally returning to light emission from both. For example, by the downward flick operation, only the downward (floor) lighting is operated. In this case, the light emission amount of the light emitting unit 109 or the light emitting unit 104 may be changed instead of switching between lit and unlit.
- the light amount is increased by the clockwise rotation operation and decreased by the counterclockwise rotation operation.
- sound 1 (sound source 1) is set to the synthesized state by the leftward flick operation, sound 2 (sound source 2) by the rightward flick operation, sound 3 (sound source 3) by the upward flick operation, and sound 4 (sound source 4) by the downward flick operation; sound 5 (sound source 5) is set to the synthesized state by the clockwise rotation operation, and sound 6 (sound source 6) by the counterclockwise rotation operation.
- each sound synthesized in the sampler mode is mixed into the song and output when a song is being played back; when no song is being played, the sound is output alone.
- Each sound used in the sampler mode may be one stored in a memory as a sound source storage unit, or may be generated by, for example, a voice synthesis function possessed by the CPU 117. Further, each sound held in the memory may be fixed, or the user may be able to store a desired sound.
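The sampler-mode behavior above can be sketched as a simple mixer: when a song is playing, the selected sound source is summed into the playback buffer, otherwise the sound is output alone. Buffer handling and clipping are simplified and hypothetical:

```python
# Hypothetical sketch of sampler-mode mixing. Samples are floats in
# [-1, 1]; the sum is clamped to that range as a stand-in for whatever
# headroom management the real audio processing unit performs.

def mix(playback, sound, playing=True):
    """playback, sound: equal-length lists of float samples.
    Returns the mixed output per the sampler-mode rule."""
    if not playing:
        return list(sound)              # song stopped: sound alone
    out = [p + s for p, s in zip(playback, sound)]
    return [max(-1.0, min(1.0, v)) for v in out]   # clamp to range

print(mix([0.25, -0.5], [0.25, 0.25]))  # -> [0.5, -0.25]
```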
- FIG. 18 shows a circuit configuration example of the audio device 10.
- the audio device 10 includes a CPU 117, a key input unit 132, a sensor substrate 111, a DSP 116, a light emitting element substrate 119, a memory unit 134, display units 107 and 108, light emitting units 104 and 109, an amplifier 133, and the speaker unit 102.
- the CPU 117 controls the operation of each part of the audio device 10.
- the CPU 117 includes a system control unit 121, a reproduction control unit 122, and an audio processing unit 123.
- a key input unit 132 is connected to the system control unit 121.
- the key input unit 132 includes keys for performing basic operations of the audio device 10 (song playback, stop, volume increase/decrease, etc.), and also includes the gesture function selection key 105 for selecting on/off of the gesture function described above.
- the system control unit 121 is connected to a sensor unit including a sensor substrate 111 and a DSP 116 that detects a user's gesture operation. That is, a gesture operation detection output is supplied from the DSP 116 to the system control unit 121.
- a light emitting element substrate 119 is connected to the system control unit 121.
- the light emitting operation on the light emitting element substrate 119 is controlled by the system control unit 121.
- the LED 120 of the light emitting element substrate 119 emits light weakly, and the position of the gesture operation area surface 106 is indicated by light emission (see FIG. 11).
- the LED 120 of the light emitting element substrate 119 is controlled to emit light according to the content of the gesture operation, that is, reaction light emission (see FIGS. 12 to 14).
- a memory unit 134 is connected to the system control unit 121. For example, each sound (each sound source) used in the sampler mode described above is held in the memory unit 134.
- the memory unit 134 holds a beep sound source.
- display units 107 and 108 are connected to the system control unit 121. Accordingly, the display of the display units 107 and 108 is controlled by the system control unit 121 according to, for example, a gesture operation or a key operation. Further, the light emitting units 104 and 109 are connected to the system control unit 121. Thereby, the light emission of the light emitting units 104 and 109 is controlled by the system control unit 121 according to, for example, a gesture operation or a key operation.
- the sound source unit 131 includes, for example, a CD playback unit, a USB terminal, a Bluetooth receiving unit, a tuner, a microphone terminal, and the like. “BLUETOOTH” is a registered trademark.
- the sound source unit 131 is connected to the reproduction control unit 122. Under the control of the reproduction control unit 122, a sound source selected by the user, for example, by a key operation is sent to the sound processing unit 123 as a reproduction sound source.
- song return or song feed is performed by the user's key operation or gesture operation.
- when the gesture function is turned on and the device is in the traveling system mode, song return and song feed are performed by the leftward and rightward flick operations, respectively (see FIG. 16).
- the audio processing unit 123 performs various processes on the reproduction sound source sent from the reproduction control unit 122. For example, volume up / down processing is performed by a user's key operation or gesture operation. Also, for example, effect processing corresponding to the gesture operation is performed on the reproduction sound source by a gesture operation (sound effect mode).
- a process of synthesizing a sound source (sound) corresponding to the gesture operation is performed on the reproduction sound source by a gesture operation (sampler mode).
- the system control unit 121 reads out a sound source (sound) corresponding to the gesture operation from the memory unit 134 and sends it to the sound processing unit 123.
- the amplifier 133 amplifies the sound source (audio data) processed by the audio processing unit 123 and supplies the amplified sound source to the speaker unit 102.
- the speaker unit 102 outputs sound from the sound source (audio data) from the amplifier 133.
- each sound (each sound source) used in the sampler mode is held in the memory unit 134.
- Each sound may be fixed, but a user can also store a desired sound.
- a desired sound is extracted from the sound source unit 131 via the reproduction control unit 122 by the system control unit 121 by a user key operation, and is stored and held in the memory unit 134.
- the flowchart in FIG. 19 shows an example of a control procedure in the system control unit 121 when the gesture function is turned on.
- the system control unit 121 starts the control process when the gesture function is turned on in step ST1.
- the first mode is the traveling system mode.
- the system control unit 121 determines whether or not a tap operation is detected in step ST2. When the tap operation is not detected, the system control unit 121 determines whether or not a flick operation or a rotation operation is detected in step ST3. When the flick operation or the rotation operation is not detected, the system control unit 121 returns to the process of step ST2.
- when the tap operation is detected in step ST2, the system control unit 121 switches the mode (mode transition) in step ST4 (see FIG. 16), and then returns to the process of step ST2.
- when a flick operation or a rotation operation is detected in step ST3, the system control unit 121 performs function control according to the operation in step ST5 (see FIG. 16), and then returns to the process of step ST2.
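The FIG. 19 control procedure can be sketched as an event loop. One assumption is made for brevity: taps here cycle modes in a fixed order, whereas the patent distinguishes west/east area taps; the mode names are illustrative labels:

```python
# Hypothetical sketch of the FIG. 19 flow: start in the traveling system
# mode (ST1), poll for gestures (ST2/ST3), switch modes on a tap (ST4),
# and perform function control on a flick/rotation (ST5).

MODES = ['traveling', 'sound_effect', 'lighting', 'sampler']

def run(events):
    """events: iterable of (kind, payload) with kind in
    'tap' / 'flick' / 'rotate'. Returns a log of (mode, action)."""
    mode = MODES[0]                    # ST1: first mode is traveling
    log = []
    for kind, payload in events:       # ST2/ST3: detection loop
        if kind == 'tap':              # ST4: mode switching
            mode = MODES[(MODES.index(mode) + 1) % len(MODES)]
            log.append((mode, 'mode_switch'))
        else:                          # ST5: function control in mode
            log.append((mode, payload))
    return log

print(run([('flick', 'right'), ('tap', None), ('rotate', 'cw')]))
```

The same flick or rotation event is logged against whichever mode is current, mirroring how step ST5's function control depends on the mode reached through step ST4.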
- the mode switching and the function in each mode are controlled in accordance with the user's gesture operation. Therefore, the user can favorably operate the audio device by the gesture operation.
- in the audio device 10 shown in FIGS. 1 and 18, when level control related to a predetermined function is performed as function control in each mode, the up/down direction of the gesture movement is correlated with raising and lowering the level. Therefore, intuitive operation of the audio device 10 by gesture operation is possible.
- the mode is switched by a tap operation (see FIG. 16). Therefore, it is possible to cope with an increase in the number of modes to be switched.
- a gesture function selection key 105 for selecting on / off of the gesture function is provided (see FIG. 1). Therefore, only the key input operation is possible when the gesture function is off, and it is possible to avoid an unintended operation due to the user's operation when performing only the key operation.
- When the gesture function is on, operation by both key input and gesture operation is possible, providing more convenient device operation for the user.
- The volume keys +/− can be used to increase or decrease the volume even when this sound effect is used.
- When the gesture function is turned on, the position of the gesture operation area surface 106 is indicated by light emission (see FIG. 11). Therefore, the user can recognize the operation area surface of the gesture operation and can appropriately perform the gesture operation.
- the gesture operation area surface 106 is inclined by a predetermined angle ⁇ from the horizontal plane in the front-rear direction (see FIG. 4). Therefore, the user can perform a gesture operation with a more natural movement.
- the sensor substrate 111 is circular. Therefore, it is possible to efficiently correspond to the circular gesture operation area surface 106, and it is possible to suppress the generation of useless portions and reduce costs.
- a through hole 118 is provided in the sensor substrate 111 (see FIG. 6). Therefore, the light emitting element substrate 119 can be disposed below the sensor substrate 111 (see FIG. 11). Accordingly, the sensor substrate 111 can be disposed closer to the gesture operation area surface 106, and the gesture operable range on the space extending in the vertical direction from the gesture operation area surface 106 can be extended by that amount.
- When the gesture function is turned on and the sampler mode is set, sound (sound sources) can be synthesized by gesture operation. Therefore, the user can perform intuitive sound synthesis.
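One way to picture sampler-mode synthesis is the hypothetical sketch below, which assigns a short sample to each gesture and mixes the samples of the detected gestures; the gesture-to-sample mapping and the sample data are invented for illustration and do not come from the patent.

```python
# Hypothetical sketch of sampler-mode synthesis: each gesture triggers an
# assumed sound source, and triggered sources are mixed sample-by-sample.
SOURCES = {
    "tap":    [0.5, 0.5, 0.0, 0.0],    # short percussive sample (illustrative)
    "flick":  [0.0, 0.25, 0.5, 0.25],  # swept sample (illustrative)
    "rotate": [0.25, 0.25, 0.25, 0.25] # sustained sample (illustrative)
}

def synthesize(gestures):
    """Mix the samples assigned to the detected gestures."""
    mixed = [0.0] * 4
    for g in gestures:
        for i, s in enumerate(SOURCES.get(g, [])):
            mixed[i] += s
    return mixed
```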
- the sensor unit includes a sensor substrate 111, a DSP 116, and a transmission unit 141 that transmits a detection output of the DSP 116 as a remote control signal.
- As the remote control signal, for example, an infrared method or a radio method is adopted.
- the housing 101 of the audio device 10 is provided with a remote control signal receiving unit.
- Since the sensor unit has such a configuration, it can be detached from the housing 101 and used as shown in FIG. In this case, the user can perform the gesture operation at a location away from the device main body.
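How the transmission unit 141 might package a detection output of the DSP 116 into a remote control signal can be sketched as a simple command frame; the address byte, command codes, and checksum scheme below are hypothetical illustrations, not the actual infrared or radio protocol of the device.

```python
# Hypothetical framing of a gesture detection output as a remote-control command.
# The command codes and frame layout are illustrative assumptions.
GESTURE_CODES = {"tap": 0x01, "flick_left": 0x02, "flick_right": 0x03,
                 "rotate_cw": 0x04, "rotate_ccw": 0x05}

def encode_frame(gesture, device_addr=0x10):
    """Build a 3-byte frame: address, command code, simple additive checksum."""
    code = GESTURE_CODES[gesture]
    checksum = (device_addr + code) & 0xFF
    return bytes([device_addr, code, checksum])

def decode_frame(frame):
    """Validate the checksum (receiver side) and recover the gesture name."""
    addr, code, checksum = frame
    if (addr + code) & 0xFF != checksum:
        raise ValueError("corrupted frame")
    return {v: k for k, v in GESTURE_CODES.items()}[code]
```

A checksum of some form matters here because an infrared or radio link between the detached sensor unit and the receiving unit in the housing 101 can drop or corrupt bits.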
- FIG. 23 is an image diagram of an example of a command table.
- The present technology can also take the following configurations.
- (1) An electronic device comprising: a sensor unit that detects a user's gesture operation; and a control unit that performs mode switching control and function control in each mode based on the detection output of the sensor unit.
- (2) The electronic device according to (1), wherein the function control in each mode includes level control related to a predetermined function, and the direction of movement of the gesture operation for raising and lowering the level is correlated with the raising and lowering of the level.
- (3) The electronic device according to (2), wherein the gesture operation is a clockwise or counterclockwise rotation operation, or a flick operation in one direction or the other direction.
- (4) The electronic device according to (3), wherein the level control related to the predetermined function is volume control of playback sound, control of an effect applied to playback sound, control of a lighting light amount, or feed and return control of a playback music piece.
- (5) The electronic device according to any one of (1) to (4), wherein the gesture operation includes a tap operation, and the control unit controls the mode switching based on a detection output of the tap operation.
- (6) The electronic device according to (5), wherein the modes include, in addition to a traveling system mode, at least one of a sound effect mode, a lighting mode, and a sampler mode.
- (7) The electronic device according to any one of (1) to (6), further including an operation button for turning on and off control in the control unit based on the detection output of the sensor unit.
- (8) The electronic device according to (7), wherein the sensor unit includes: a sensor substrate having an electric field generating unit and an electric field receiving unit, arranged immediately below an operation area surface on an upper surface of the housing; and a signal processing unit that detects the gesture operation from an electric field reception signal obtained by the electric field receiving unit of the sensor substrate; the electronic device further including a light emitting unit that indicates the operation area surface by light emission when the operation button is turned on.
- (9) The electronic device according to (8), wherein the sensor substrate is circular, has an electric field generating unit on its back surface and electric field receiving units at upper, lower, left, and right outer peripheral positions on its front surface, and has a through hole penetrating the center of the upper, lower, left, and right electric field receiving units; and the light emitting unit includes a light emitting element substrate, disposed below the sensor substrate, having a light emitting element at a position corresponding to the through hole.
- (10) The electronic device according to any one of (1) to (9), wherein the sensor unit includes: a sensor substrate having an electric field generating unit and an electric field receiving unit, arranged immediately below an operation area surface on an upper surface of the housing; and a signal processing unit that detects the gesture operation from an electric field reception signal obtained by the electric field receiving unit of the sensor substrate; and the operation area surface is inclined by a predetermined angle in a predetermined direction from a horizontal plane.
- (11) The electronic device according to any one of (1) to (10), wherein the sensor unit includes: a sensor substrate having an electric field generating unit and an electric field receiving unit; a signal processing unit that detects the gesture operation from an electric field reception signal obtained by the electric field receiving unit of the sensor substrate; and a transmission unit that transmits a detection output of the signal processing unit as a remote control signal; and the sensor unit is detachably disposed on an upper surface of the housing.
- (12) A control method of an electronic device, comprising: a first step in which a sensor unit detects a user's gesture operation; and a control step in which a control unit performs mode switching control and function control in each mode based on the user's gesture operation detected in the first step.
- 116 ... DSP, 117 ... CPU, 118 ... Through hole, 119 ... Light emitting element substrate, 120 ... LED, 121 ... System control unit, 122 ... Reproduction control unit, 123 ... Audio processing unit, 131 ... Sound source unit, 132 ... Key input unit, 133 ... Amplifier, 140 ... Sensor unit, 141 ... Transmission unit, 150 ... Command table (pop advertisement)
Abstract
Description
The present technology resides in an electronic device including a sensor unit that detects a user's gesture operation, and a control unit that performs mode switching control and function control in each mode based on the detection output of the sensor unit.
1. Embodiment
2. Modification examples
[Configuration example of the audio device]
FIG. 1 is a perspective view showing the appearance of a one-box audio device 10 as an embodiment. FIG. 2 is a top view of the audio device 10, and FIG. 3 is a right side view of the audio device 10. The audio device 10 has a substantially rectangular parallelepiped housing 101. Speaker units 102, such as a tweeter, a midrange, and a woofer, are arranged on the front surface of the housing 101.
FIG. 6(a) shows a top view of the sensor substrate 111 arranged below the gesture operation area surface 106. FIG. 6(b) shows a cross-sectional view along line A-A', and FIG. 6(c) shows a bottom view of the sensor substrate 111.
FIG. 16 shows an example of gesture operation assignments for the respective modes. In this embodiment, there are four modes: a traveling system mode, a sound effect mode, a lighting mode, and a sampler mode (sound synthesis mode).
FIG. 18 shows a circuit configuration example of the audio device 10. The audio device 10 includes a CPU 117, a key input unit 132, a sensor substrate 111, a DSP 116, a light emitting element substrate 119, a memory unit 134, display units 107 and 108, light emitting units 104 and 109, an amplifier 133, and speaker units 102.
In the embodiment described above, an example in which the sensor unit is fixed to the housing 101 has been shown. However, a configuration in which the sensor unit, as a sensor unit 140, is detachably arranged on the upper surface of the housing 101 is also conceivable. In this case, as shown in FIG. 20, the sensor unit includes the sensor substrate 111, the DSP 116, and a transmission unit 141 that transmits the detection output of the DSP 116 as a remote control signal. Here, for example, an infrared method or a radio method is adopted as the remote control signal. In this case, the housing 101 of the audio device 10 is provided with a remote control signal receiving unit.
101 ... Housing
102 ... Speaker unit
103 ... CD insertion slot
104 ... Light emitting unit
105 ... Gesture function selection key
106 ... Gesture operation area surface
106a ... Area (range in which the user can perform a gesture operation)
107, 108 ... Display unit
109 ... Light emitting unit
110 ... Dial operation unit
111 ... Sensor substrate
112 ... Substrate body
113 ... Electric field generating unit
114 ... Electric field receiving unit
114w ... West electric field receiving unit
114s ... South electric field receiving unit
114e ... East electric field receiving unit
111n ... North electric field receiving unit
115 ... Electric field
116 ... DSP
117 ... CPU
118 ... Through hole
119 ... Light emitting element substrate
120 ... LED
121 ... System control unit
122 ... Reproduction control unit
123 ... Audio processing unit
131 ... Sound source unit
132 ... Key input unit
133 ... Amplifier
140 ... Sensor unit
141 ... Transmission unit
150 ... Command table (pop advertisement)
Claims (12)
- An electronic device comprising: a sensor unit that detects a user's gesture operation; and a control unit that performs mode switching control and function control in each mode based on the detection output of the sensor unit.
- The electronic device according to claim 1, wherein the function control in each mode includes level control related to a predetermined function, and the direction of movement of the gesture operation for raising and lowering the level is correlated with the raising and lowering of the level.
- The electronic device according to claim 2, wherein the gesture operation is a clockwise or counterclockwise rotation operation, or a flick operation in one direction or the other direction.
- The electronic device according to claim 3, wherein the level control related to the predetermined function is volume control of playback sound, control of an effect applied to playback sound, control of a lighting light amount, or feed and return control of a playback music piece.
- The electronic device according to claim 1, wherein the gesture operation includes a tap operation, and the control unit controls the mode switching based on a detection output of the tap operation.
- The electronic device according to claim 5, wherein the modes include, in addition to a traveling system mode, at least one of a sound effect mode, a lighting mode, and a sampler mode.
- The electronic device according to claim 1, further comprising an operation button for turning on and off control in the control unit based on the detection output of the sensor unit.
- The electronic device according to claim 7, wherein the sensor unit includes: a sensor substrate having an electric field generating unit and an electric field receiving unit, arranged immediately below an operation area surface on an upper surface of a housing; and a signal processing unit that detects the gesture operation from an electric field reception signal obtained by the electric field receiving unit of the sensor substrate; the electronic device further comprising a light emitting unit that indicates the operation area surface by light emission when the operation button is turned on.
- The electronic device according to claim 8, wherein the sensor substrate is circular, has an electric field generating unit on its back surface and electric field receiving units at upper, lower, left, and right outer peripheral positions on its front surface, and has a through hole penetrating the center of the upper, lower, left, and right electric field receiving units; and the light emitting unit includes a light emitting element substrate, disposed below the sensor substrate, having a light emitting element at a position corresponding to the through hole.
- The electronic device according to claim 1, wherein the sensor unit includes: a sensor substrate having an electric field generating unit and an electric field receiving unit, arranged immediately below an operation area surface on an upper surface of a housing; and a signal processing unit that detects the gesture operation from an electric field reception signal obtained by the electric field receiving unit of the sensor substrate; and the operation area surface is inclined by a predetermined angle in a predetermined direction from a horizontal plane.
- The electronic device according to claim 1, wherein the sensor unit includes: a sensor substrate having an electric field generating unit and an electric field receiving unit; a signal processing unit that detects the gesture operation from an electric field reception signal obtained by the electric field receiving unit of the sensor substrate; and a transmission unit that transmits a detection output of the signal processing unit as a remote control signal; and the sensor unit is detachably disposed on an upper surface of a housing.
- A control method of an electronic device, comprising: a first step in which a sensor unit detects a user's gesture operation; and a control step in which a control unit performs mode switching control and function control in each mode based on the user's gesture operation detected in the first step.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
BR112017012783A BR112017012783A2 (pt) | 2014-12-24 | 2015-12-16 | dispositivo eletrônico, e, método de controle de dispositivo eletrônico. |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2014259751 | 2014-12-24 | ||
JP2014-259751 | 2014-12-24 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2016104266A1 true WO2016104266A1 (ja) | 2016-06-30 |
Family
ID=56150291
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2015/085153 WO2016104266A1 (ja) | 2014-12-24 | 2015-12-16 | 電子機器および電子機器の制御方法 |
Country Status (2)
Country | Link |
---|---|
BR (1) | BR112017012783A2 (ja) |
WO (1) | WO2016104266A1 (ja) |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2004046063A (ja) * | 2002-05-17 | 2004-02-12 | Sony Corp | インタフェース要素体を備えた音楽装置 |
JP2004157994A (ja) * | 2002-10-07 | 2004-06-03 | Sony France Sa | 自由空間に作られるジェスチャを解析する方法及び装置 |
WO2008093683A1 (ja) * | 2007-01-31 | 2008-08-07 | Alps Electric Co., Ltd. | 静電容量式モーション検出装置及びそれを用いた入力装置 |
JP2010108273A (ja) * | 2008-10-30 | 2010-05-13 | Sony Corp | 情報処理装置、情報処理方法およびプログラム |
WO2013030862A1 (ja) * | 2011-08-26 | 2013-03-07 | パイオニア株式会社 | 表示装置、表示方法、およびプログラム |
JP2013196969A (ja) * | 2012-03-21 | 2013-09-30 | Panasonic Corp | 負荷コントローラ |
JP2013235588A (ja) * | 2012-05-04 | 2013-11-21 | Samsung Electronics Co Ltd | 空間上の相互作用に基づく端末の制御方法及びその端末 |
US20140003629A1 (en) * | 2012-06-28 | 2014-01-02 | Sonos, Inc. | Modification of audio responsive to proximity detection |
-
2015
- 2015-12-16 BR BR112017012783A patent/BR112017012783A2/pt not_active Application Discontinuation
- 2015-12-16 WO PCT/JP2015/085153 patent/WO2016104266A1/ja active Application Filing
Patent Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2004046063A (ja) * | 2002-05-17 | 2004-02-12 | Sony Corp | インタフェース要素体を備えた音楽装置 |
JP2004157994A (ja) * | 2002-10-07 | 2004-06-03 | Sony France Sa | 自由空間に作られるジェスチャを解析する方法及び装置 |
WO2008093683A1 (ja) * | 2007-01-31 | 2008-08-07 | Alps Electric Co., Ltd. | 静電容量式モーション検出装置及びそれを用いた入力装置 |
JP2010108273A (ja) * | 2008-10-30 | 2010-05-13 | Sony Corp | 情報処理装置、情報処理方法およびプログラム |
WO2013030862A1 (ja) * | 2011-08-26 | 2013-03-07 | パイオニア株式会社 | 表示装置、表示方法、およびプログラム |
JP2013196969A (ja) * | 2012-03-21 | 2013-09-30 | Panasonic Corp | 負荷コントローラ |
JP2013235588A (ja) * | 2012-05-04 | 2013-11-21 | Samsung Electronics Co Ltd | 空間上の相互作用に基づく端末の制御方法及びその端末 |
US20140003629A1 (en) * | 2012-06-28 | 2014-01-02 | Sonos, Inc. | Modification of audio responsive to proximity detection |
Also Published As
Publication number | Publication date |
---|---|
BR112017012783A2 (pt) | 2018-01-02 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9779710B2 (en) | Electronic apparatus and control method thereof | |
EP3668091A2 (en) | Display apparatus and controlling method thereof | |
US20130123961A1 (en) | Disc jockey controller for a handheld computing device | |
JP4552912B2 (ja) | カラオケ装置の操作パネル、カラオケ装置 | |
JP2016119030A (ja) | 電子機器および電子機器の制御方法 | |
WO2016104266A1 (ja) | 電子機器および電子機器の制御方法 | |
CN101310326B (zh) | 集显示设备于一体的薄型卡拉ok系统 | |
JP5644748B2 (ja) | オーディオ装置 | |
JP2016119031A (ja) | 電子機器および電子機器の制御方法 | |
JPWO2007113951A1 (ja) | Av処理装置およびプログラム | |
JP4942182B2 (ja) | カラオケシステム | |
JP4449755B2 (ja) | 制御装置 | |
KR101682214B1 (ko) | 전자잉크 키보드 | |
KR20180001323A (ko) | 스마트폰과 연결 가능한 전자드럼 | |
JP2010107644A (ja) | カラオケシステム、カラオケ用リモコン | |
JP2021040227A (ja) | 音信号処理方法、音信号処理システム、およびプログラム | |
KR101251416B1 (ko) | 디제잉 기능을 가지는 휴대용 음원출력장치 | |
JP2007052385A (ja) | オーディオ・ユーザー・インターフェース | |
JP7236685B2 (ja) | 音楽照明システム | |
JP2014107764A (ja) | 位置情報取得装置、およびオーディオシステム | |
JP7434083B2 (ja) | カラオケ装置 | |
JP3112698U (ja) | Midiによるled点灯制御装置 | |
US10420190B2 (en) | Audio apparatus, driving method for audio apparatus, and computer readable recording medium | |
JP2009238081A (ja) | タッチパネル式入力装置 | |
JP2010233157A (ja) | リモコン装置を備えた電子機器 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 15872835 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
REG | Reference to national code |
Ref country code: BR Ref legal event code: B01A Ref document number: 112017012783 Country of ref document: BR |
|
ENP | Entry into the national phase |
Ref document number: 112017012783 Country of ref document: BR Kind code of ref document: A2 Effective date: 20170614 |
|
NENP | Non-entry into the national phase |
Ref country code: JP |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 15872835 Country of ref document: EP Kind code of ref document: A1 |