WO2021229721A1 - Aerial haptics control device, aerial haptics system, and aerial haptics control method - Google Patents

Aerial haptics control device, aerial haptics system, and aerial haptics control method

Info

Publication number
WO2021229721A1
Authority
WO
WIPO (PCT)
Prior art keywords
haptics
aerial
required accuracy
determination unit
unit
Prior art date
Application number
PCT/JP2020/019115
Other languages
English (en)
Japanese (ja)
Inventor
将平 近藤
Original Assignee
三菱電機株式会社 (Mitsubishi Electric Corporation)
Priority date
Filing date
Publication date
Application filed by 三菱電機株式会社 (Mitsubishi Electric Corporation)
Priority to JP2022515465A (JP7072744B2)
Priority to PCT/JP2020/019115 (WO2021229721A1)
Publication of WO2021229721A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer

Definitions

  • the present disclosure relates to an aerial haptics control device, an aerial haptics system, and an aerial haptics control method.
  • Hereinafter, a device that displays an image in the air by projecting light into the air is referred to as an "aerial display device."
  • Hereinafter, a device that presents a tactile sensation in the air by transmitting ultrasonic waves into the air is referred to as an "aerial haptics device."
  • HMI (Human Machine Interface)
  • Patent Document 1 discloses such a system.
  • the accuracy required for the tactile stimulus realized by the aerial haptics device (hereinafter referred to as "required accuracy”) varies depending on various factors. Specifically, for example, the required accuracy varies depending on whether the operation input by the user is fumbling or visual. Also, for example, the required accuracy varies depending on whether the HMI is simple or complex.
  • Patent Document 1 does not have a configuration for dealing with such fluctuations in required accuracy. Therefore, there is a problem that it is not possible to cope with the fluctuation of the required accuracy.
  • the present disclosure has been made to solve the above-mentioned problems, and an object of the present disclosure is to cope with fluctuations in the required accuracy of tactile stimuli in an aerial haptic device.
  • the aerial haptics control device according to the present disclosure includes: a determination information acquisition unit that acquires determination information used for determining the required accuracy for the tactile stimulus realized by the aerial haptics device; a required accuracy determination unit that determines the required accuracy by using the determination information; and a driver selection unit that selects a plurality of drive target haptics drivers from the plurality of haptics drivers included in the haptics driver group in the aerial haptics device at a selection density that differs according to the required accuracy.
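  • For orientation only, the following is a minimal structural sketch of the claimed configuration. The class and function names (AerialHapticsControlDevice, select_density) and the density values are illustrative assumptions, not the patent's implementation; the embodiment-specific determination rule is injected as a callable because it differs between the embodiments described below.

```python
# Minimal structural sketch of the claimed units (names and values are assumptions):
# a determination information acquisition unit (21), a required accuracy
# determination unit (22), and a driver selection unit (23) whose selection
# density depends on the determined required accuracy.
from dataclasses import dataclass
from typing import Callable, Mapping


@dataclass
class AerialHapticsControlDevice:
    acquire_determination_info: Callable[[], dict]       # determination information acquisition unit 21
    determine_required_accuracy: Callable[[dict], str]   # required accuracy determination unit 22
    selection_density: Mapping[str, float]               # density table used by driver selection unit 23

    def select_density(self) -> float:
        info = self.acquire_determination_info()
        ra = self.determine_required_accuracy(info)      # "RA_1" or "RA_2"
        return self.selection_density[ra]


if __name__ == "__main__":
    device = AerialHapticsControlDevice(
        acquire_determination_info=lambda: {"gaze_on_display_area": False},
        determine_required_accuracy=lambda info: "RA_2" if info["gaze_on_display_area"] else "RA_1",
        selection_density={"RA_1": 1.0, "RA_2": 0.5},     # SD_1 > SD_2 (values assumed)
    )
    print(device.select_density())  # 1.0: fumbling input -> higher selection density
```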
  • FIG. 1 is a block diagram showing the main part of the aerial haptics system according to the first embodiment.
  • FIG. 2 is a block diagram showing the main part of the aerial haptics device in the aerial haptics system according to the first embodiment.
  • FIG. 3 is a block diagram showing the main part of the drive control unit in the control device of the aerial haptics system according to the first embodiment.
  • FIG. 4 is an explanatory diagram showing an example of a plurality of drive target haptics drivers arranged in a grid pattern.
  • FIG. 5 is an explanatory diagram showing an example of a plurality of drive target haptics drivers arranged in a checkered pattern.
  • FIG. 6 is a block diagram showing a hardware configuration of the main part of the aerial haptics control device in the aerial haptics system according to the first embodiment.
  • FIG. 7 is a block diagram showing another hardware configuration of the main part of the aerial haptics control device in the aerial haptics system according to the first embodiment.
  • FIG. 8 is a block diagram showing yet another hardware configuration of the main part of the aerial haptics control device in the aerial haptics system according to the first embodiment.
  • FIG. 11 is an explanatory diagram showing an example of a state in which the user's line of sight is not directed to the display area.
  • FIG. 12 is an explanatory diagram showing an example of a state in which a plurality of drive target haptics drivers are selected at the first selection density and the drive frequency is set to the first drive frequency.
  • FIG. 13 is an explanatory diagram showing an example of a state in which the user's line of sight is directed to the display area.
  • FIG. 14 is an explanatory diagram showing an example of a state in which a plurality of drive target haptics drivers are selected at the second selection density and the drive frequency is set to the second drive frequency.
  • A block diagram showing the main part of another aerial haptics system according to the first embodiment.
  • A block diagram showing the main part of yet another aerial haptics system according to the first embodiment.
  • FIG. 18 is a block diagram showing the main part of the aerial haptics system according to the second embodiment.
  • A flowchart showing the operation of the aerial haptics control device in the aerial haptics system according to the second embodiment.
  • A flowchart showing the operation of the required accuracy determination unit of the aerial haptics control device in the aerial haptics system according to the second embodiment.
  • FIG. 21 is an explanatory diagram showing an example of an image corresponding to an operation screen including a UI for slide operation.
  • FIG. 22 is an explanatory diagram showing an example of a state in which a plurality of drive target haptics drivers are selected at the first selection density and the drive frequency is set to the first drive frequency.
  • FIG. 23 is an explanatory diagram showing an example of an image corresponding to an operation screen including a UI for tap operation.
  • FIG. 24 is an explanatory diagram showing an example of a state in which a plurality of drive target haptics drivers are selected at the second selection density and the drive frequency is set to the second drive frequency.
  • A block diagram showing the main part of another aerial haptics system according to the second embodiment.
  • A block diagram showing the main part of another aerial haptics system according to the second embodiment.
  • A block diagram showing the main part of another aerial haptics system according to the second embodiment.
  • A block diagram showing the main part of another aerial haptics system according to the second embodiment.
  • FIG. 28 is a block diagram showing the main part of the aerial haptics system according to the third embodiment.
  • A flowchart showing the operation of the aerial haptics control device in the aerial haptics system according to the third embodiment.
  • A flowchart showing the operation of the required accuracy determination unit of the aerial haptics control device in the aerial haptics system according to the third embodiment.
  • A block diagram showing the main part of another aerial haptics system according to the third embodiment.
  • A block diagram showing the main part of another aerial haptics system according to the third embodiment.
  • A block diagram showing the main part of another aerial haptics system according to the third embodiment.
  • A block diagram showing the main part of another aerial haptics system according to the third embodiment.
  • FIG. 1 is a block diagram showing a main part of the aerial haptics system according to the first embodiment.
  • FIG. 2 is a block diagram showing a main part of an aerial haptics device in the aerial haptics system according to the first embodiment.
  • FIG. 3 is a block diagram showing a main part of a drive control unit among the control devices in the aerial haptics system according to the first embodiment.
  • the aerial haptics system according to the first embodiment will be described with reference to FIGS. 1 to 3.
  • the aerial haptics system 1 includes a control device 2, a line-of-sight detection device 3, an aerial haptics device 4, and an aerial display device 5.
  • the control device 2 includes a system control unit 11, a drive control unit 12, a display control unit 13, a current detection unit 14, and an operation detection unit 15. Further, the control device 2 includes a determination information acquisition unit 21, a required accuracy determination unit 22, a driver selection unit 23, and a frequency setting unit 24.
  • the determination information acquisition unit 21, the required accuracy determination unit 22, the driver selection unit 23, and the frequency setting unit 24 constitute the main part of the aerial haptics control device 100.
  • the control device 2 is composed of, for example, an in-vehicle information communication device. That is, the control device 2 is configured by, for example, an ECU (Electronic Control Unit).
  • Hereinafter, an example in which the control device 2 is composed of an in-vehicle information communication device, that is, an example in which the aerial haptics system 1 is used in a vehicle, will be mainly described.
  • User U of the aerial haptics system 1 is, for example, a passenger of a vehicle (not shown). That is, the user U is, for example, the driver of the vehicle, the passenger of the vehicle, or the driver of the vehicle and the passenger of the vehicle. Hereinafter, an example in which the user U is the driver of the vehicle will be mainly described.
  • the line-of-sight detection device 3 detects the line-of-sight direction L of the user U.
  • the line-of-sight detection device 3 outputs information indicating the detected line-of-sight direction L (hereinafter referred to as “line-of-sight direction information”).
  • the line-of-sight detection device 3 is composed of, for example, a DMS (Driver Monitoring System) or an OMS (Occupant Monitoring System).
  • the aerial haptics device 4 uses the haptics driver group DG.
  • the haptics driver group DG includes a plurality of haptics drivers D.
  • Each haptics driver D is composed of, for example, an ultrasonic transducer.
  • the plurality of haptics drivers D may be arranged one-dimensionally or two-dimensionally. Hereinafter, an example in which the plurality of haptics drivers D are arranged two-dimensionally will be described.
  • M ⁇ N haptics drivers D are arranged in a matrix of N rows and M columns.
  • N is an arbitrary integer of 2 or more.
  • M is an arbitrary integer of 2 or more.
  • the aerial display device 5 displays an image in the air by projecting light into the air.
  • the area where the image is displayed by the aerial display device 5 (hereinafter referred to as the "display area") A1 corresponds to the area where the tactile sensation is presented by the aerial haptics device 4 (hereinafter referred to as the "aerial haptics region") A2. That is, the aerial display device 5 corresponds to the aerial haptics device 4.
  • the aerial display device 5 is configured by, for example, a 3D-HUD (Three-Dimensional Head-Up Display).
  • the system control unit 11 controls the operation of the entire control device 2. As a result, the system control unit 11 controls the operation of the entire aerial haptics system 1.
  • the system control unit 11 is composed of, for example, a dedicated circuit.
  • the drive control unit 12 executes control for driving each haptics driver D based on an instruction from the system control unit 11.
  • the drive control unit 12 is composed of, for example, a dedicated circuit.
  • the drive control unit 12 includes a carrier wave signal generation unit 31, a vibration wave signal generation unit 32, a modulation unit 33, and an amplification unit 34.
  • the carrier wave signal generation unit 31 generates, based on an instruction from the system control unit 11, an electric signal (hereinafter referred to as the "carrier wave signal") corresponding to an ultrasonic wave (hereinafter referred to as the "carrier wave") having a predetermined frequency (hereinafter referred to as the "carrier frequency") f.
  • carrier wave signal generation unit 31 outputs the generated carrier wave signal to the modulation unit 33.
  • the vibration wave signal generation unit 32 generates, based on an instruction from the system control unit 11, an electric signal (hereinafter referred to as the "vibration wave signal") corresponding to ultrasonic waves (hereinafter referred to as the "vibration wave") for realizing vibration corresponding to a desired tactile stimulus.
  • the vibration wave signal generation unit 32 outputs the generated vibration wave signal to the modulation unit 33.
  • the modulation unit 33 modulates the carrier wave signal output by the carrier wave signal generation unit 31 by using the vibration wave signal output by the vibration wave signal generation unit 32.
  • the modulation unit 33 outputs the modulated carrier wave signal (hereinafter referred to as “modulated wave signal”) to the amplification unit 34.
  • the amplification unit 34 amplifies the modulated wave signal output by the modulation unit 33. As a result, the output modulated wave signal is amplified to a predetermined level.
  • the amplification unit 34 outputs the amplified modulated wave signal (hereinafter referred to as “transmission signal”) to the haptics driver group DG.
  • the carrier wave signal generation unit 31 generates a carrier wave signal corresponding to each haptics driver D.
  • the vibration wave signal generation unit 32 generates a vibration wave signal corresponding to each haptics driver D.
  • the modulation unit 33 modulates the corresponding carrier signal using the corresponding vibration wave signal for each haptics driver D.
  • the amplification unit 34 amplifies the modulated wave signal corresponding to each haptics driver D.
  • the amplification unit 34 outputs a transmission signal corresponding to each haptics driver D.
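  • The chain of units 31 to 34 can be pictured with the short numerical sketch below. Amplitude modulation, the 40 kHz carrier, the 200 Hz vibration wave, the sample rate, and the gain are all assumptions made for illustration; the description above only states that the carrier wave signal is modulated by the vibration wave signal and then amplified.

```python
# Sketch of the carrier wave signal -> modulation -> amplification chain (units 31-34).
# All numeric values are illustrative assumptions, not taken from the patent.
import numpy as np

FS = 1_000_000          # simulation sample rate [Hz]
CARRIER_F = 40_000.0    # assumed carrier frequency f
VIBRATION_F = 200.0     # assumed vibration-wave frequency (felt as the tactile stimulus)
GAIN = 10.0             # amplification to a predetermined level (unit 34)

t = np.arange(0, 0.02, 1.0 / FS)                               # 20 ms of signal
carrier = np.sin(2 * np.pi * CARRIER_F * t)                    # carrier wave signal (unit 31)
vibration = 0.5 * (1.0 + np.sin(2 * np.pi * VIBRATION_F * t))  # vibration wave signal (unit 32)
modulated = vibration * carrier                                # modulated wave signal (unit 33, AM assumed)
transmission = GAIN * modulated                                # transmission signal (unit 34)

print(transmission.shape, round(float(np.max(np.abs(transmission))), 2))
```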
  • each haptics driver D is thereby driven. That is, each haptics driver D transmits ultrasonic waves US into the air. As a result, a tactile sensation is presented in the aerial haptics region A2. That is, haptics in the aerial haptics region A2 are realized.
  • the ultrasonic waves US transmitted by the corresponding haptics drivers D are reflected by the indicator P.
  • the reflected ultrasonic waves US' are received by the corresponding haptics drivers D.
  • the corresponding haptics driver D outputs an electric signal (hereinafter referred to as “received signal”) corresponding to the received ultrasonic wave US ′.
  • the display control unit 13 executes control to display images corresponding to various screens by using the aerial display device 5 based on the instruction from the system control unit 11.
  • the display control unit 13 is composed of, for example, a dedicated circuit.
  • the image displayed by the aerial display device 5 includes images corresponding to screens for various operations (hereinafter referred to as "operation screens”).
  • the UI (User Interface) on each operation screen includes a UI for operation input by hand gesture.
  • the UI on each operation screen is referred to as "screen UI”.
  • the screen UI includes a UI for operation input by slide operation (hereinafter referred to as “UI for slide operation”).
  • the screen UI includes a UI for operation input by flick operation (hereinafter referred to as “UI for flick operation”).
  • the screen UI includes a UI for operation input by tap operation (hereinafter referred to as “UI for tap operation”).
  • hereinafter, operations that are simple compared with slide operations or flick operations (for example, tap operations) are referred to as "simple operations," and a UI for operation input by a simple operation is referred to as a "UI for simple operation."
  • the current detection unit 14 detects the current value I in each haptics driver D. More specifically, the current detection unit 14 detects the current value I_1 corresponding to the transmission signal and the current value I_2 corresponding to the reception signal for each haptics driver D.
  • the current detection unit 14 is composed of, for example, a dedicated circuit.
  • the operation detection unit 15 detects the operation input to the operation screen by the user U by using the current value I detected by the current detection unit 14.
  • the operation detection unit 15 is composed of, for example, a dedicated circuit.
  • the transmission signal is input to the corresponding haptics driver D, and the reception signal is output by the corresponding haptics driver D.
  • the received signal is attenuated with respect to the corresponding transmitted signal. Further, the received signal has a phase difference with respect to the corresponding transmitted signal.
  • the operation detection unit 15 detects the operation by hand gesture based on such attenuation and phase differences. Specifically, for example, the operation detection unit 15 detects a slide operation, a flick operation, or a tap operation.
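  • As a computational illustration of the attenuation and phase difference mentioned above, the sketch below compares a synthetic transmission signal with a synthetic received signal; the thresholds that turn these quantities into a detected slide, flick, or tap are not specified in the description and are omitted here.

```python
# Sketch: estimate attenuation (RMS ratio) and phase difference between a
# transmission signal and its received echo. Signals are synthetic; the patent
# does not specify how these quantities are mapped to gestures.
import numpy as np

FS = 1_000_000
F = 40_000.0                                      # assumed drive frequency
t = np.arange(0, 0.002, 1.0 / FS)                 # exactly 80 carrier cycles

tx = np.sin(2 * np.pi * F * t)                    # transmission signal
rx = 0.3 * np.sin(2 * np.pi * F * t - np.pi / 4)  # attenuated, phase-shifted reception signal

attenuation = np.sqrt(np.mean(rx ** 2)) / np.sqrt(np.mean(tx ** 2))  # ~0.3

# Phase of each signal at frequency F, then their difference (wrapped to [-pi, pi]).
ref = np.exp(-1j * 2 * np.pi * F * t)
phase_diff = np.angle(np.sum(rx * ref)) - np.angle(np.sum(tx * ref))
phase_diff = (phase_diff + np.pi) % (2 * np.pi) - np.pi              # ~ -pi/4

print(round(float(attenuation), 3), round(float(phase_diff), 3))
```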
  • the determination information acquisition unit 21 acquires information (hereinafter referred to as "determination information") used for determination by the required accuracy determination unit 22 described later.
  • the determination information acquired by the determination information acquisition unit 21 includes the line-of-sight direction information. That is, the determination information acquisition unit 21 acquires the line-of-sight direction information output by the line-of-sight detection device 3.
  • the required accuracy determination unit 22 determines the accuracy (that is, the required accuracy) RA required for the tactile stimulus realized by the aerial haptics device 4 by using the determination information acquired by the determination information acquisition unit 21.
  • the required accuracy determination unit 22 determines whether the required accuracy RA is one of the first required accuracy RA_1 and the second required accuracy RA_2, which are different from each other.
  • the first required accuracy RA_1 corresponds to a higher accuracy than the second required accuracy RA_2.
  • the second required accuracy RA_2 corresponds to a lower accuracy than the first required accuracy RA_1.
  • the required accuracy determination unit 22 determines whether or not the line of sight of the user U is directed to the display area A1 by using the line-of-sight direction information included in the acquired determination information. In other words, the required accuracy determination unit 22 determines whether or not the line of sight of the user U is directed to the aerial haptics region A2. As a result, the required accuracy determination unit 22 determines whether the operation input by the user U is fumbling or visual.
  • when the line of sight of the user U is not directed to the display area A1 (that is, when the operation input by the user U is fumbling), the required accuracy determination unit 22 determines that the required accuracy RA is the first required accuracy RA_1.
  • on the other hand, when the line of sight of the user U is directed to the display area A1 (that is, when the operation input by the user U is visual), the required accuracy determination unit 22 determines that the required accuracy RA is the second required accuracy RA_2.
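  • A minimal sketch of this first-embodiment rule (gaze away from the display area A1 means fumbling input and the first required accuracy RA_1; gaze on A1 means visual input and the second required accuracy RA_2); the function name is chosen here only for illustration.

```python
# Required accuracy determination of the first embodiment (rule as described above;
# the function name is illustrative).
def determine_required_accuracy(gaze_on_display_area: bool) -> str:
    # Gaze not on display area A1 -> fumbling input -> first required accuracy RA_1
    # Gaze on display area A1     -> visual input   -> second required accuracy RA_2
    return "RA_2" if gaze_on_display_area else "RA_1"


print(determine_required_accuracy(False))  # RA_1
print(determine_required_accuracy(True))   # RA_2
```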
  • the driver selection unit 23 selects a plurality of haptics drivers (hereinafter referred to as "drive target haptics drivers”) D_D among the M ⁇ N haptics drivers D included in the haptics driver group DG.
  • the density (hereinafter referred to as the "selection density") SD of the plurality of drive target haptics drivers D_D among the M × N haptics drivers D differs depending on the determination result by the required accuracy determination unit 22.
  • that is, when the required accuracy determination unit 22 determines that the required accuracy RA is the first required accuracy RA_1, the driver selection unit 23 selects a plurality of drive target haptics drivers D_D at the first selection density SD_1 of the mutually different first selection density SD_1 and second selection density SD_2.
  • on the other hand, when the required accuracy determination unit 22 determines that the required accuracy RA is the second required accuracy RA_2, the driver selection unit 23 selects a plurality of drive target haptics drivers D_D at the second selection density SD_2.
  • the first selection density SD_1 corresponds to a higher density than the second selection density SD_2.
  • in other words, the second selection density SD_2 corresponds to a lower density than the first selection density SD_1.
  • specifically, for example, the driver selection unit 23 selects all of the M × N haptics drivers D as drive target haptics drivers D_D.
  • alternatively, the driver selection unit 23 selects a plurality of haptics drivers D arranged in a grid pattern among the M × N haptics drivers D as drive target haptics drivers D_D.
  • alternatively, the driver selection unit 23 selects a plurality of haptics drivers D arranged in a checkered pattern among the M × N haptics drivers D as drive target haptics drivers D_D.
  • hereinafter, the haptics drivers D other than the plurality of drive target haptics drivers D_D among the M × N haptics drivers D may be referred to as "non-drive target haptics drivers," and the reference sign "D_ND" may be used for the non-drive target haptics drivers.
  • FIG. 4 shows an example of a plurality of drive target haptics drivers D_D arranged in a grid pattern. That is, as shown in FIG. 4, 25 haptics drivers D arranged in a matrix of 5 rows and 5 columns are included in the haptics driver group DG. Of these, 16 drive target haptics drivers D_D and 9 non-drive target haptics drivers D_ND are included in the haptics driver group DG. The 16 drive target haptics drivers D_D are arranged in a grid pattern.
  • FIG. 5 shows an example of a plurality of drive target haptics drivers D_D arranged in a checkered pattern. That is, as shown in FIG. 5, 25 haptics drivers D arranged in a matrix of 5 rows and 5 columns are included in the haptics driver group DG. Further, 12 drive target haptics drivers D_D and 13 non-drive target haptics drivers D_ND are included in the haptics driver group DG. The twelve drive target haptics drivers D_D are arranged in a checkered pattern.
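  • The grid-pattern and checkered-pattern examples of FIG. 4 and FIG. 5 can be reproduced with the index rules sketched below. These rules are one reading chosen so that the driver counts (16 of 25 and 12 of 25) match the stated figures; the actual layouts are defined by the drawings themselves.

```python
# Sketch of the FIG. 4 / FIG. 5 selection patterns on a 5 x 5 haptics driver group.
# The index rules are assumptions chosen to match the stated counts (16 and 12).
def grid_pattern(n_rows: int, n_cols: int) -> set:
    # Drivers lying on odd-indexed rows or columns form lattice (grid) lines.
    return {(r, c) for r in range(n_rows) for c in range(n_cols)
            if r % 2 == 1 or c % 2 == 1}


def checkered_pattern(n_rows: int, n_cols: int) -> set:
    # Drivers on one color of a checkerboard (row + column index odd).
    return {(r, c) for r in range(n_rows) for c in range(n_cols)
            if (r + c) % 2 == 1}


print(len(grid_pattern(5, 5)))       # 16 drive target drivers, 9 non-drive target (FIG. 4)
print(len(checkered_pattern(5, 5)))  # 12 drive target drivers, 13 non-drive target (FIG. 5)
```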
  • the driver selection unit 23 instructs the drive control unit 12 to include each drive target haptics driver D_D as a drive target. In other words, the driver selection unit 23 instructs the drive control unit 12 to exclude each non-drive target haptics driver D_ND from the drive target.
  • the frequency setting unit 24 sets the frequency (hereinafter referred to as the "drive frequency") F of the ultrasonic waves US transmitted by each drive target haptics driver D_D to a different value according to the determination result by the required accuracy determination unit 22.
  • that is, when the required accuracy determination unit 22 determines that the required accuracy RA is the first required accuracy RA_1, the frequency setting unit 24 sets the drive frequency F to the higher of the mutually different drive frequencies F_1 and F_2 (hereinafter referred to as the "first drive frequency") F_1. This is realized, for example, by instructing the carrier wave signal generation unit 31 to generate a carrier wave signal corresponding to the higher of the mutually different carrier frequencies f_1 and f_2, namely f_1.
  • on the other hand, when the required accuracy determination unit 22 determines that the required accuracy RA is the second required accuracy RA_2, the frequency setting unit 24 sets the drive frequency F to the lower drive frequency (hereinafter referred to as the "second drive frequency") F_2. This is realized, for example, by instructing the carrier wave signal generation unit 31 to generate a carrier wave signal corresponding to the lower carrier frequency f_2.
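  • A small sketch of the frequency setting rule just described; the concrete frequency values are placeholders, since the description only states that the first drive frequency F_1 is higher than the second drive frequency F_2.

```python
# Frequency setting unit 24 (sketch): map the determined required accuracy to a
# drive frequency. The numeric values are assumptions; only F_1 > F_2 is stated.
FIRST_DRIVE_FREQUENCY_HZ = 40_000.0   # F_1 (assumed value)
SECOND_DRIVE_FREQUENCY_HZ = 25_000.0  # F_2 (assumed value)


def set_drive_frequency(required_accuracy: str) -> float:
    return FIRST_DRIVE_FREQUENCY_HZ if required_accuracy == "RA_1" else SECOND_DRIVE_FREQUENCY_HZ


print(set_drive_frequency("RA_1"), set_drive_frequency("RA_2"))  # 40000.0 25000.0
```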
  • the drive control unit 12 includes each drive target haptics driver D_D as a drive target based on the instruction from the driver selection unit 23. In other words, the drive control unit 12 excludes each non-drive target haptics driver D_ND from the drive target based on the instruction from the driver selection unit 23.
  • the carrier wave signal generation unit 31 generates a carrier wave signal corresponding to the carrier frequency f_1 or the carrier frequency f_2 based on the instruction from the frequency setting unit 24.
  • the main part of the aerial haptics system 1 is configured.
  • the processes executed by the determination information acquisition unit 21 may be collectively referred to as the "determination information acquisition process." Further, the processes executed by the required accuracy determination unit 22 may be collectively referred to as the "required accuracy determination process." Further, the processes executed by the driver selection unit 23 may be collectively referred to as the "driver selection process." Further, the processes executed by the frequency setting unit 24 may be collectively referred to as the "frequency setting process."
  • the functions possessed by the determination information acquisition unit 21 may be collectively referred to as “determination information acquisition function”. Further, the functions of the required accuracy determination unit 22 may be collectively referred to as a “required accuracy determination function”. Further, the functions of the driver selection unit 23 may be collectively referred to as a “driver selection function”. Further, the functions of the frequency setting unit 24 may be collectively referred to as "frequency setting function”.
  • the reference sign "F1" may be used for the determination information acquisition function. Further, the reference sign "F2" may be used for the required accuracy determination function. In addition, the reference sign "F3" may be used for the driver selection function. Further, the reference sign "F4" may be used for the frequency setting function.
  • the aerial haptics control device 100 has a processor 41 and a memory 42.
  • the memory 42 stores programs corresponding to a plurality of functions (including a determination information acquisition function, a required accuracy determination function, a driver selection function, and a frequency setting function) F1 to F4.
  • the processor 41 reads and executes the program stored in the memory 42. As a result, a plurality of functions F1 to F4 are realized.
  • the aerial haptics control device 100 has a processing circuit 43.
  • the processing circuit 43 executes processing corresponding to a plurality of functions F1 to F4. As a result, a plurality of functions F1 to F4 are realized.
  • the aerial haptics control device 100 includes a processor 41, a memory 42, and a processing circuit 43.
  • a program corresponding to a part of the plurality of functions F1 to F4 is stored in the memory 42.
  • the processor 41 reads and executes the program stored in the memory 42. As a result, some of these functions are realized.
  • the processing circuit 43 executes processing corresponding to the remaining functions of the plurality of functions F1 to F4. As a result, such residual functions are realized.
  • the processor 41 is composed of one or more processors.
  • the individual processors are, for example, a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), a microprocessor, or a DSP (Digital Signal Processor).
  • the memory 42 is composed of one or more non-volatile memories.
  • the memory 42 is composed of one or more non-volatile memories and one or more volatile memories. That is, the memory 42 is composed of one or more memories.
  • the individual memory uses, for example, a semiconductor memory or a magnetic disk. More specifically, each volatile memory uses, for example, a RAM (Random Access Memory).
  • the individual non-volatile memory is, for example, a ROM (Read Only Memory), a flash memory, an EPROM (Erasable Programmable Read Only Memory), an EEPROM (Electrically Erasable Programmable Read Only Memory), or a solid state drive.
  • the processing circuit 43 is composed of one or more digital circuits.
  • the processing circuit 43 is composed of one or more digital circuits and one or more analog circuits. That is, the processing circuit 43 is composed of one or more processing circuits.
  • the individual processing circuits are, for example, an ASIC (Application Specific Integrated Circuit), a PLD (Programmable Logic Device), an FPGA (Field Programmable Gate Array), an SoC (System on a Chip), or a system LSI (Large Scale Integration).
  • the processor 41 when the processor 41 is composed of a plurality of processors, the correspondence between the plurality of functions F1 to F4 and the plurality of processors is arbitrary. That is, each of the plurality of processors may read and execute a program corresponding to one or more corresponding functions among the plurality of functions F1 to F4.
  • the processor 41 may include a dedicated processor corresponding to each of the plurality of functions F1 to F4.
  • each of the plurality of memories may store a program corresponding to one or more corresponding functions among the plurality of functions F1 to F4.
  • the memory 42 may include a dedicated memory corresponding to each of the plurality of functions F1 to F4.
  • the processing circuit 43 when the processing circuit 43 is composed of a plurality of processing circuits, the correspondence between the plurality of functions F1 to F4 and the plurality of processing circuits is arbitrary. That is, each of the plurality of processing circuits may execute processing corresponding to one or more corresponding functions among the plurality of functions F1 to F4.
  • the processing circuit 43 may include a dedicated processing circuit corresponding to each of the plurality of functions F1 to F4.
  • the determination information acquisition unit 21 executes the determination information acquisition process (step ST1).
  • the required accuracy determination unit 22 executes the required accuracy determination process (step ST2).
  • the driver selection unit 23 executes the driver selection process (step ST3).
  • the frequency setting unit 24 executes the frequency setting process (step ST4).
  • next, the process executed in step ST2 will be described.
  • the required accuracy determination unit 22 determines whether or not the line of sight of the user U is directed to the display area A1 by using the line-of-sight direction information included in the determination information acquired in step ST1 (step ST11).
  • when the line of sight of the user U is not directed to the display area A1 (step ST11 "NO"), that is, when the operation input by the user U is fumbling, the required accuracy determination unit 22 determines that the required accuracy RA is the first required accuracy RA_1 (step ST12).
  • on the other hand, when the line of sight of the user U is directed to the display area A1 (step ST11 "YES"), that is, when the operation input by the user U is visual, the required accuracy determination unit 22 determines that the required accuracy RA is the second required accuracy RA_2 (step ST13).
  • FIG. 11 shows an example of a state in which the line of sight of the user U is not directed to the display area A1. That is, FIG. 11 shows an example of a state in which the user U is trying to input an operation by fumbling. More specifically, FIG. 11 shows an example of a state in which the driver's line of sight of the vehicle (not shown) is directed forward. That is, FIG. 11 shows an example of a state in which the driver of the vehicle is trying to input an operation while the vehicle is running.
  • in this case, the required accuracy determination unit 22 determines that the required accuracy RA is the first required accuracy RA_1. Then, the driver selection unit 23 selects a plurality of drive target haptics drivers D_D at the first selection density SD_1. Specifically, for example, all 16 haptics drivers D arranged in a matrix of 4 rows and 4 columns are selected as drive target haptics drivers D_D (see FIG. 12). Then, the drive frequency F is set to the first drive frequency F_1.
  • FIG. 13 shows an example of a state in which the line of sight of the user U is directed to the display area A1. That is, FIG. 13 shows an example of a state in which the user U is trying to visually input an operation. More specifically, FIG. 13 shows an example of a state in which the line of sight of the driver of the vehicle (not shown) is directed to the display area A1. That is, FIG. 13 shows an example of a state in which the driver of the vehicle is trying to input an operation while the vehicle is stopped.
  • in this case, the required accuracy determination unit 22 determines that the required accuracy RA is the second required accuracy RA_2. Then, the driver selection unit 23 selects a plurality of drive target haptics drivers D_D at the second selection density SD_2. Specifically, for example, eight haptics drivers D arranged in a checkered pattern among the 16 haptics drivers D arranged in a matrix of 4 rows and 4 columns are selected as drive target haptics drivers D_D (see FIG. 14). Then, the drive frequency F is set to the second drive frequency F_2.
  • in general, the higher the frequency of an ultrasonic wave, the higher its directivity.
  • conversely, the lower the frequency of an ultrasonic wave, the lower its directivity. Therefore, by setting the drive frequency F to the first drive frequency F_1, the directivity of the ultrasonic waves US transmitted by the individual drive target haptics drivers D_D is increased (see FIG. 12).
  • on the other hand, by setting the drive frequency F to the second drive frequency F_2, the directivity of the ultrasonic waves US transmitted by the individual drive target haptics drivers D_D is lowered (see FIG. 14).
  • by reducing the number X of drive target haptics drivers D_D (that is, by using the second selection density SD_2), the accuracy of the tactile stimulus is lowered, but the power consumption in the aerial haptics device 4 can be reduced.
  • furthermore, by lowering the directivity of the ultrasonic waves US (that is, by using the second drive frequency F_2), the power consumption in the aerial haptics device 4 is reduced, and partial loss of the tactile stimulus in the aerial haptics region A2 can be suppressed. That is, it is possible to prevent the tactile stimulus in the aerial haptics region A2 from being lost in a grid pattern or in a checkered pattern.
  • in this way, when the required accuracy RA is the second required accuracy RA_2, the number X is reduced (that is, the second selection density SD_2 is used) and the directivity of the ultrasonic waves US is lowered (that is, the second drive frequency F_2 is used). As a result, the power consumption of the aerial haptics device 4 can be reduced (a rough numerical illustration follows).
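  • As a rough back-of-the-envelope illustration of this effect (not a figure from the patent): if each driven haptics driver consumed a similar amount of power, halving the number X of drive target drivers would roughly halve the driver-related power consumption.

```python
# Rough arithmetic illustration only; the per-driver power value is an assumption.
X_FIRST_SELECTION_DENSITY = 16    # drivers driven in the FIG. 12 example
X_SECOND_SELECTION_DENSITY = 8    # drivers driven in the FIG. 14 example
POWER_PER_DRIVER_W = 0.25         # assumed per-driver power [W], purely illustrative

print(X_FIRST_SELECTION_DENSITY * POWER_PER_DRIVER_W)   # 4.0 W
print(X_SECOND_SELECTION_DENSITY * POWER_PER_DRIVER_W)  # 2.0 W
```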
  • when the plurality of drive target haptics drivers D_D are arranged in a checkered pattern, the following effect can be obtained as compared with the case where the plurality of drive target haptics drivers D_D are arranged in a grid pattern (see the sketch after this passage).
  • that is, when the plurality of drive target haptics drivers D_D are arranged in a grid pattern, two drive target haptics drivers D_D may be adjacent to each other (see FIG. 4). In contrast, when the plurality of drive target haptics drivers D_D are arranged in a checkered pattern, no two drive target haptics drivers D_D are adjacent to each other (see FIG. 5).
  • here, "adjacent" means being adjacent in the row direction or the column direction of the matrix of N rows and M columns.
  • moreover, when the plurality of drive target haptics drivers D_D are arranged in a grid pattern, the number Y of drive target haptics drivers D_D adjacent to each drive target haptics driver D_D may differ from driver to driver.
  • as a result, when the plurality of drive target haptics drivers D_D are arranged in a checkered pattern, the occurrence of unevenness of the tactile stimulus in the aerial haptics region A2 can be suppressed as compared with the case where they are arranged in a grid pattern. In other words, the tactile stimulus in the aerial haptics region A2 can be easily homogenized, and such a tactile stimulus can be realized efficiently.
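  • The adjacency argument above can be checked with the short sketch below, reusing the assumed grid and checkered index rules from the earlier 5 x 5 sketch: in the grid pattern some drive target drivers are adjacent and the neighbor count Y varies, while in the checkered pattern no two drive target drivers are adjacent.

```python
# Sketch: count, for each drive target driver, the adjacent drive target drivers Y
# (adjacency in the row or column direction only). Pattern rules are the same
# assumed ones as in the earlier 5 x 5 sketch.
def neighbor_counts(selected: set) -> list:
    offsets = ((1, 0), (-1, 0), (0, 1), (0, -1))
    return [sum((r + dr, c + dc) in selected for dr, dc in offsets) for r, c in selected]


grid = {(r, c) for r in range(5) for c in range(5) if r % 2 == 1 or c % 2 == 1}
checkered = {(r, c) for r in range(5) for c in range(5) if (r + c) % 2 == 1}

print(sorted(set(neighbor_counts(grid))))       # several distinct values: Y differs, adjacency occurs
print(sorted(set(neighbor_counts(checkered))))  # [0]: no two drive target drivers are adjacent
```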
  • the aerial haptics control device 100 may not include the frequency setting unit 24. That is, the main part of the aerial haptics control device 100 may be configured by the determination information acquisition unit 21, the required accuracy determination unit 22, and the driver selection unit 23.
  • the aerial haptics system 1 may include a sensor 6.
  • the sensor 6 is composed of, for example, a camera or an infrared sensor.
  • the operation detection unit 15 may use the sensor 6 instead of the current value I when detecting the operation by the hand gesture.
  • Various known techniques can be used to detect the operation by the sensor 6. Detailed description of these techniques will be omitted.
  • as described above, the aerial haptics control device 100 according to the first embodiment includes the determination information acquisition unit 21 that acquires the determination information used for determining the required accuracy RA for the tactile stimulus realized by the aerial haptics device 4, the required accuracy determination unit 22 that determines the required accuracy RA by using the determination information, and the driver selection unit 23 that selects a plurality of drive target haptics drivers D_D from the plurality of haptics drivers D included in the haptics driver group DG in the aerial haptics device 4 at a selection density SD that differs according to the required accuracy RA. This makes it possible to cope with fluctuations in the required accuracy RA.
  • further, when the required accuracy determination unit 22 determines that the required accuracy RA is the first required accuracy RA_1, which is higher than the second required accuracy RA_2, the driver selection unit 23 selects a plurality of drive target haptics drivers D_D at the first selection density SD_1, which is higher than the second selection density SD_2. This makes it possible to realize a highly accurate tactile stimulus.
  • further, the aerial haptics control device 100 includes the frequency setting unit 24 that, when the required accuracy determination unit 22 determines that the required accuracy RA is the first required accuracy RA_1, sets the drive frequency F of each of the plurality of drive target haptics drivers D_D to the first drive frequency F_1, which is higher than the second drive frequency F_2. This makes it possible to realize a highly accurate tactile stimulus.
  • further, when the required accuracy determination unit 22 determines that the required accuracy RA is the second required accuracy RA_2, the driver selection unit 23 selects a plurality of drive target haptics drivers D_D at the second selection density SD_2, which is lower than the first selection density SD_1, and the aerial haptics control device 100 includes the frequency setting unit 24 that sets the drive frequency F of each of the plurality of drive target haptics drivers D_D to the second drive frequency F_2, which is lower than the first drive frequency F_1. As a result, the power consumption of the aerial haptics device 4 can be reduced. In addition, partial loss of the tactile stimulus in the aerial haptics region A2 can be suppressed.
  • the determination information includes the line-of-sight direction information indicating the line-of-sight direction L of the user U of the aerial haptics system 1 including the aerial haptics device 4 and the aerial display device 5 corresponding to the aerial haptics device 4. This makes it possible to determine the required accuracy RA according to whether the operation input by the user U is fumbling or visual.
  • the required accuracy determination unit 22 determines that the required accuracy RA is the first required accuracy RA_1 when the line of sight of the user U is not directed to the display area A1 in the aerial display device 5. Thereby, when the operation input by the user U is fumbling, it is possible to realize a highly accurate tactile stimulus.
  • further, when the line of sight of the user U is directed to the display area A1 in the aerial display device 5, the required accuracy determination unit 22 determines that the required accuracy RA is the second required accuracy RA_2. Thereby, when the operation input by the user U is visual, the power consumption in the aerial haptics device 4 can be reduced.
  • further, the aerial haptics control method according to the first embodiment includes step ST1, in which the determination information acquisition unit 21 acquires the determination information used for determining the required accuracy RA for the tactile stimulus realized by the aerial haptics device 4, step ST2, in which the required accuracy determination unit 22 determines the required accuracy RA by using the determination information, and step ST3, in which the driver selection unit 23 selects a plurality of drive target haptics drivers D_D from the plurality of haptics drivers D included in the haptics driver group DG in the aerial haptics device 4 at a selection density SD that differs according to the required accuracy RA. This makes it possible to cope with fluctuations in the required accuracy RA.
  • FIG. 18 is a block diagram showing a main part of the aerial haptics system according to the second embodiment.
  • the aerial haptics system according to the second embodiment will be described with reference to FIG.
  • the same blocks as those shown in FIG. 1 are designated by the same reference numerals and the description thereof will be omitted.
  • the aerial haptics system 1a includes a control device 2a, an aerial haptics device 4, and an aerial display device 5.
  • the control device 2a includes a system control unit 11, a drive control unit 12, a display control unit 13, a current detection unit 14, and an operation detection unit 15. Further, the control device 2a includes a determination information acquisition unit 21a, a required accuracy determination unit 22a, a driver selection unit 23, and a frequency setting unit 24.
  • the main part of the aerial haptics control device 100a is composed of the determination information acquisition unit 21a, the required accuracy determination unit 22a, the driver selection unit 23, and the frequency setting unit 24.
  • the determination information acquisition unit 21a acquires information (that is, determination information) used for determination by the required accuracy determination unit 22a, which will be described later.
  • the determination information acquired by the determination information acquisition unit 21a includes information indicating the screen UI being displayed in the aerial display device 5 (hereinafter referred to as "screen UI information").
  • the screen UI information is acquired from, for example, the system control unit 11.
  • the required accuracy determination unit 22a determines the accuracy (that is, the required accuracy) RA required for the tactile stimulus realized by the aerial haptics device 4 by using the determination information acquired by the determination information acquisition unit 21a. More specifically, the required accuracy determination unit 22a determines whether the required accuracy RA is the first required accuracy RA_1 or the second required accuracy RA_2, which are different from each other.
  • the required accuracy determination unit 22a determines whether or not the screen UI being displayed is a UI for simple operation by using the screen UI information included in the acquired determination information.
  • when the screen UI being displayed is not a UI for simple operation, the required accuracy determination unit 22a determines that the required accuracy RA is the first required accuracy RA_1.
  • on the other hand, when the screen UI being displayed is a UI for simple operation, the required accuracy determination unit 22a determines that the required accuracy RA is the second required accuracy RA_2.
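  • A minimal sketch of this second-embodiment rule; the UI labels are chosen here only for illustration, since the description distinguishes only a "UI for simple operation" (for example, tap) from other UIs (for example, slide or flick).

```python
# Required accuracy determination of the second embodiment (sketch). The UI
# labels are illustrative; the patent only distinguishes "UI for simple
# operation" (e.g. tap) from other UIs (e.g. slide or flick).
SIMPLE_OPERATION_UIS = {"tap"}


def determine_required_accuracy_2a(screen_ui: str) -> str:
    # Not a UI for simple operation -> RA_1; UI for simple operation -> RA_2.
    return "RA_2" if screen_ui in SIMPLE_OPERATION_UIS else "RA_1"


print(determine_required_accuracy_2a("slide"), determine_required_accuracy_2a("tap"))  # RA_1 RA_2
```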
  • the main part of the aerial haptics system 1a is configured.
  • hereinafter, the processes executed by the determination information acquisition unit 21a may be collectively referred to as the "determination information acquisition process." Further, the functions of the determination information acquisition unit 21a may be collectively referred to as the "determination information acquisition function." Further, the reference sign "F1a" may be used for the determination information acquisition function.
  • the processes executed by the required accuracy determination unit 22a may be collectively referred to as “required accuracy determination process”. Further, the functions of the required accuracy determination unit 22a may be generically referred to as “required accuracy determination function”. Further, the reference numeral of "F2a" may be used for the required accuracy determination function.
  • the hardware configuration of the main part of the aerial haptics control device 100a is the same as that described with reference to FIGS. 6 to 8 in the first embodiment. Therefore, detailed description thereof will be omitted.
  • the aerial haptics control device 100a has a plurality of functions (including a determination information acquisition function, a required accuracy determination function, a driver selection function, and a frequency setting function) F1a, F2a, F3, and F4.
  • Each of the plurality of functions F1a, F2a, F3, and F4 may be realized by the processor 41 and the memory 42, or may be realized by the processing circuit 43.
  • the processor 41 may include a dedicated processor corresponding to each of the plurality of functions F1a, F2a, F3, and F4.
  • the memory 42 may include a dedicated memory corresponding to each of the plurality of functions F1a, F2a, F3, and F4.
  • the processing circuit 43 may include a dedicated processing circuit corresponding to each of the plurality of functions F1a, F2a, F3, and F4.
  • the determination information acquisition unit 21a executes the determination information acquisition process (step ST1a).
  • the required accuracy determination unit 22a executes the required accuracy determination process (step ST2a).
  • the driver selection unit 23 executes the driver selection process (step ST3).
  • the frequency setting unit 24 executes the frequency setting process (step ST4).
  • next, the operation of the required accuracy determination unit 22a will be described with reference to the corresponding flowchart. That is, the process executed in step ST2a will be described.
  • the required accuracy determination unit 22a determines whether or not the screen UI being displayed is a UI for simple operation by using the screen UI information included in the determination information acquired in step ST1a (step ST21).
  • when the screen UI being displayed is not a UI for simple operation (step ST21 "NO"), the required accuracy determination unit 22a determines that the required accuracy RA is the first required accuracy RA_1 (step ST22). On the other hand, when the screen UI being displayed is a UI for simple operation (step ST21 "YES"), the required accuracy determination unit 22a determines that the required accuracy RA is the second required accuracy RA_2 (step ST23).
  • FIG. 21 shows an example of an image corresponding to an operation screen including a UI for slide operation. More specifically, an example of an image corresponding to a map screen is shown. In the figure, the arrow A indicates the slide range of the indicator body P in the slide operation.
  • in this case, the required accuracy determination unit 22a determines that the required accuracy RA is the first required accuracy RA_1. Then, the driver selection unit 23 selects a plurality of drive target haptics drivers D_D at the first selection density SD_1. Specifically, for example, all 16 haptics drivers D arranged in a matrix of 4 rows and 4 columns are selected as drive target haptics drivers D_D (see FIG. 22). Then, the drive frequency F is set to the first drive frequency F_1.
  • FIG. 23 shows an example of an image corresponding to an operation screen including a UI for tap operation. More specifically, an example of an image corresponding to a menu screen is shown. As shown in FIG. 23, the menu screen includes four buttons B_1 to B_4.
  • in this case, the required accuracy determination unit 22a determines that the required accuracy RA is the second required accuracy RA_2. Then, the driver selection unit 23 selects a plurality of drive target haptics drivers D_D at the second selection density SD_2. Specifically, for example, eight haptics drivers D arranged in a checkered pattern among the 16 haptics drivers D arranged in a matrix of 4 rows and 4 columns are selected as drive target haptics drivers D_D (see FIG. 24). Then, the drive frequency F is set to the second drive frequency F_2.
  • when the screen UI is a UI for slide operation or a UI for flick operation, a highly accurate tactile stimulus is required as compared with the case where the screen UI is a UI for tap operation.
  • conversely, when the screen UI is a UI for tap operation, a highly accurate tactile stimulus is not required as compared with the case where the screen UI is a UI for slide operation or a UI for flick operation.
  • therefore, when the screen UI being displayed is not a UI for simple operation (see FIG. 21), the number X is increased (that is, the first selection density SD_1 is used) and the directivity of the ultrasonic waves US is increased (that is, the first drive frequency F_1 is used), as shown in FIG. 22. This makes it possible to realize a highly accurate tactile stimulus.
  • on the other hand, when the screen UI being displayed is a UI for simple operation (see FIG. 23), the number X is reduced (that is, the second selection density SD_2 is used) and the directivity of the ultrasonic waves US is lowered (that is, the second drive frequency F_2 is used). As a result, the power consumption of the aerial haptics device 4 can be reduced.
  • the aerial haptics control device 100a may not include the frequency setting unit 24. That is, the main part of the aerial haptics control device 100a may be configured by the determination information acquisition unit 21a, the required accuracy determination unit 22a, and the driver selection unit 23.
  • the aerial haptics system 1a may include a sensor 6.
  • the operation detection unit 15 may use the sensor 6 instead of the current value I when detecting the operation by the hand gesture.
  • the determination information includes the screen UI information indicating the screen UI in the aerial display device 5 corresponding to the aerial haptics device 4. This makes it possible to determine the required accuracy RA according to the screen UI.
  • further, when the screen UI is not a UI for simple operation, the required accuracy determination unit 22a determines that the required accuracy RA is the first required accuracy RA_1. Thereby, when the screen UI is, for example, a UI for slide operation or a UI for flick operation, a highly accurate tactile stimulus can be realized.
  • further, when the screen UI is a UI for simple operation, the required accuracy determination unit 22a determines that the required accuracy RA is the second required accuracy RA_2. Thereby, when the screen UI is, for example, a UI for tap operation, the power consumption in the aerial haptics device 4 can be reduced.
  • FIG. 28 is a block diagram showing a main part of the aerial haptics system according to the third embodiment.
  • the aerial haptics system according to the third embodiment will be described with reference to FIG. 28.
  • the same blocks as those shown in FIG. 1 are designated by the same reference numerals and the description thereof will be omitted.
  • the aerial haptics system 1b includes a control device 2b, a line-of-sight detection device 3, an aerial haptics device 4, and an aerial display device 5.
  • the control device 2b includes a system control unit 11, a drive control unit 12, a display control unit 13, a current detection unit 14, and an operation detection unit 15. Further, the control device 2b includes a determination information acquisition unit 21b, a required accuracy determination unit 22b, a driver selection unit 23, and a frequency setting unit 24.
  • the main part of the aerial haptics control device 100b is configured by the determination information acquisition unit 21b, the required accuracy determination unit 22b, the driver selection unit 23, and the frequency setting unit 24.
  • the determination information acquisition unit 21b acquires information (that is, determination information) used for determination by the required accuracy determination unit 22b, which will be described later.
  • the determination information acquired by the determination information acquisition unit 21b includes the line-of-sight direction information and the screen UI information.
  • the line-of-sight direction information is acquired from the line-of-sight detection device 3.
  • the screen UI information is acquired from, for example, the system control unit 11.
  • the required accuracy determination unit 22b determines the accuracy (that is, the required accuracy) RA required for the tactile stimulus realized by the aerial haptics device 4 by using the determination information acquired by the determination information acquisition unit 21b. More specifically, the required accuracy determination unit 22b determines whether the required accuracy RA is the first required accuracy RA_1 or the second required accuracy RA_2, which are different from each other.
  • the required accuracy determination unit 22b determines whether or not the line of sight of the user U is directed to the display area A1 by using the line-of-sight direction information included in the acquired determination information. Further, the required accuracy determination unit 22b determines whether or not the screen UI being displayed is a UI for simple operation by using the screen UI information included in the acquired determination information.
  • when the line of sight of the user U is not directed to the display area A1 (that is, when the operation input by the user U is fumbling), the required accuracy determination unit 22b determines that the required accuracy RA is the first required accuracy RA_1. Further, when the line of sight of the user U is directed to the display area A1 (that is, when the operation input by the user U is visual) but the screen UI being displayed is not a UI for simple operation (for example, when the screen UI being displayed is a UI for slide operation or a UI for flick operation), the required accuracy determination unit 22b determines that the required accuracy RA is the first required accuracy RA_1.
  • on the other hand, when the line of sight of the user U is directed to the display area A1 (that is, when the operation input by the user U is visual) and the screen UI being displayed is a UI for simple operation (for example, when the screen UI being displayed is a UI for tap operation), the required accuracy determination unit 22b determines that the required accuracy RA is the second required accuracy RA_2.
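  • A minimal sketch of this combined third-embodiment rule: the second required accuracy RA_2 is chosen only when the gaze is on the display area A1 and the displayed screen UI is a UI for simple operation. The function and parameter names are chosen here for illustration.

```python
# Required accuracy determination of the third embodiment (sketch, assumed names).
def determine_required_accuracy_2b(gaze_on_display_area: bool, simple_operation_ui: bool) -> str:
    if gaze_on_display_area and simple_operation_ui:
        return "RA_2"  # visual input on a simple-operation UI: lower accuracy suffices
    return "RA_1"      # fumbling input, or a slide/flick-style UI: higher accuracy needed


print(determine_required_accuracy_2b(True, False))  # RA_1
print(determine_required_accuracy_2b(True, True))   # RA_2
print(determine_required_accuracy_2b(False, True))  # RA_1
```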
  • the main part of the aerial haptics system 1b is configured.
  • hereinafter, the processes executed by the determination information acquisition unit 21b may be collectively referred to as the "determination information acquisition process." Further, the functions of the determination information acquisition unit 21b may be collectively referred to as the "determination information acquisition function." Further, the reference sign "F1b" may be used for the determination information acquisition function.
  • the processes executed by the required accuracy determination unit 22b may be collectively referred to as “required accuracy determination process”. Further, the functions of the required accuracy determination unit 22b may be collectively referred to as “required accuracy determination function”. Further, the reference numeral of "F2b" may be used for the required accuracy determination function.
  • the hardware configuration of the main part of the aerial haptics control device 100b is the same as that described with reference to FIGS. 6 to 8 in the first embodiment. Therefore, detailed description thereof will be omitted.
  • the aerial haptics control device 100b has a plurality of functions (including a determination information acquisition function, a required accuracy determination function, a driver selection function, and a frequency setting function) F1b, F2b, F3, and F4.
  • Each of the plurality of functions F1b, F2b, F3, and F4 may be realized by the processor 41 and the memory 42, or may be realized by the processing circuit 43.
  • the processor 41 may include a dedicated processor corresponding to each of the plurality of functions F1b, F2b, F3, and F4.
  • the memory 42 may include a dedicated memory corresponding to each of the plurality of functions F1b, F2b, F3, and F4.
  • the processing circuit 43 may include a dedicated processing circuit corresponding to each of the plurality of functions F1b, F2b, F3, and F4.
  • the determination information acquisition unit 21b executes the determination information acquisition process (step ST1b).
  • the required accuracy determination unit 22b executes the required accuracy determination process (step ST2b).
  • the driver selection unit 23 executes the driver selection process (step ST3).
  • the frequency setting unit 24 executes the frequency setting process (step ST4).
  • next, the operation of the required accuracy determination unit 22b will be described with reference to the corresponding flowchart. That is, the process executed in step ST2b will be described.
  • the required accuracy determination unit 22b determines whether or not the line of sight of the user U is directed to the display area A1 by using the line-of-sight direction information included in the determination information acquired in step ST1b (step ST31). Further, the required accuracy determination unit 22b determines whether or not the screen UI being displayed is a UI for simple operation by using the screen UI information included in the determination information acquired in step ST1b (step ST32).
  • When the line of sight of the user U is not directed to the display area A1 (step ST31 "NO"), the required accuracy determination unit 22b determines that the required accuracy RA is the first required accuracy RA_1 (step ST33). Further, when the line of sight of the user U is directed to the display area A1 (step ST31 "YES") and the screen UI being displayed is not a UI for simple operation (step ST32 "NO"), the required accuracy determination unit 22b determines that the required accuracy RA is the first required accuracy RA_1 (step ST33).
  • On the other hand, when the line of sight of the user U is directed to the display area A1 (step ST31 "YES") and the screen UI being displayed is a UI for simple operation (step ST32 "YES"), the required accuracy determination unit 22b determines that the required accuracy RA is the second required accuracy RA_2 (step ST34).
  • In this way, the required accuracy RA can be determined according to the line-of-sight direction L and the screen UI. Then, according to the determined required accuracy RA, it is possible to realize a highly accurate tactile stimulus or to reduce the power consumption in the aerial haptics device 4.
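The effect described above (either a high-accuracy tactile stimulus or reduced power consumption) is realized through the selection density used by the driver selection unit 23, which selects drivers more densely or more sparsely depending on the required accuracy RA. The sketch below is only an illustration of that idea; the concrete densities (every driver for the first required accuracy, every second driver for the second required accuracy) are assumptions for this sketch and are not taken from the disclosure.

```python
def select_drive_target_drivers(haptics_driver_group, high_accuracy):
    """Hypothetical sketch: choose drive target haptics drivers from the
    haptics driver group DG with a selection density that differs depending
    on the determined required accuracy RA.
    """
    # Assumed densities: use every driver when high accuracy is required,
    # otherwise use every second driver to reduce power consumption.
    step = 1 if high_accuracy else 2
    return haptics_driver_group[::step]

# Illustrative use with eight hypothetical haptics drivers D1..D8
drivers = [f"D{i}" for i in range(1, 9)]
dense = select_drive_target_drivers(drivers, high_accuracy=True)    # all eight drivers
sparse = select_drive_target_drivers(drivers, high_accuracy=False)  # every second driver
```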
  • The aerial haptics control device 100b does not need to include the frequency setting unit 24. That is, the main part of the aerial haptics control device 100b may be configured by the determination information acquisition unit 21b, the required accuracy determination unit 22b, and the driver selection unit 23.
  • The aerial haptics system 1b may include a sensor 6.
  • The operation detection unit 15 may use the sensor 6 instead of the current value I when detecting an operation by hand gesture.
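As a hypothetical illustration of this alternative only, operation detection by hand gesture could accept either input source. The function name, thresholds, and signal shapes below are assumptions and do not come from the disclosure.

```python
def detect_hand_gesture_operation(sensor_reading=None, current_value=None,
                                  sensor_threshold=0.5, current_threshold=0.1):
    """Hypothetical sketch: use the sensor 6 reading when it is available,
    otherwise fall back to the current value I, as either source may be used.
    """
    if sensor_reading is not None:
        return sensor_reading > sensor_threshold    # e.g. confidence from sensor 6
    if current_value is not None:
        return current_value > current_threshold    # e.g. change in current value I
    return False  # no input available; no operation detected
```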
  • As described above, the determination information includes the line-of-sight direction information indicating the line-of-sight direction L of the user U of the aerial haptics system 1b, which includes the aerial haptics device 4 and the aerial display device 5 corresponding to the aerial haptics device 4, and also includes the screen UI information indicating the screen UI in the aerial display device 5.
  • The required accuracy determination unit 22b determines that the required accuracy RA is the first required accuracy RA_1 when the line of sight of the user U is not directed to the display area A1 in the aerial display device 5. Thereby, when the operation input by the user U is fumbling, it is possible to realize a highly accurate tactile stimulus.
  • Further, the required accuracy determination unit 22b determines that the required accuracy RA is the first required accuracy RA_1 when the screen UI is not a UI for simple operation. Thereby, when the screen UI is not a UI for simple operation (for example, when the screen UI is a UI for slide operation or a UI for flick operation), it is possible to realize a highly accurate tactile stimulus.
  • Further, the required accuracy determination unit 22b determines that the required accuracy RA is the second required accuracy RA_2 when the line of sight of the user U is directed to the display area A1 and the screen UI is a UI for simple operation. Thereby, when the operation input by the user U is visual and the screen UI is a UI for simple operation (for example, when the screen UI is a UI for tap operation), the power consumption in the aerial haptics device 4 can be reduced.
  • The aerial haptics control device, the aerial haptics system, and the aerial haptics control method according to the present disclosure can be used, for example, in an in-vehicle information communication device.
  • 1, 1a, 1b aerial haptics system, 2, 2a, 2b control device, 3 line-of-sight detection device, 4 aerial haptics device, 5 aerial display device, 6 sensor, 11 system control unit, 12 drive control unit, 13 display control unit, 14 current detection unit, 15 operation detection unit, 21, 21a, 21b determination information acquisition unit, 22, 22a, 22b required accuracy determination unit, 23 driver selection unit, 24 frequency setting unit, 31 carrier signal generation unit, 32 vibration wave signal generation unit, 33 modulation unit, 34 amplification unit, 41 processor, 42 memory, 43 processing circuit, 100, 100a, 100b aerial haptics control device, D haptics driver, DG haptics driver group.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

An aerial haptics control device (100) includes: a determination information acquisition unit (21) that acquires determination information used for determining the required accuracy (RA) for a tactile stimulus realized with an aerial haptics device (4); a required accuracy determination unit (22) that determines the required accuracy (RA) using the determination information; and a driver selection unit (23) that selects, with selection densities (SD) that differ depending on the required accuracy (RA), multiple drive target haptics drivers (D_D) from among multiple haptics drivers (D) included in a haptics driver group (DG) of the aerial haptics device (4).
PCT/JP2020/019115 2020-05-13 2020-05-13 Dispositif de commande haptique dans l'air, système haptique dans l'air et procédé de commande haptique dans l'air WO2021229721A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2022515465A JP7072744B2 (ja) 2020-05-13 2020-05-13 空中ハプティクス制御装置、空中ハプティクスシステム及び空中ハプティクス制御方法
PCT/JP2020/019115 WO2021229721A1 (fr) 2020-05-13 2020-05-13 Dispositif de commande haptique dans l'air, système haptique dans l'air et procédé de commande haptique dans l'air

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2020/019115 WO2021229721A1 (fr) 2020-05-13 2020-05-13 Dispositif de commande haptique dans l'air, système haptique dans l'air et procédé de commande haptique dans l'air

Publications (1)

Publication Number Publication Date
WO2021229721A1 true WO2021229721A1 (fr) 2021-11-18

Family

ID=78525537

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/019115 WO2021229721A1 (fr) 2020-05-13 2020-05-13 Dispositif de commande haptique dans l'air, système haptique dans l'air et procédé de commande haptique dans l'air

Country Status (2)

Country Link
JP (1) JP7072744B2 (fr)
WO (1) WO2021229721A1 (fr)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH03236610A (ja) * 1990-02-14 1991-10-22 Tech Res & Dev Inst Of Japan Def Agency アクティブフェーズドアレイアンテナ装置
JP2012183103A (ja) * 2011-03-03 2012-09-27 Fujifilm Corp 超音波診断装置および超音波画像生成方法
JP2012208262A (ja) * 2011-03-29 2012-10-25 Dainippon Printing Co Ltd 電子書籍装置
JP2015064703A (ja) * 2013-09-24 2015-04-09 株式会社デンソー 触覚提示装置
JP2017162195A (ja) * 2016-03-09 2017-09-14 株式会社Soken 触覚提示装置
JP2019530049A (ja) * 2016-07-22 2019-10-17 ハーマン インターナショナル インダストリーズ インコーポレイテッド 触覚振動子デバイスによる反響定位

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5641653A (en) 1989-10-31 1997-06-24 The Texas A&M University System DNA encoding Actinobacillus pleuropneumoniae hemolysin
CN107688478A (zh) 2016-08-05 2018-02-13 阿里巴巴集团控股有限公司 终端、应用信息的显示方法及装置

Also Published As

Publication number Publication date
JP7072744B2 (ja) 2022-05-20
JPWO2021229721A1 (fr) 2021-11-18

Similar Documents

Publication Publication Date Title
EP3628121B1 (fr) Dispositif électronique pour mémoriser des informations de profondeur en relation avec une image en fonction des propriétés d'informations de profondeur obtenues à l'aide d'une image, et son procédé de commande
JP2020509847A5 (fr)
US10332268B2 (en) Image processing apparatus, generation method, and non-transitory computer-readable storage medium
US8334817B2 (en) Image display system, image display method, image display program, recording medium, data processing device, and image display device utilizing a virtual screen
US20200267296A1 (en) Electronic device comprising plurality of cameras using rolling shutter mode
US20160133169A1 (en) Apparatus and method for correcting image distortion and curved display device including the same
KR20190047790A (ko) 디스플레이를 이용하여 지문을 인식하기 위한 전자 장치
CN108628563B (zh) 显示装置、显示方法以及存储介质
JP2013148870A5 (fr)
US10423323B2 (en) User interface apparatus and method
JP2016133640A5 (fr)
CN105427778A (zh) 显示器以及用于处理其弯曲图像的方法
US20100002130A1 (en) Image processing apparatus and method
CN104272245A (zh) 过扫描支持
CN108873040A (zh) 用于检测道路层位置的方法和设备
US10009583B2 (en) Projection system, projection apparatus, information processing method, and storage medium
WO2021229721A1 (fr) Dispositif de commande haptique dans l'air, système haptique dans l'air et procédé de commande haptique dans l'air
US10960759B2 (en) Apparatus and method for automatically setting speed of vehicle
KR102276863B1 (ko) 이미지 처리장치 및 이미지 처리방법
CN105051788A (zh) 使用多个图元进行图形处理
JP2007139901A (ja) 車載用画像表示制御装置
WO2021229720A1 (fr) Dispositif de commande haptique aérien, système haptique aérien et procédé de commande haptique aérien
US20180359430A1 (en) Image processing device, image processing system, and non-transitory storage medium
JP2010128567A (ja) カーソル移動制御方法及び装置、プログラム
KR101314731B1 (ko) 다중 빔프로젝터 기반의 에지 블렌딩 시스템 및 에지 블렌딩 방법

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20935626

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2022515465

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20935626

Country of ref document: EP

Kind code of ref document: A1