WO2024057375A1 - Operation panel device including a machine-learning-based position estimation unit - Google Patents


Info

Publication number
WO2024057375A1
Authority
WO
WIPO (PCT)
Prior art keywords
information
display screen
received
display panel
transparent
Prior art date
Application number
PCT/JP2022/034129
Other languages
English (en)
Japanese (ja)
Inventor
尭之 北村
聡 影目
Original Assignee
三菱電機株式会社
Priority date
Filing date
Publication date
Application filed by 三菱電機株式会社
Priority to PCT/JP2022/034129
Publication of WO2024057375A1

Links

Images

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means

Definitions

  • The present disclosure relates to an operation panel device having a function of estimating the position of an object on a display screen of a display panel.
  • Patent Document 1 proposes a sensing screen device, a non-contact operation panel used as a UI.
  • The sensing screen device of Patent Document 1 includes, on a transparent antenna layer disposed directly above the display screen, a first antenna unit that transmits a sensing signal and a plurality of second antenna units that receive reflected signals of the sensing signal.
  • When a reflected signal of the sensing signal is received, the coordinates of the contact object on the sensing screen are determined from the power of the received reflected signal and the coordinates of the second antenna unit that received it.
  • The sensing screen device of Patent Document 1 thus estimates the position of a contact object from the second antenna unit that received the highest-power reflected signal among the plurality of second antenna units that received reflected signals. Consequently, the accuracy of the estimated contact position is poor.
  • The present disclosure is intended to solve this problem, and aims to provide an operation panel device that can estimate the position of an object on the display screen of a display panel with high accuracy.
  • An operation panel device according to the present disclosure includes: a display panel having a display screen; a plurality of transparent antennas arranged two-dimensionally on the display screen of the display panel, which receive reflected waves from an object present on the display screen;
  • a reception information generation unit that, based on the reception signals from the plurality of transparent antennas that received the reflected wave, generates reception information corresponding to the plurality of transparent antennas and linked to the transparent antennas that received the reflected wave;
  • a storage unit that stores teacher information corresponding to the plurality of transparent antennas and linked to position information on the display screen of the display panel;
  • and an object position estimation unit that performs machine learning, using the teacher information stored in the storage unit, on the reception information generated by the reception information generation unit, and thereby estimates the position of the object on the display screen of the display panel.
  • Because the object position estimation unit performs machine learning using the stored teacher information on the reception information corresponding to the plurality of transparent antennas, the position of the object on the display screen of the display panel can be estimated with high accuracy.
  • FIG. 1 is a block diagram showing an operation panel device according to Embodiment 1.
  • FIG. 2 is a schematic diagram showing the surface of the panel in which a transparent antenna array is embedded in the operation panel device according to Embodiment 1.
  • FIG. 3 is an explanatory diagram showing the up-chirp signals Tx(1) to Tx(N) generated by a signal generator in the operation panel device according to Embodiment 1.
  • FIG. 4 is a schematic front view showing the relationship between an instruction image displayed on the display screen of the panel and a finger in the operation panel device according to Embodiment 1.
  • FIG. 5 is a hardware configuration diagram showing a first example of the object position estimation section 40 in the operation panel device according to Embodiment 1, realized by dedicated hardware.
  • FIG. 6 is a hardware configuration diagram of a computer showing a second example of the object position estimation section 40 in the operation panel device according to Embodiment 1, realized by software, firmware, or the like.
  • FIG. 7 is a flowchart illustrating the processing procedure of the position estimation operation in which a finger position is estimated in the operation panel device according to Embodiment 1.
  • FIG. 8 is a flowchart showing the processing procedure of the calibration operation for generating teacher information in the operation panel device according to Embodiment 1.
  • FIG. 9 is a schematic front view showing the relationship between an instruction image displayed on the display screen of the panel and a finger in the operation panel device according to Embodiment 2.
  • FIG. 10 is a block diagram showing an operation panel device according to Embodiment 3.
  • FIG. 11 is a block diagram showing an operation panel device according to Embodiment 4.
  • FIG. 12 is a block diagram showing an operation panel device according to Embodiment 5.
  • The operation panel device according to Embodiment 1 will be explained using FIGS. 1 to 8.
  • The operation panel device according to Embodiment 1 is a non-contact operation panel having a calibration function, and includes a display panel (hereinafter abbreviated as panel) 10, a transparent antenna array 20, a transmitting/receiving circuit section 30, an object position estimation section 40, and a control section 50. That is, the operation panel device according to Embodiment 1 is a user interface (UI) that allows the screen to be operated by bringing an object such as a finger or a stylus (hereinafter, a finger will be described as the representative object) close to the panel 10 without physically contacting it.
  • The operation panel device performs machine learning, using teacher information, on reception information based on reception signals from the plurality of transparent antennas forming the transparent antenna array 20 that received reflected waves from a finger, and thereby estimates the position of the finger on the display screen of the panel 10; that is, it has a position estimation function. The operation panel device also has a function of acquiring a plurality of pieces of teacher information, each associated with reception information based on reception signals from the plurality of transparent antennas; that is, a calibration function.
  • The transparent antenna array 20 consists of a plurality of transparent antennas 2 11 to 2 mn that are spread two-dimensionally and embedded throughout the interior of the panel 10 in order to improve the estimation accuracy of the finger position by MIMO (Multiple-Input Multiple-Output).
  • As shown in FIG. 2, the plurality of transparent antennas 2 11 to 2 mn are arranged two-dimensionally in m rows and n columns over the entire display screen of the panel 10. Here, m and n are natural numbers, at least one of which is 2 or more.
  • For convenience, the transparent antenna 2 11 will be referred to as the first transparent antenna 2 1 , the transparent antenna 2 12 as the second transparent antenna 2 2 , and the transparent antenna 2 mn as the Nth transparent antenna 2 N . That is, the transparent antenna array 20 is constituted by N transparent antennas, from the first transparent antenna 2 1 to the Nth transparent antenna 2 N , where N is a natural number of 2 or more.
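The renumbering above is a simple row-major flattening of the m x n grid. A minimal sketch follows; the helper names are ours, not from the publication:

```python
# Row-major renumbering of the transparent antenna grid: the antenna at
# row i, column j (1-based) of an m x n array becomes the k-th antenna.

def antenna_index(i, j, n):
    """Map grid position (i, j), 1-based, to the linear antenna number k."""
    return (i - 1) * n + j

def grid_position(k, n):
    """Inverse map: linear antenna number k back to (row, column)."""
    return ((k - 1) // n + 1, (k - 1) % n + 1)

# e.g. in a 3 x 4 array, antenna 2_34 (row 3, column 4) is the 12th antenna
```

For m = 3 and n = 4 this gives N = 12 antennas, with `antenna_index(3, 4, 4)` returning 12 and `grid_position(12, 4)` recovering (3, 4).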
  • The transparent antenna 2 k is a transmitting/receiving antenna that serves as both a transmitting antenna and a receiving antenna. Alternatively, separate transmitting antennas and receiving antennas may be provided; in that case the number of receiving antennas is N, while the number of transmitting antennas may be smaller than the number of receiving antennas, for example N/2.
  • When the transparent antenna 2 k functions as a transmitting antenna, it radiates a transmission wave consisting of a radio wave onto the display screen of the panel 10, that is, into the space where the finger is present. The transmission wave is, for example, a high-frequency wave generated from a transmission signal in a high-frequency band such as the L band, S band, C band, or X band.
  • When the transparent antenna 2 k functions as a receiving antenna, it receives the reflected wave reflected by the finger present on the display screen of the panel 10 and outputs a reception signal based on the reflected wave.
  • The transparent antenna 2 k is made of a transparent conductor such as a fine-wiring metal mesh, an indium tin oxide film, or graphene, so that its shape is not visible to the naked eye.
  • The transmitting/receiving circuit section 30 includes an input/output switching section 31, a signal transmitting section 32, and a reception information generation section 33.
  • The transmitting/receiving circuit section 30 sequentially selects the first transparent antenna 2 1 to the Nth transparent antenna 2 N in each transmission cycle and outputs a high-frequency transmission signal to the selected transparent antenna 2 k .
  • For each transmitted wave, the first transparent antenna 2 1 to the Nth transparent antenna 2 N receive the reflected wave reflected from the finger present on the display screen of the panel 10. Based on the reception signals, the transmitting/receiving circuit section 30 generates and outputs reception information corresponding to the first transparent antenna 2 1 to the Nth transparent antenna 2 N , linked to the transparent antenna 2 k selected as the transmitting antenna and to the transparent antennas that received the reflected wave.
  • The input/output switching section 31 transmits the high-frequency transmission signal from the signal transmitting section 32 to the transparent antenna 2 k selected by the signal transmitting section 32, and transmits the reception signals from the first transparent antenna 2 1 to the Nth transparent antenna 2 N , which receive the reflected wave reflected from the finger, to the reception information generation section 33.
  • The switching unit 31 k is a circulator having three ports, as generally known in the field of high-frequency signals: an input port, an input/output port, and an output port.
  • The input port of the switching unit 31 k is connected to the output end of the signal transmitting unit 32, the input/output port is connected to the corresponding transparent antenna 2 k , and the output port is connected to the reception information generation unit 33.
  • The input port and the input/output port are connected when transmitting, and the input/output port and the output port are connected when receiving; in the receiving state, the switching unit 31 k connects the transparent antenna 2 k to the reception information generation unit 33.
  • The switching unit 31 k selected by the signal transmitting unit 32 first connects the corresponding transparent antenna 2 k to the signal transmitting unit 32 and outputs the transmission signal from the signal transmitting unit 32 to the transparent antenna 2 k ; it then connects the transparent antenna 2 k to the reception information generation section 33.
  • The signal transmitting section 32 sequentially selects the first switching section 31 1 to the Nth switching section 31 N in each transmission cycle and outputs a high-frequency transmission signal to the selected switching section 31 k .
  • The selection order of the first switching unit 31 1 to the Nth switching unit 31 N is the first switching unit 31 1 , the second switching unit 31 2 , ..., the Nth switching unit 31 N ; that is, the transparent antennas are selected from the left end to the right end of the first row, then from the left end to the right end of the second row, and so on to the m-th row.
  • Alternatively, the selection may run in reverse order, from the Nth switching unit 31 N through the (N-1)th switching unit 31 N-1 down to the first switching unit 31 1 ; the switching order does not matter.
  • The signal transmitter 32 includes a signal generator 321 and an output destination selection section 322.
  • The signal generator 321 generates an up-chirp signal or a down-chirp signal according to the FMCW (Frequency Modulated Continuous Wave) method or the FCM (Fast-Chirp Modulation) method, and outputs N transmission signals per transmission cycle.
  • Specifically, the signal generator 321 generates up-chirp signals Tx1 to TxN, whose frequencies change over time, in each transmission cycle, upconverts their frequencies to a high-frequency band, and outputs the upconverted signals to the output destination selection section 322 as transmission signals TX1 to TXN. Examples of the high-frequency band include the L band, S band, C band, and X band.
  • The signal generator 321 generates the up-chirp signals Tx1 to TxN based on a timing signal indicating the time from the control unit 50.
  • The number N of transmission signals TX1 to TXN in one transmission cycle is the same as the number N of transparent antennas; in Embodiment 1, the first to Nth transmission signals TX1 to TXN are associated with the first transparent antenna 2 1 to the Nth transparent antenna 2 N , respectively.
  • The order in which the up-chirp signals Tx1 to TxN are generated by the signal generator 321 is the same as the order in which the first transparent antenna 2 1 to the Nth transparent antenna 2 N are selected as transmitting antennas: the first signal Tx1, the second signal Tx2, ..., the Nth signal TxN.
  • The transmission signal TXk may also be based on a pulse signal instead of a signal whose frequency changes over time, such as an up-chirp or down-chirp signal.
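As a rough illustration of the up-chirp generation described above, the baseband samples of one linear FMCW chirp can be sketched as follows. All parameter values and names are illustrative assumptions, not figures from the publication:

```python
import math

def up_chirp(f0_hz, bandwidth_hz, duration_s, sample_rate_hz):
    """One baseband up-chirp: the instantaneous frequency sweeps linearly
    from f0_hz to f0_hz + bandwidth_hz over duration_s (FMCW-style)."""
    slope = bandwidth_hz / duration_s            # sweep rate in Hz per second
    num = int(duration_s * sample_rate_hz)
    samples = []
    for i in range(num):
        t = i / sample_rate_hz
        # instantaneous phase of a linear chirp: 2*pi*(f0*t + slope*t^2/2)
        phase = 2 * math.pi * (f0_hz * t + 0.5 * slope * t * t)
        samples.append(math.cos(phase))
    return samples

# One transmission cycle outputs N such chirps, Tx1 to TxN, one per antenna;
# each would then be upconverted to the high-frequency band before radiation.
N = 4
cycle = [up_chirp(0.0, 1e6, 1e-4, 8e6) for _ in range(N)]
```

The upconversion step is omitted here; in the device it shifts each baseband chirp into a band such as the L, S, C, or X band before it is routed to the selected antenna.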
  • The output destination selection unit 322 acquires the transmission signal TXk from the signal generator 321 in synchronization with the timing signal indicating the time from the control unit 50, selects the switching unit 31 k connected to the transparent antenna 2 k corresponding to the acquired transmission signal TXk, and outputs the transmission signal TXk to the selected switching unit 31 k .
  • The output end of the output destination selection section 322 is connected to the input ports of the first switching section 31 1 to the Nth switching section 31 N .
  • Based on the reception signals from the first transparent antenna 2 1 to the Nth transparent antenna 2 N that received the reflected waves from the finger present on the display screen of the panel 10, input through the first switching unit 31 1 to the Nth switching unit 31 N , the reception information generation unit 33 generates and outputs reception information corresponding to the first transparent antenna 2 1 to the Nth transparent antenna 2 N , linked to the transparent antenna 2 k selected as the transmitting antenna and to the transparent antennas that received the reflected wave.
  • The input end of the signal receiving section 33 k is connected to the output port of the corresponding switching section 31 k , and the signal receiving section 33 k receives the reception signal from the corresponding transparent antenna 2 k via that switching section.
  • The signal receiving section 33 k includes a down-converting section that down-converts the high-frequency-band reception signal to an intermediate-frequency band, and an analog/digital conversion section that A/D-converts the analog reception signal down-converted by the down-converting section into a digital reception signal.
  • The signal receiving unit 33 k receives a signal indicating the time from the control unit 50, and acquires information on the transparent antenna 2 k selected as the transmitting antenna, the timing at which the selected transparent antenna 2 k radiated the transmission wave, and the timing at which the corresponding transparent antenna 2 k received the reflected wave from the finger via the corresponding switching unit 31 k ; it also acquires the frequency and intensity of the reception signal.
  • The time at which the selected transparent antenna 2 k radiated the transmission wave is input to the signal receiving unit 33 k via the control unit 50, and the time at which the reception signal was received is acquired by the signal receiving unit 33 k based on the signal indicating the time from the control unit 50.
  • Using the acquired information, the signal receiving unit 33 k links the transparent antenna 2 k selected as the transmitting antenna and the transparent antenna that received the reflected wave, based on the reception signal received via the corresponding switching unit 31 k , generates the reception information S(t, g, h) as digital data, and outputs it to the object position estimation section 40.
  • The reception information S(t, g, h) includes information indicating the frequency and intensity of the reception signal acquired from the transparent antenna 2 k , and may also include changes in the shape of the spectrum of the reception signal.
  • In the reception information S(t, g, h), t indicates the sampling time, with the sampling start time at each receiving antenna set to 0, g is the index of the transparent antenna selected as the transmitting antenna, and h is the index of the transparent antenna that received the reflected wave.
  • When the signal generator 321 outputs the transmission signal TX1, the output destination selection section 322 selects the first switching section 31 1 , the transmission signal TX1 is input to the first transparent antenna 2 1 , and the first transparent antenna 2 1 radiates a transmission wave consisting of a radio wave onto the display screen of the panel 10, that is, into the space where the finger is present.
  • The k-th signal receiving unit 33 k receives, via the corresponding switching unit 31 k , a reception signal based on the reflected wave from the corresponding transparent antenna 2 k that received the wave reflected from the finger, and generates the reception information S(t, 1, k).
  • Since reception information is generated in all of the first signal receiving unit 33 1 to the Nth signal receiving unit 33 N , the reception information generation unit 33 generates N pieces of reception information S(t, 1, h). Furthermore, since the first transparent antenna 2 1 to the Nth transparent antenna 2 N are selected in order as transmitting antennas within one transmission cycle, the reception information generation unit 33 generates (N × N) pieces of reception information S(t, g, h) per transmission cycle and outputs them to the object position estimation section 40.
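The time-division MIMO cycle described above, in which each transparent antenna transmits in turn while all N antennas receive, can be sketched as a nested loop. Here `measure` is a placeholder for the real RF chain (down-conversion and A/D conversion), not part of the publication:

```python
# One transmission cycle: antenna g transmits, all N antennas receive,
# yielding (N x N) reception records S[(g, h)].

def measure(g, h):
    """Placeholder for the down-converted, A/D-converted samples captured
    at receive antenna h while antenna g transmits."""
    return [0.0]  # dummy sample list

def one_transmission_cycle(N):
    S = {}
    for g in range(1, N + 1):        # select transmit antenna g in order
        for h in range(1, N + 1):    # every antenna h acts as a receiver
            S[(g, h)] = measure(g, h)
    return S

S = one_transmission_cycle(4)
assert len(S) == 16                  # (N x N) pieces of reception information
```

The record keyed by (g, h) corresponds to S(t, g, h) in the text, with t running over the samples in the stored list.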
  • The object position estimation unit 40 has two functions.
  • The first function performs machine learning, using teacher information, on the reception information S(t, g, h) generated by the reception information generation unit 33 (in this example, the (N × N) pieces of reception information generated in one transmission cycle) to estimate the position of the finger on the display screen of the panel 10. A learning model is created by machine learning using the teacher information, and the position of the finger on the display screen of the panel 10 is estimated using the learning model.
  • The second function displays an instruction image IP for placing a finger on the display screen of the panel 10 and, at each set position along the displayed instruction image IP, generates teacher information S(t, g, h | x, y) from the reception information corresponding to the transparent antennas that received the reflected wave from the finger, that is, the reception information generated by the reception information generation unit 33 per transmission cycle.
  • Here, x and y are the x and y coordinates, on the display screen, of the finger located at the set position along the instruction image IP when the reflected wave was received. A coordinate perpendicular to the display screen of the panel 10 (a z coordinate) may be added to the position information.
  • The teacher information S(t, g, h | x, y) is generated as a pair of the position information (x, y) on the display screen of the panel 10 based on the instruction image IP and all of the reception information S(t, g, h) from the first transparent antenna 2 1 to the Nth transparent antenna 2 N .
  • The object position estimation section 40 includes a reception information acquisition section 41 and a machine learning section 42.
  • The reception information acquisition unit 41 acquires the reception information S(t, g, h) generated by each of the first signal receiving unit 33 1 to the Nth signal receiving unit 33 N ; that is, it acquires the (N × N) pieces of reception information per transmission cycle generated by the reception information generation unit 33, which is composed of the first signal receiving unit 33 1 to the Nth signal receiving unit 33 N , and provides them to the machine learning unit 42.
  • The machine learning unit 42 has the first function and the second function described above, and includes a position estimation section 421, a teacher information storage section 422, and a teacher information generation section 423.
  • The position estimation unit 421 performs machine learning on the reception information S(t, g, h) acquired by the reception information acquisition unit 41 (in this example, the (N × N) pieces of reception information per transmission cycle), estimates the position (X, Y) of the finger on the display screen of the panel 10, and outputs the estimated finger position.
  • The machine learning performed by the position estimation unit 421 uses a machine learning method such as deep learning or Gaussian process regression; other machine learning frameworks may be used instead. When deep learning is used, the weight parameters connecting the layers correspond to the teacher information.
  • Specifically, the reception information S(t, g, h | x, y) acquired at all learning positions (x, y) is used as learning-data input, and a deep learning model whose output is the true finger position (x, y) is trained. All weight parameters in the deep learning model are adjusted by backpropagation or the like so that the model outputs a value close to the true finger position (x, y); these weight parameters correspond to the teacher information in the deep learning model.
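A minimal stand-in for this training loop follows, with a single linear layer trained by gradient descent in place of a deep network and synthetic features in place of real reception information; everything here is an illustrative assumption, not the publication's actual model:

```python
import random

random.seed(0)
DIM = 3                                            # toy feature dimension
TRUE_W = [[0.5, -0.2, 0.1],                        # hidden "true" mapping
          [0.3, 0.4, -0.1]]                        # rows: x and y outputs

def forward(w, feat):
    """Linear layer: two outputs (x, y) from a DIM-long feature vector."""
    return [sum(wi * f for wi, f in zip(row, feat)) for row in w]

# synthetic (reception-information features, true position) training pairs
data = []
for _ in range(200):
    feat = [random.uniform(-1, 1) for _ in range(DIM)]
    data.append((feat, forward(TRUE_W, feat)))

w = [[0.0] * DIM for _ in range(2)]                # model weights to learn
lr = 0.1
for _ in range(300):                               # gradient-descent epochs
    for feat, target in data:
        pred = forward(w, feat)
        for o in range(2):
            err = pred[o] - target[o]              # squared-error gradient
            for i in range(DIM):
                w[o][i] -= lr * err * feat[i]

pred = forward(w, [0.2, -0.5, 0.7])                # estimate a new position
```

After training, the learned weights (the analogue of the teacher information in the text) reproduce the true mapping, so `pred` is close to the true position for the queried feature vector.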
  • When Gaussian process regression is used, the parameters of the Gram matrix correspond to the teacher information.
  • Specifically, kernel-function outputs are computed, using a predetermined kernel function, for the reception information S(t, g, h | x, y) acquired at all learning positions (x, y), and a Gram matrix of M rows and M columns is generated with these kernel-function outputs as matrix elements, where M is the number of learning positions (x, y). The predetermined kernel function is, for example, a Gaussian kernel.
  • The unknown finger position (X, Y) is then estimated by Gaussian process regression, and this Gram matrix corresponds to the teacher information.
  • In general, the parameters of a supervised machine learning model correspond to the teacher information.
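A toy sketch of the Gaussian process regression outlined above, assuming a Gaussian kernel and small synthetic feature vectors in place of the real reception information:

```python
import numpy as np

def gaussian_kernel(a, b, length=1.0):
    """Gaussian (RBF) kernel between two feature vectors."""
    d = a - b
    return np.exp(-np.dot(d, d) / (2 * length ** 2))

# M training pairs: reception-information feature vector -> position (x, y)
feats = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
pos = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])
M = len(feats)

# M x M Gram matrix (the "teacher information" in the text above)
K = np.array([[gaussian_kernel(feats[i], feats[j]) for j in range(M)]
              for i in range(M)])
K += 1e-6 * np.eye(M)                    # small jitter for numerical stability

def predict(query):
    """Noise-free GP mean prediction of the position for a query feature."""
    k_star = np.array([gaussian_kernel(query, f) for f in feats])
    weights = np.linalg.solve(K, pos)    # shape (M, 2)
    return k_star @ weights              # predicted (X, Y)

xy = predict(np.array([0.0, 0.0]))
```

At a training feature vector the noise-free GP interpolates the stored position almost exactly, which mirrors the role of the Gram matrix as stored calibration knowledge.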
  • During the calibration operation, the teacher information generation section 423 is controlled by the control section 50 in cooperation with the panel 10 and the transmitting/receiving circuit section 30 to acquire a plurality of pieces of teacher information.
  • The teacher information generation unit 423 causes the instruction image IP to be displayed on the display screen of the panel 10 via the control unit 50. The instruction image IP instructs the user to slowly trace its slowly advancing leading portion with a finger.
  • The instruction image IP is a drawn image showing a zigzag trajectory: it moves from the upper-left end of the display screen of the panel 10 to the upper-right end, goes down one step, moves back and forth several times between the right end and the left end, and finally runs from the lower-right end to the lower-left end.
  • The leading portion of the drawn line that constitutes the instruction image IP moves slowly over the display screen of the panel 10, and the portion it has passed becomes a darker drawn line IPb; the thin drawn line IPa represents the drawn line along which the dark drawn line IPb extends.
  • The drawn line that constitutes the instruction image IP thus instructs the user to slowly trace the display screen of the panel 10 with a finger, following the movement of the leading portion. It takes several tens of seconds for the leading portion to traverse the entire thin drawn line IPa.
  • The leading portion of the instruction image IP passes through set positions P 1 to P D , where D is a natural number of 2 or more. At each set position P d , the teacher information generation unit 423 inputs the transmission signal from the signal transmitting unit 32 to the selected transparent antenna 2 k via the switching unit 31 k , and a transmission wave is radiated from the selected transparent antenna 2 k .
  • The first transparent antenna 2 1 to the Nth transparent antenna 2 N receive the reflected wave reflected from the finger in response to the transmitted wave from the selected transparent antenna 2 k , and output reception signals based on the reflected wave.
  • The teacher information generation unit 423 acquires the reception information corresponding to the transparent antennas that received the reflected wave, that is, the reception information S(t, g, h) generated by the reception information generation unit 33, via the reception information acquisition unit 41.
  • As the leading portion of the drawn line that constitutes the instruction image IP slowly traces the zigzag trajectory from its start (upper-left end) to its end (lower-left end), the teacher information generation unit 423 acquires, at each set position P d of the drawn line where the leading portion is located, the (N × N) pieces of reception information S(t, g, h) generated by the reception information generation unit 33, via the reception information acquisition unit 41.
  • Therefore, by the time the finger on the display screen of the panel 10 has finished tracing the drawn line following its leading portion, the teacher information generation unit 423 has acquired (N × N) pieces of reception information S(t, g, h) from the reception information generation unit 33, via the reception information acquisition unit 41, at every set position from the first set position P 1 to the Dth set position P D .
  • At each set position P d , at which the first transparent antenna 2 1 to the Nth transparent antenna 2 N are sequentially selected as transmitting antennas, the teacher information generation unit 423 pairs the position information (x d , y d ) of the leading portion of the instruction image IP on the display screen of the panel 10 with the (N × N) pieces of reception information S(t, g, h) from the reception information generation unit 33, and generates the teacher information S(t, g, h | x d , y d ) as calibration result data.
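The calibration described above amounts to pairing each position label (x_d, y_d) with the (N × N) reception records captured while the leading portion is at P_d. A hedged sketch, with `capture_cycle` as a placeholder for the real measurement:

```python
# Calibration loop: for each set position P_d along the instruction image,
# capture one transmission cycle of reception records and attach the
# position label, producing teacher information S(t, g, h | x_d, y_d).

def capture_cycle(N):
    """Placeholder: one transmission cycle of (N x N) reception records."""
    return {(g, h): [0.0] for g in range(1, N + 1) for h in range(1, N + 1)}

def run_calibration(set_positions, N):
    teacher = []                          # list of (reception info, label)
    for (x_d, y_d) in set_positions:      # leading portion is at P_d
        S = capture_cycle(N)              # (N x N) reception information
        teacher.append((S, (x_d, y_d)))   # pair data with the position label
    return teacher

# a few illustrative samples along a zigzag path across the display screen
path = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0), (2.0, 1.0)]
teacher = run_calibration(path, N=4)
```

The resulting list is exactly the supervised dataset that the position estimation unit later learns from, whether by deep learning or by Gaussian process regression.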
  • The position information of the leading portion is expressed by x-axis and y-axis coordinates on the display screen of the panel 10. A z-axis coordinate in the direction perpendicular to the display screen of the panel 10, that is, to the xy plane, may be added to the position information of the leading portion.
  • each of the reception information acquisition section 41 and the machine learning section 42 is realized by dedicated hardware including a reception information acquisition circuit and a machine learning circuit, as shown in FIG.
  • the reception information acquisition circuit corresponds to the reception information acquisition section 41
  • the machine learning circuit corresponds to the machine learning section 42.
  • the reception information acquisition circuit is an interface circuit that transmits the reception information S (t, g, h) from the reception information generation section 33 to the machine learning circuit, and a generally known interface circuit can be used.
  • the machine learning circuit includes a position estimation circuit that constitutes a position estimation section 421 , a memory that is a teacher information storage section 422 , and a teacher information generation circuit that constitutes a teacher information generation section 423 .
  • Each of the position estimation circuit and the teacher information generation circuit is realized by, for example, a single circuit, a composite circuit, a programmed processor, a parallel-programmed processor, an ASIC (Application Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array), or a combination of these.
  • the reception information acquisition section 41 and the machine learning section 42 are realized by a computer incorporating a program of software, firmware, or a combination of software and firmware, as shown in FIG.
  • FIG. 6 shows a typical example of a computer, which includes a CPU (Central Processing Unit), an input interface section, a RAM (Random Access Memory), a ROM (Read Only Memory), and an output interface section.
  • the CPU controls and manages the input interface section, RAM, ROM, and output interface section.
  • The CPU functions as the position estimation unit 421 and the teacher information generation unit 423 in cooperation with the ROM and RAM.
  • The ROM stores a position estimation program that executes the processing procedure for position estimation so as to function as the position estimation unit 421, and a teacher information generation program, which is a calibration execution program, that executes the processing procedure for generating teacher information so as to function as the teacher information generation unit 423.
  • The ROM is a type of recording medium, and may be, for example, a DVD (Digital Versatile Disc), a CD (Compact Disc), an HDD (Hard Disk Drive), or a USB memory.
  • the RAM functions as the teacher information storage section 422, and also plays the role of temporarily storing the program stored in the ROM when the CPU executes the program stored in the ROM.
  • The input interface unit corresponds to the reception information acquisition unit 41 that acquires the reception information S(t, g, h) from the reception information generation unit 33; when the computer performs the calibration execution operation, it also acquires the position information (x, y) of the leading part on the display screen of the panel 10 based on the instruction information.
  • When the computer performs the position estimation operation, the output interface unit outputs the estimated finger position, which is the estimation result obtained by the CPU, ROM, and RAM functioning as the position estimation unit 421; when the computer performs the calibration execution operation, the output interface unit outputs, from the CPU, ROM, and RAM functioning as the teacher information generation unit 423, the instruction image information and the like for displaying the instruction image IP on the display screen of the panel 10.
  • Although the computer described above uses a CPU, a central processing unit, a processing unit, an arithmetic unit, a microprocessor, a microcomputer, a processor, or hardware that executes a DSP (Digital Signal Processor) program may be used instead. Further, the configuration is not limited to the first and second examples: some of the components may be realized by dedicated hardware, and the remaining components may be realized by software, firmware, or the like.
  • The first transparent antenna 2_1 radiates a transmission wave into the space where the finger is present.
  • The transmission wave radiated from the first transparent antenna 2_1 is reflected by the finger, and the first transparent antenna 2_1 to the N-th transparent antenna 2_N receive the reflected wave from the finger (step ST13).
  • The reception signals of the reflected waves received by the first transparent antenna 2_1 to the N-th transparent antenna 2_N are input, via the first switching unit 31_1 to the N-th switching unit 31_N, to the first signal receiving unit 33_1 to the N-th signal receiving unit 33_N, respectively.
  • Each of the first signal receiving unit 33_1 to the N-th signal receiving unit 33_N performs reception processing on the input reception signal and generates, based on the received signal, reception information S(t, 1, h(1 to N)) indicating that the transparent antenna selected as the transmitting antenna is the first transparent antenna 2_1, the time t at which the first transparent antenna 2_1 radiated the transmission wave, and the corresponding transparent antenna 2_k that received the reflected wave from the finger (step ST14).
  • The reception information S(t, 1, h(1 to N)) for the case in which the first transparent antenna 2_1 is selected as the transmitting antenna is thus generated by the first signal receiving unit 33_1 to the N-th signal receiving unit 33_N, respectively.
  • Next, in step ST15, the output destination selection unit 322 switches so that the second transmission signal TX2 from the signal generator 321 is output, via the second switching unit 31_2, to the second transparent antenna 2_2.
  • The second transparent antenna 2_2 radiates a transmission wave into the space where the finger is present.
  • The transmission wave radiated from the second transparent antenna 2_2 is reflected by the finger, and the first transparent antenna 2_1 to the N-th transparent antenna 2_N receive the reflected wave from the finger (step ST13).
  • The reception signals of the reflected waves received by the first transparent antenna 2_1 to the N-th transparent antenna 2_N are input, via the first switching unit 31_1 to the N-th switching unit 31_N, to the first signal receiving unit 33_1 to the N-th signal receiving unit 33_N, respectively.
  • Each of the first signal receiving unit 33_1 to the N-th signal receiving unit 33_N performs reception processing on each input reception signal and generates the reception information S(t, 2, h(1 to N)), with the transparent antenna selected as the transmitting antenna being the second transparent antenna 2_2 (step ST14).
  • The loop from step ST12 to step ST15 is repeated, with the transparent antennas 2_k selected in order as the transmitting antenna, until the N-th transparent antenna 2_N has been selected as the transmitting antenna.
  • As a result, (N × N) pieces of reception information S(t, g, h) are generated by the first signal receiving unit 33_1 to the N-th signal receiving unit 33_N.
  • In other words, this is a step in which the reception information generation unit 33 receives the reception signals from the first transparent antenna 2_1 to the N-th transparent antenna 2_N, which are arranged two-dimensionally on the display screen of the panel 10 and which receive the reflected wave reflected from the finger present over the display screen of the panel 10, and generates, based on the input reception signals, the reception information S(t, g, h) corresponding to the transparent antennas that received the reflected wave.
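  • The scan of steps ST12 to ST15 can be sketched as follows; `measure(g, h)` is a hypothetical stand-in for the radar front end (transmit from antenna g, receive on antenna h), not a function named in this disclosure.

```python
# Hedged sketch of the (N x N) scan in steps ST12-ST15: each of the N
# transparent antennas is selected in turn as the transmitting antenna, and
# all N antennas receive the reflected wave, yielding N x N reception records.
# measure(g, h) stands in for the actual transmit/receive hardware (assumed).

def scan_reception_info(n, measure):
    """Return a dict keyed by (g, h): transmit antenna g, receive antenna h."""
    reception_info = {}
    for g in range(1, n + 1):          # select transmit antenna g (step ST12)
        for h in range(1, n + 1):      # every antenna h receives (steps ST13-ST14)
            reception_info[(g, h)] = measure(g, h)
    return reception_info              # (N x N) pieces of reception information

# Example with a dummy measurement function:
info = scan_reception_info(3, lambda g, h: 0.0)
assert len(info) == 9                  # N x N = 3 x 3 records
```

  • In this sketch the loop order mirrors the text: the outer loop corresponds to switching the transmitting antenna, the inner loop to the parallel reception by all antennas.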
  • In step ST15, the reception information acquisition unit 41 acquires the (N × N) pieces of reception information S(t, g, h) from the reception information generation unit 33 and outputs them to the position estimation unit 421.
  • In step ST16, the position estimation unit 421 performs machine learning on the (N × N) pieces of reception information S(t, g, h) using the plurality of pieces of teacher information S(t, g, h | x, y), in which a plurality of pieces of position information (x, y) on the display screen of the panel 10 are linked, stored in the teacher information storage unit 422, and thereby estimates the position of the finger on the display screen of the panel 10.
  • That is, step ST16 is a step in which the position estimation unit 421 in the target object position estimation unit 40 applies, to the (N × N) pieces of reception information S(t, g, h) generated by the reception information generation unit 33, machine learning using the teacher information S(t, g, h | x, y) stored in the teacher information storage unit 422, and estimates the position of the finger.
  • A program for executing the process of step ST16 is stored in the ROM, and the CPU executes the program stored in the ROM.
  • The program stored in the ROM includes a procedure for estimating the position of the finger from the reception information S(t, g, h) associated with the transparent antennas, among the first transparent antenna 2_1 to the N-th transparent antenna 2_N arranged two-dimensionally on the display screen of the panel 10, that received the reflected wave from the finger present over the display screen of the panel 10.
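  • As a minimal sketch of step ST16, the learned mapping from reception information to position can be approximated by nearest-neighbour matching against the stored teacher information; the disclosure leaves the concrete machine-learning method open, so the matching rule below is an assumption for illustration only.

```python
# Hedged sketch of position estimation (step ST16): the stored teacher
# information pairs reception-information features with known positions (x, y);
# here a nearest-neighbour lookup stands in for the trained model (assumed).
import math

def estimate_position(reception, teacher_info):
    """teacher_info: list of (features, (x, y)) pairs from calibration."""
    def distance(a, b):
        return math.sqrt(sum((u - v) ** 2 for u, v in zip(a, b)))
    # Pick the calibration entry whose reception features are closest.
    best = min(teacher_info, key=lambda pair: distance(reception, pair[0]))
    return best[1]                      # estimated finger position (x, y)

teacher = [((0.1, 0.9), (10, 20)), ((0.8, 0.2), (50, 60))]
assert estimate_position((0.15, 0.85), teacher) == (10, 20)
```

  • A real implementation would interpolate between neighbouring teacher entries or train a regression model, but the lookup above captures the role of the teacher information storage unit 422 in the estimation step.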
  • The teacher information generation unit 423 is controlled by the control unit 50 in cooperation with the panel 10 and the transmitting/receiving circuit unit 30.
  • In step ST21, the teacher information generation unit 423 outputs, to the panel 10 via the control unit 50, the instruction image information for displaying the instruction image IP on the display screen of the panel 10.
  • The panel 10 displays a zigzag thin drawn line IPa on the display screen; the leading part moves slowly along the thin drawn line IPa, and the portion over which it has moved becomes a dark drawn line IPb. In this way, an instruction image IP is displayed that instructs the user to slowly trace, with a finger, the movement of the leading part of the dark drawn line IPb (step ST22).
  • The teacher information generation unit 423 acquires the position information (x_1, y_1) of the leading part of the instruction image IP at the first setting position P_1 (step ST23).
  • The teacher information generation unit 423 then acquires, via the reception information acquisition unit 41, the (N × N) pieces of reception information S(t, g, h) generated by the reception information generation unit 33 based on the reception signals received by the first transparent antenna 2_1 to the N-th transparent antenna 2_N (step ST24).
  • The (N × N) pieces of reception information S(t, g, h) acquired by the teacher information generation unit 423 in step ST24 are the same as the (N × N) pieces of reception information S(t, g, h) generated by the reception information generation unit 33 in steps ST11 to ST15 of the position estimation operation. That is, although a detailed explanation is omitted, at the first setting position P_1, the first transparent antenna 2_1 to the N-th transparent antenna 2_N are all selected in order as the transmitting antenna, the loop from step ST12 to step ST15 is repeated, and the teacher information generation unit 423 acquires the (N × N) pieces of reception information S(t, g, h) generated by the first signal receiving unit 33_1 to the N-th signal receiving unit 33_N.
  • In step ST25, the teacher information generation unit 423 generates the teacher information S(t, g, h | x_1, y_1) at the first setting position P_1, using as a pair the acquired position information (x_1, y_1) of the first setting position P_1 and the acquired (N × N) pieces of reception information S(t, g, h) at the first setting position P_1.
  • The teacher information generation unit 423 stores the generated teacher information S(t, g, h | x_1, y_1) in the teacher information storage unit 422 (step ST26).
  • Returning to step ST24, the teacher information generation unit 423 acquires, at the second setting position P_2, the (N × N) pieces of reception information S(t, g, h) generated by the reception information generation unit 33 from the reception signals received by the first transparent antenna 2_1 to the N-th transparent antenna 2_N when each of the first transparent antenna 2_1 to the N-th transparent antenna 2_N is used as the transmitting antenna.
  • The teacher information generation unit 423 generates the teacher information S(t, g, h | x_2, y_2), pairing the acquired position information (x_2, y_2) of the second setting position P_2 with the acquired (N × N) pieces of reception information at the second setting position P_2, and stores it in the teacher information storage unit 422.
  • In step ST27, it is determined whether d of the setting position P_d has reached D, that is, whether teacher information has been generated for the positions set over the entire area of the instruction image IP on the display screen of the panel 10, from the first setting position P_1 to the D-th setting position P_D.
  • In other words, this is a step in which the teacher information generation unit 423 provides the panel 10 with the instruction image information for displaying the instruction image IP that causes the finger to be present over the display screen of the panel 10, receives the reception signals from the first transparent antenna 2_1 to the N-th transparent antenna 2_N that received the reflected wave reflected from the finger present along the instruction image IP, and generates, based on the received reception signals, teacher information corresponding to the transparent antennas that received the reflected wave.
  • A program for executing the processes from step ST21 to step ST27 is stored in the ROM, and the CPU executes the program stored in the ROM.
  • The program stored in the ROM includes a procedure for providing the panel 10 with the instruction image information for displaying the instruction image IP that causes the finger to be present over the display screen of the panel 10, and for generating, from the reception information S(t, g, h) based on the reception signals from the transparent antennas that received the reflected wave reflected from the finger present along the instruction image IP displayed on the display screen of the panel 10, the teacher information S(t, g, h | x, y) in which the position information (x, y) on the display screen of the panel 10 based on the instruction image IP is linked.
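  • The calibration loop of steps ST21 to ST27 can be sketched as follows; `scan` stands in for the antenna scan of steps ST12 to ST15 and is an assumed helper, not a function named in this disclosure.

```python
# Hedged sketch of the calibration loop (steps ST21-ST27): at each setting
# position P_d along the instruction image, the position information (x_d, y_d)
# is paired with the (N x N) reception records to form one teacher entry,
# which is appended to the teacher information storage (unit 422).

def generate_teacher_info(setting_positions, scan):
    teacher_storage = []                       # plays the role of storage 422
    for (x_d, y_d) in setting_positions:       # first to D-th setting position
        reception = scan(x_d, y_d)             # (N x N) reception information
        teacher_storage.append((reception, (x_d, y_d)))  # S(t,g,h | x_d,y_d)
    return teacher_storage

positions = [(1, 1), (2, 3), (4, 4)]
teacher = generate_teacher_info(positions, lambda x, y: (x + y,))
assert len(teacher) == 3 and teacher[1][1] == (2, 3)
```

  • The loop terminates exactly when the D-th setting position has been processed, matching the determination made in step ST27.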
  • As described above, the operation panel device includes the first transparent antenna 2_1 to the N-th transparent antenna 2_N arranged two-dimensionally on the display screen of the panel 10, and the reception information generation unit 33 generates the reception information S(t, g, h) based on the reception signals from the first transparent antenna 2_1 to the N-th transparent antenna 2_N that received the reflected wave from the finger present over the display screen of the panel 10.
  • Since the position of the finger on the display screen of the panel 10 is estimated by performing machine learning, using the teacher information S(t, g, h | x, y) in which the position information (x, y) is linked and which is stored in the teacher information storage unit 422, on the reception information S(t, g, h) generated by the reception information generation unit 33, the accuracy of estimating the position of the finger is improved without the finger touching the display screen of the panel 10.
  • Further, the teacher information generation unit 423 provides the panel 10 with the instruction image information for displaying the instruction image IP that causes the finger to be present over the display screen of the panel 10, and generates the teacher information corresponding to the transparent antennas that received the reflected wave, based on the reception signals from the first transparent antenna 2_1 to the N-th transparent antenna 2_N that received the reflected wave reflected from the finger present along the instruction image IP displayed on the display screen of the panel 10.
  • Embodiment 2 An operation panel device according to Embodiment 2 will be described using FIG. 9.
  • The teacher information generation unit 423 in the operation panel device according to Embodiment 1 generates the teacher information S(t, g, h | x, y) based on the reception information S(t, g, h) corresponding to the transparent antennas that received the reflected wave, using the reception signals from all of the first transparent antenna 2_1 to the N-th transparent antenna 2_N that received the reflected wave reflected from the finger.
  • In the operation panel device according to Embodiment 2, by contrast, the teacher information generation unit 423 generates the teacher information S(t, g, h | x, y) using only the reception information S(t, g, h) from a plurality of activated transparent antennas, among the first transparent antenna 2_1 to the N-th transparent antenna 2_N, that exist around the position on the display screen of the panel 10 pointed to by the instruction image IP displayed on the display screen of the panel 10.
  • The operation panel device according to Embodiment 2 differs from the operation panel device according to Embodiment 1 in this point and is the same in other respects. Note that in FIG. 9, the same reference numerals as those given in FIGS. 1 to 8 indicate the same or equivalent parts.
  • The position estimation operation is the same as the position estimation operation in the operation panel device according to Embodiment 1, so its description is omitted. Therefore, the calibration execution operation, that is, the operation of generating the teacher information S(t, g, h | x, y), will be explained with reference to FIG. 9.
  • First, the teacher information generation unit 423 outputs the instruction image information for displaying the instruction image IP on the display screen of the panel 10 to the panel 10 via the control unit 50 (step ST21). As shown in FIG. 9, the panel 10 displays the instruction image IP in which the leading part moves slowly and the portion over which it has moved becomes a dark drawn line IPb, thereby instructing the user to slowly trace, with a finger, the movement of the leading part of the dark drawn line IPb (step ST22).
  • The teacher information generation unit 423 acquires the position information (x_1, y_1) of the first setting position P_1 (step ST23). According to instructions from the teacher information generation unit 423, a plurality of transparent antennas existing around the first setting position P_1, among the first transparent antenna 2_1 to the N-th transparent antenna 2_N — in this example, the four surrounding transparent antennas — are activated, that is, turned on. For example, in FIG. 9, as shown by the solid-line frame, at the setting position P_d the four transparent antennas surrounding the setting position P_d, namely the transparent antenna 2_12, the transparent antenna 2_13, the transparent antenna 2_22, and the transparent antenna 2_23, are activated.
  • The transparent antennas other than the four activated transparent antennas are deactivated, that is, turned off; in FIG. 9 they are indicated by broken-line frames. Note that in this example, the activation and deactivation of a transparent antenna are not limited to turning the transparent antenna itself on and off, but also include turning on and off the transmitter and receiver connected to the transparent antenna.
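  • Assuming the transparent antennas form a regular grid (consistent with the two-dimensional arrangement described above, though the grid geometry is an assumption here), selecting the four antennas surrounding a setting position can be sketched as:

```python
# Hedged sketch of embodiment 2's selective activation: only the four
# grid antennas surrounding the current setting position are activated;
# all other antennas are deactivated. Grid spacing of 1 unit is assumed.
import math

def surrounding_antennas(x, y):
    """Return grid indices (row, col) of the 4 antennas around point (x, y)."""
    r, c = math.floor(y), math.floor(x)
    return [(r, c), (r, c + 1), (r + 1, c), (r + 1, c + 1)]

# A setting position between grid nodes activates the four nodes around it:
assert surrounding_antennas(2.4, 1.6) == [(1, 2), (1, 3), (2, 2), (2, 3)]
```

  • The returned indices correspond to the solid-line frame in FIG. 9; every antenna not in the returned set would be turned off (the broken-line frames).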
  • The reception information S(t, g, h) based on the reception signals from the plurality of activated transparent antennas among the first transparent antenna 2_1 to the N-th transparent antenna 2_N — in this example, for example, (4 × 4) pieces of reception information S(t, g, h) — is then acquired by the teacher information generation unit 423 via the reception information acquisition unit 41.
  • The teacher information generation unit 423 generates the teacher information S(t, g, h | x_1, y_1), pairing the acquired position information (x_1, y_1) of the first setting position P_1 with the acquired reception information, and stores it in the teacher information storage unit 422.
  • Similarly, at the second setting position P_2, a plurality of transparent antennas existing around the second setting position P_2 are activated, and the teacher information S(t, g, h | x_2, y_2) is generated from the reception information S(t, g, h) based on the reception signals from the activated transparent antennas and the position information of the second setting position P_2.
  • Each setting position is processed in order from the first setting position P_1 to the D-th setting position P_D until the leading part of the dark drawn line IPb in the instruction image IP reaches the D-th setting position P_D; when the teacher information S(t, g, h | x, y) has been generated and stored in the teacher information storage unit 422 for every setting position, the operation of generating the teacher information is completed.
  • In other words, this is a step in which the teacher information generation unit 423 provides the panel 10 with the instruction image information for displaying the instruction image IP that causes the finger to be present over the display screen of the panel 10, generates the teacher information S(t, g, h | x, y), and stores it in the teacher information storage unit 422.
  • A program for executing these processes is stored in the ROM, and the CPU executes the program stored in the ROM.
  • The program stored in the ROM is a teacher information generation program for estimating the position of an object over the display screen of an operation panel device, which includes a procedure for providing the panel 10 with the instruction image information for displaying the instruction image IP that causes the finger to be present over the display screen of the panel 10; for receiving the reception signals from the plurality of activated transparent antennas, among the first transparent antenna 2_1 to the N-th transparent antenna 2_N that received the reflected wave reflected from the finger present along the instruction image IP displayed on the display screen of the panel 10, that exist around the position on the display screen of the panel 10 pointed to by the instruction image IP; and for generating, from the reception information S(t, g, h) corresponding to the transparent antennas that received the reflected wave based on the received reception signals, the teacher information S(t, g, h | x, y) in which the position information (x, y) on the display screen of the panel 10 based on the instruction image IP is linked, to be stored in the teacher information storage unit 422.
  • As described above, the operation panel device according to Embodiment 2 has the same effects as the operation panel device according to Embodiment 1.
  • In addition, since the teacher information generation unit 423 generates the teacher information S(t, g, h | x, y) stored in the teacher information storage unit 422 using only the reception information corresponding to the plurality of activated transparent antennas, among the first transparent antenna 2_1 to the N-th transparent antenna 2_N, that exist around the position on the display screen of the panel 10 pointed to by the finger, linking it with the position information (x, y) on the display screen of the panel 10 based on the instruction image IP, the teacher information S(t, g, h | x, y) can be generated efficiently, and the memory capacity of the teacher information storage unit 422 can be used efficiently.
  • Embodiment 3 An operation panel device according to Embodiment 3 will be described using FIG. 10.
  • the operation panel device according to the third embodiment is different from the operation panel device according to the first embodiment in that a reception sensitivity determining section 43 is further provided, and other points are the same.
  • The reception sensitivity determination unit 43 detects the reception sensitivity of the transparent antennas 2_k to the wave reflected from the finger and, when the sensitivity has deteriorated, prompts reacquisition of the teacher information S(t, g, h | x, y).
  • By this, deterioration of the reception sensitivity of the transparent antennas 2_k to the reflected wave from the finger — caused, for example, by dirt adhering to the display screen of the panel 10 or by cracks occurring in the display screen of the panel 10 — can be recognized.
  • the same reference numerals as those given in FIGS. 1 to 8 indicate the same or equivalent parts.
  • The position estimation operation and the calibration execution operation, that is, the operation of generating the teacher information S(t, g, h | x, y) in which the calibration result data is linked with the position information (x, y) on the display screen of the panel 10, are the same as the position estimation operation according to the flowchart shown in FIG. 7 and the calibration execution operation according to the flowchart shown in FIG. 8, so their description is omitted.
  • The reception sensitivity determination unit 43 detects deterioration of the reception sensitivity of the transparent antennas 2_k to the reflected wave from the finger, and when the reception sensitivity falls to or below a set value, prompts reacquisition of the teacher information S(t, g, h | x, y).
  • The reception sensitivity determination unit 43 obtains the reception information S(t, g, h) acquired by the reception information acquisition unit 41 — in this example, the (N × N) pieces of reception information S(t, g, h) per transmission cycle c.
  • The reception sensitivity determination unit 43 detects the strength of the reception signal from the transparent antennas 2_k included in the acquired reception information S(t, g, h), and compares the detected strength of the reception signal with the set value.
  • When the strength of the reception signal is at or below the set value, the reception sensitivity determination unit 43 generates reception sensitivity deterioration information for displaying an image that prompts reacquisition of the teacher information S(t, g, h | x, y), and provides it to the panel 10 via the teacher information generation unit 423 and the control unit 50, so that an image prompting the user to reacquire the teacher information is displayed on the display screen of the panel 10.
  • Note that the reception sensitivity determination unit 43 may provide the reception sensitivity deterioration information to the panel 10 via the control unit 50 without going through the teacher information generation unit 423.
  • the comparison between the strength of the received signal and the set value in the reception sensitivity determination section 43 is performed, for example, as follows.
  • The maximum value of the strength of the reception signal included in each of the (N × N) pieces of reception information S(t, g, h) per transmission cycle c is detected, and the detected maximum value of the strength of the reception signal is compared with the set value.
  • Transmission waves are radiated from the transparent antennas 2_k located near the position of the finger on the display screen of the panel 10, and the reception signal obtained when a transparent antenna 2_k located near the position of the finger receives the reflected wave from the finger gives the maximum value of the strength of the reception signal.
  • When the maximum value of the strength of the reception signal obtained from the (N × N) pieces of reception information S(t, g, h) falls to or below the set value, it can be presumed that dirt has adhered to, or cracks have occurred in, the display screen of the panel 10 where the finger is present; as a result, it can be determined that the reception sensitivity of the transparent antennas 2_k located near the position of the finger on the display screen of the panel 10 has deteriorated.
  • Note that the transparent antenna 2_k serving as the transmitting antenna located near the position where the finger is present and the transparent antenna 2_k serving as the receiving antenna located near that position are not necessarily the same.
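  • The comparison described above can be sketched as follows; the concrete threshold value is an assumption, and a real device would compare hardware-detected signal strengths rather than plain floats.

```python
# Hedged sketch of the reception-sensitivity check in embodiment 3: the
# maximum received-signal strength over the reception records of one
# transmission cycle is compared with a set value; falling to or below it
# suggests dirt or cracks on the display screen. SET_VALUE is assumed.

SET_VALUE = 0.5  # hypothetical sensitivity threshold

def sensitivity_degraded(strengths, set_value=SET_VALUE):
    """strengths: received-signal strengths from one transmission cycle."""
    # Degraded when even the strongest return falls below the set value.
    return max(strengths) < set_value   # True -> prompt teacher reacquisition

assert sensitivity_degraded([0.1, 0.3, 0.2]) is True
assert sensitivity_degraded([0.1, 0.9, 0.2]) is False
```

  • Using the maximum rather than the mean matches the reasoning above: the strongest return comes from an antenna near the finger, so its deterioration is the most direct indicator of screen damage at that location.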
  • In the above, the reception sensitivity is determined from the strength of the reception signal; alternatively, the spectral shape of the reception signal may be detected, and when the change in the spectral shape exceeds a set value — in other words, when the spectral shape degrades to or below a set spectral shape — reception sensitivity deterioration information prompting reacquisition of the teacher information S(t, g, h | x, y) may be generated.
  • the reception sensitivity determination section 43 may be incorporated into the computer together with the reception information acquisition section 41 and the machine learning section 42.
  • Alternatively, the reception sensitivity determination unit 43 may include a sensor that detects the strength of the reception signal from the transparent antennas 2_k, and may obtain the reception sensitivity deterioration information by comparing the strength of the reception signal obtained by the sensor with the set value.
  • When an image prompting reacquisition of the teacher information is displayed on the display screen of the panel 10, the user causes the operation panel device to reacquire the teacher information.
  • By performing the calibration execution operation in the operation panel device, that is, by generating the teacher information S(t, g, h | x, y) and storing it in the teacher information storage unit 422, the teacher information S(t, g, h | x, y) stored in the teacher information storage unit 422 can be rewritten. Therefore, the position of the finger can be estimated using the optimal teacher information S(t, g, h | x, y).
  • As described above, the operation panel device according to Embodiment 3 has the same effects as the operation panel device according to Embodiment 1. Further, in the operation panel device according to Embodiment 3, when the reception sensitivity determination unit 43 determines that the reception sensitivity of the reception signals from the first transparent antenna 2_1 to the N-th transparent antenna 2_N is at or below the set value, the reception sensitivity deterioration information is provided to the panel 10 to display an image prompting reacquisition of the teacher information S(t, g, h | x, y), so the teacher information can be reacquired at an appropriate timing.
  • Embodiment 4 An operation panel device according to Embodiment 4 will be explained using FIG. 11.
  • the operation panel device according to the fourth embodiment is different from the operation panel device according to the first embodiment in that an elapsed time determination section 44 is further provided, and other points are the same. Note that in FIG. 11, the same reference numerals as those shown in FIGS. 1 to 8 indicate the same or equivalent parts.
  • When the environment in which the operation panel device is used changes over time, the reception information S(t, g, h) may be affected, and the accuracy of the machine learning performed by the position estimation unit 421 using the teacher information S(t, g, h | x, y) may deteriorate.
  • Therefore, the elapsed time determination unit 44 causes the teacher information generation unit 423 to regenerate all of the teacher information S(t, g, h | x, y) when a set time has elapsed since the completion of the calibration execution operation.
  • The position estimation operation and the calibration execution operation, that is, the operation of generating the teacher information S(t, g, h | x, y) in which the calibration result data is linked with the position information (x, y) on the display screen of the panel 10, are the same as the position estimation operation according to the flowchart shown in FIG. 7 and the calibration execution operation according to the flowchart shown in FIG. 8, so their description is omitted.
  • The elapsed time determination unit 44 prompts reacquisition of the teacher information S(t, g, h | x, y) when the elapsed time from the completion of the calibration execution operation reaches or exceeds a set time.
  • The elapsed time determination unit 44 receives from the teacher information generation unit 423 the time at which the teacher information generation unit 423 completed the operation of generating the teacher information S(t, g, h | x, y), that is, the calibration completion time, and stores it.
  • The elapsed time determination unit 44 periodically reads the current time, compares the read current time with the stored calibration completion time, and, when it determines that the elapsed time from the calibration completion time to the current time is equal to or greater than the set time, outputs elapsed time image information for displaying an image that prompts reacquisition of the teacher information S(t, g, h | x, y) to the panel 10 via the teacher information generation unit 423 and the control unit 50.
  • the elapsed time determination section 44 may provide the elapsed time image information to the panel 10 via the control section 50 without going through the teacher information generation section 423.
  • In the above, the elapsed time is obtained using the calibration completion time and the current time; alternatively, the elapsed time determination unit 44 may include a timer that is set according to the calibration completion time, and the elapsed time image information may be output when the timer measures the set time.
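  • The elapsed-time check can be sketched as follows; the set time of 30 days is an assumed example, and real devices would use hardware timers rather than plain arithmetic.

```python
# Hedged sketch of embodiment 4's elapsed-time check: the calibration
# completion time is stored, the current time is read periodically, and when
# the elapsed time reaches the set time an image prompting reacquisition of
# the teacher information is requested. The 30-day set time is assumed.

SET_TIME_SECONDS = 30 * 24 * 3600  # hypothetical recalibration interval

def reacquisition_due(completion_time, current_time, set_time=SET_TIME_SECONDS):
    """True when the elapsed time since calibration reaches the set time."""
    return (current_time - completion_time) >= set_time

assert reacquisition_due(0, SET_TIME_SECONDS) is True
assert reacquisition_due(0, SET_TIME_SECONDS - 1) is False
```

  • After recalibration, the stored completion time would be rewritten (or the timer reset), restarting the interval, as described below.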
  • the elapsed time determination unit 44 may be incorporated into the computer together with the reception information acquisition unit 41 and the machine learning unit 42.
  • When an image prompting reacquisition of the teacher information is displayed on the display screen of the panel 10, the user causes the operation panel device to reacquire the teacher information.
  • By performing the calibration execution operation in the operation panel device, that is, by generating the teacher information S(t, g, h | x, y) and storing it in the teacher information storage unit 422, the teacher information S(t, g, h | x, y) stored in the teacher information storage unit 422 can be rewritten.
  • At this time, the elapsed time determination unit 44 receives the new calibration completion time from the teacher information generation unit 423 and rewrites the stored calibration completion time. If the elapsed time determination unit 44 has a timer, it resets the timer.
  • The operation panel device according to the fourth embodiment has the same effects as the operation panel device according to the first embodiment.
  • Further, since the elapsed time determination unit 44 outputs elapsed time image information that causes the display screen of the panel 10 to display an image prompting reacquisition of the teacher information S(t, g, h) when the set time has elapsed since calibration, the teacher information generation unit 423 can regenerate the teacher information S(t, g, h) at appropriate intervals.
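The elapsed-time check described above can be pictured with a minimal sketch. This is purely illustrative: the class and method names (`ElapsedTimeDeterminationUnit`, `needs_recalibration`, and so on) are invented for this example and do not appear in the patent, which specifies no implementation.

```python
import time

# Illustrative sketch of the elapsed time determination unit 44: it stores
# the calibration completion time and periodically checks whether the set
# time has elapsed, in which case an image prompting reacquisition of the
# teacher information S(t, g, h) would be displayed. All names hypothetical.
class ElapsedTimeDeterminationUnit:
    def __init__(self, set_time_seconds):
        self.set_time = set_time_seconds          # threshold before re-calibration
        self.calibration_completion_time = None   # set when calibration finishes

    def store_completion_time(self, completion_time):
        # Called when the teacher information generation unit completes
        # generating (or regenerating) the teacher information.
        self.calibration_completion_time = completion_time

    def needs_recalibration(self, current_time=None):
        # Compare the (periodically read) current time with the stored
        # calibration completion time.
        if self.calibration_completion_time is None:
            return False
        if current_time is None:
            current_time = time.time()
        return (current_time - self.calibration_completion_time) >= self.set_time
```

Re-storing the completion time after a fresh calibration plays the role of "rewriting the calibration completion time" (or resetting the timer) in the description above.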
  • Embodiment 5. An operation panel device according to Embodiment 5 will be described using FIG. 12.
  • The operation panel device according to the fifth embodiment differs from the operation panel device according to the first embodiment in that a device temperature determination unit 45 and a device temperature measurement unit 46 are further provided, and is otherwise the same. Note that in FIG. 12, the same reference numerals as those in FIGS. 1 to 8 indicate the same or equivalent parts.
  • The teacher information generation unit 423 acquires the temperature of the device, in this example the temperature of the panel 10, together with the teacher information S(t, g, h).
  • When the temperature of the panel 10 changes significantly, the device temperature determination unit 45 causes the display screen of the panel 10 to display an image prompting reacquisition of the teacher information S(t, g, h).
  • The position estimation operation and the calibration execution operation, that is, the operation of generating teacher information S(t, g, h) in which calibration result data is associated with position information (x, y) on the display screen of the panel 10, are the same as the position estimation operation according to the flowchart shown in FIG. 7 and the calibration execution operation according to the flowchart shown in FIG. 8, so their description is omitted.
  • The operation of the device temperature determination unit 45 related to the teacher information S(t, g, h) is as follows.
  • The device temperature measurement unit 46 detects the temperature of the device, in this example the temperature of the panel 10.
  • When the teacher information generation unit 423 generates the teacher information S(t, g, h), it acquires the temperature of the panel 10 detected by the device temperature measurement unit 46.
  • The device temperature determination unit 45 stores, as a set value, the temperature of the panel 10 at the time of completion of the calibration execution operation, received from the teacher information generation unit 423.
  • The device temperature determination unit 45 uses the temperature of the panel 10 periodically acquired via the teacher information generation unit 423 as a comparison value and compares the stored set value with the comparison value. If the difference between the set value and the comparison value is equal to or greater than a difference set value, the device temperature determination unit 45 generates and outputs temperature image information for displaying an image that prompts reacquisition of the teacher information S(t, g, h).
  • Although the device temperature determination unit 45 acquires the temperature of the panel 10 detected by the device temperature measurement unit 46 via the teacher information generation unit 423, it may acquire the temperature directly from the device temperature measurement unit 46. Further, the device temperature determination unit 45 may provide the temperature image information to the panel 10 via the control unit 50 without going through the teacher information generation unit 423.
  • When an image prompting reacquisition of the teacher information is displayed on the display screen of the panel 10, the user causes the operation panel device to reacquire the teacher information.
  • The calibration execution operation in the operation panel device, that is, the operation of reacquiring the teacher information S(t, g, h), is then performed, and the reacquired teacher information S(t, g, h) is stored in the teacher information storage unit 422.
  • When the teacher information generation unit 423 completes the operation of reacquiring the teacher information S(t, g, h), it acquires the temperature of the panel 10 from the device temperature measurement unit 46.
  • The device temperature determination unit 45 then rewrites the stored set value to the temperature of the panel 10 acquired from the device temperature measurement unit 46 and stores it as the new set value.
  • The operation panel device according to the fifth embodiment has the same effects as the operation panel device according to the first embodiment. Further, in the operation panel device according to the fifth embodiment, the device temperature determination unit 45 compares the temperature of the panel 10 with the stored set value, and when the difference between the two becomes equal to or greater than the difference set value, outputs temperature image information that causes the display screen of the panel 10 to display an image prompting reacquisition of the teacher information S(t, g, h), so the teacher information S(t, g, h) can be reacquired when the temperature of the panel 10 changes significantly.
  • Any of the reception sensitivity determination unit 43 shown in the third embodiment, the elapsed time determination unit 44 shown in the fourth embodiment, and the device temperature determination unit 45 shown in the fifth embodiment may be applied to the operation panel devices according to the other embodiments. Furthermore, the embodiments may be freely combined, any component of each embodiment may be modified, and any component may be omitted from each embodiment.
  • The operation panel device is suitable as a non-contact operation panel device serving as a user interface in various electronic devices having a touch display function, such as mobile terminals and portable devices. The present invention is also suitable for operation panel devices used in factories and other environments where users with dirty fingers would otherwise soil the display screen of the panel.
  • Reference Signs List: 10 display panel; 20 transparent antenna array; 2_11 to 2_mn, 2_1 to 2_N first to Nth transparent antennas; 30 transmitting/receiving circuit unit; 31 input/output switching unit; 31_1 to 31_N first to Nth input/output switches; 32 signal transmission unit; 321 signal generator; 322 output destination selector; 33 reception information generation unit; 33_1 to 33_N first to Nth signal receivers; 40 object position estimation unit; 41 reception information acquisition unit; 42 machine learning unit; 421 position estimation unit; 422 teacher information storage unit; 423 teacher information generation unit; 43 reception sensitivity determination unit; 44 elapsed time determination unit; 45 device temperature determination unit; 50 control unit.
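The temperature-based recalibration check of Embodiment 5 can likewise be sketched in a few lines. This is a hypothetical illustration, assuming only what the description states (a stored set value, a periodic comparison value, and a difference set value); the class and method names are invented and do not come from the patent.

```python
# Illustrative sketch of the device temperature determination unit 45:
# it stores the panel temperature at calibration completion as a set
# value and, when a periodically measured comparison value differs from
# the set value by at least the difference set value, signals that an
# image prompting reacquisition of the teacher information should be
# displayed. All names are hypothetical.
class DeviceTemperatureDeterminationUnit:
    def __init__(self, difference_set_value):
        self.difference_set_value = difference_set_value
        self.set_value = None   # panel temperature at calibration completion

    def store_set_value(self, temperature):
        # Called when the calibration execution operation completes
        # (and again after each reacquisition, rewriting the set value).
        self.set_value = temperature

    def needs_recalibration(self, comparison_value):
        # Compare the periodically measured panel temperature with the
        # stored set value.
        if self.set_value is None:
            return False
        return abs(comparison_value - self.set_value) >= self.difference_set_value
```

Calling `store_set_value` after a reacquisition corresponds to rewriting the set value with the new panel temperature, as in the description above.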

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Position Input By Displaying (AREA)

Abstract

This operation panel device comprises: a display panel (10) having a display screen; a plurality of transparent antennas (2_1) to (2_N) arranged two-dimensionally on the display screen of the display panel; a reception information generation unit (330) that generates, on the basis of reception signals from the plurality of transparent antennas (2_1) to (2_N) exposed to waves reflected by an object present over the display screen of the display panel (10), reception information that is associated with the transparent antennas (2_1) to (2_N) that received the reflected waves and that corresponds to the plurality of transparent antennas (2_1) to (2_N); a storage unit (422) that stores teacher information corresponding to the plurality of transparent antennas (2_1) to (2_N) and associated with position information on the display screen of the display panel (10); and an object position estimation unit (40) that performs machine learning on the reception information generated by the reception information generation unit (330) using the teacher information stored in the storage unit (422), and estimates the position of the object on the display screen of the display panel (10).
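One way to picture the estimation step in the abstract is as a lookup from a per-antenna reception vector to a screen position using the stored teacher information. The patent does not specify the learning algorithm, so the nearest-neighbour rule below is purely an assumed stand-in, and the function name and data layout are invented for illustration.

```python
import math

# Hypothetical sketch: each teacher entry pairs a reception vector (one
# value per transparent antenna) with a known (x, y) position on the
# display screen. A nearest-neighbour lookup stands in for the machine
# learning step, whose actual algorithm the patent does not specify.
def estimate_position(reception, teacher_info):
    """Return the (x, y) of the teacher entry whose reception vector is
    closest (in Euclidean distance) to the observed reception vector."""
    best_vector, best_position = min(
        teacher_info, key=lambda entry: math.dist(entry[0], reception)
    )
    return best_position
```

Under this sketch, calibration corresponds to populating `teacher_info` with reception vectors measured while the object is at known screen positions.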
PCT/JP2022/034129 2022-09-13 2022-09-13 Operation panel device comprising machine-learning-based position estimation unit WO2024057375A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2022/034129 WO2024057375A1 (fr) 2022-09-13 2022-09-13 Operation panel device comprising machine-learning-based position estimation unit


Publications (1)

Publication Number Publication Date
WO2024057375A1 true WO2024057375A1 (fr) 2024-03-21

Family

ID=90274416

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/034129 WO2024057375A1 (fr) 2022-09-13 2022-09-13 Operation panel device comprising machine-learning-based position estimation unit

Country Status (1)

Country Link
WO (1) WO2024057375A1 (fr)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170131395A1 (en) * 2014-06-25 2017-05-11 University Of Washington Devices, systems, and methods for detecting gestures using multiple antennas and/or reflections of signals transmitted by the detecting device
JP2017527146A (ja) * 2014-06-11 2017-09-14 Huawei Technologies Co., Ltd. Sensing screen, control circuit and control method thereof, and sensing screen device
EP3407175A1 (fr) * 2017-05-23 2018-11-28 Vestel Elektronik Sanayi ve Ticaret A.S. Dispositifs électroniques portables et procédé de détermination d'un environnement électromagnétique à proximité d'un tel dispositif
US20190086971A1 (en) * 2017-09-21 2019-03-21 Google Llc Access to High Frame-Rate Radar Data via a Circular Buffer
JP2020144647A (ja) * 2019-03-07 2020-09-10 Toyo Aluminium K.K. Position detection device
JP2021504771A (ja) * 2018-08-24 2021-02-15 Google LLC Smartphone with radar system, system, and method


Similar Documents

Publication Publication Date Title
US11481040B2 (en) User-customizable machine-learning in radar-based gesture detection
CN107430443B (zh) Gesture recognition based on wide-field radar
US9952720B2 (en) Capacitive touch screen interference detection and operation
CN102163096B (zh) Information processing device and information processing method
US7292229B2 (en) Transparent digitiser
US20110025619A1 (en) Electronic analysis circuit with modulation of scanning characteristics for passive-matrix multicontact tactile sensor
JP6216145B2 (ja) Touch panel controller and semiconductor device
US20070211022A1 (en) Method and device for three-dimensional sensing
US20070085828A1 (en) Ultrasonic virtual mouse
JP6806250B2 (ja) Position measurement device, position measurement method, and program
JP2019527400A (ja) Touch-sensitive keyboard
US10775899B2 (en) Touch sensitive keyboard
CN109983427B (zh) Selecting a correlation reference based on noise estimation
CN107430463A (zh) Detection with a capacitance-based digitizer sensor
WO2018085168A1 (fr) Locating an active stylus on a capacitive sensor
CN107612595B (zh) Antenna switching method and mobile terminal
WO2024057375A1 (fr) Operation panel device comprising machine-learning-based position estimation unit
CN105446563B (zh) Hybrid sensing with reduced latency
CN110667287B (zh) Trace removal method and related product
JP6952753B2 (ja) Active pen position detection method and sensor controller
TW202411826A (zh) Operation panel device, object position estimation method, object position estimation program, and recording medium
US11531425B1 (en) Inter-band harmonics interference mitigation for multi-frequency-region parallel scan
KR100699670B1 (ko) Display system using a remote input device
CN109391309A (zh) Signal transmission method, apparatus, and terminal
US11625133B2 (en) Information processing system, position indicator, and method of controlling movement of display object on display screen of information processing apparatus

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22958710

Country of ref document: EP

Kind code of ref document: A1