WO2022215346A1 - Electronic device - Google Patents

Electronic device

Info

Publication number
WO2022215346A1
WO2022215346A1 PCT/JP2022/005227 JP2022005227W
Authority
WO
WIPO (PCT)
Prior art keywords
operator
virtual operation
operation space
space
electronic device
Prior art date
Application number
PCT/JP2022/005227
Other languages
English (en)
Japanese (ja)
Inventor
順以 山口
Original Assignee
シャープ株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by シャープ株式会社
Priority to JP2023512839A
Publication of WO2022215346A1

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 - Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 - Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means

Definitions

  • Embodiments according to the present invention relate to electronic devices. This application claims priority to Japanese Patent Application No. 2021-065575 filed in Japan on April 8, 2021, the contents of which are incorporated herein.
  • Cited Document 1 describes a technique that uses a small distance sensor to detect gestures of an operator in a space, and recognizes the movement as an operation for performing predetermined information processing.
  • In the technique of Cited Document 1, however, there is a problem that, when the operator's finger motion is recognized, a motion for switching the character key type and a motion for actually selecting a character, for example, may be confused with each other and erroneously recognized.
  • Cited Document 1 does not disclose anything about such a problem.
  • An object of the present embodiments is to provide an electronic device capable of suppressing erroneous input.
  • One embodiment is an electronic device capable of accepting non-contact information input by an operator, comprising: a first detection unit that detects an inclination of the electronic device with respect to a predetermined reference axis; a measurement unit that measures the distance to the operator; a second detection unit that detects the position of the operator's eyes; and an acquisition unit that obtains the position of a virtual operation space for recognizing a non-contact information input operation by the operator, based on the inclination detected by the first detection unit, the distance measured by the measurement unit, and the eye position detected by the second detection unit.
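  • As a rough illustration of how these units relate, the following Python sketch models the claimed structure; every class and method name is a hypothetical illustration and is not taken from the application.

```python
# Minimal sketch of the claimed structure; every identifier here is a
# hypothetical illustration, not a name used in the application.
from dataclasses import dataclass
from typing import Protocol, Tuple


@dataclass
class VirtualOperationSpace:
    """Position of the virtual operation space relative to the display."""
    distance_from_display_cm: float        # offset along the axis perpendicular to the display
    center_xy: Tuple[float, float]         # center of the operation surface in the display plane
    tilt_deg: float                        # angle of the space with respect to the reference axis


class FirstDetectionUnit(Protocol):
    def detect_inclination_deg(self) -> float: ...            # tilt of the device vs. the reference axis


class MeasurementUnit(Protocol):
    def measure_distance_to_operator_cm(self) -> float: ...   # e.g. ToF distance to the operator's face


class SecondDetectionUnit(Protocol):
    def detect_eye_position(self) -> Tuple[float, float]: ... # eye position within the imaging range


class AcquisitionUnit(Protocol):
    def obtain_space(self, inclination_deg: float, distance_cm: float,
                     eye_xy: Tuple[float, float]) -> VirtualOperationSpace: ...
```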
  • FIG. 1 is an external view of a smartphone according to a first embodiment.
  • FIGS. 2A and 2B are an external perspective view and a sectional view, respectively, of the smartphone according to the first embodiment.
  • FIGS. 3A and 3B are hardware configuration diagrams of the smartphone according to the first embodiment, and FIG. 3C is a functional block diagram of a CPU according to the first embodiment.
  • FIG. 4A is a flowchart showing the operation of the smartphone according to the first embodiment, and FIG. 4B is a flowchart showing the operation of the smartphone according to a modification of the first embodiment.
  • FIG. 5 is a conceptual diagram of a determination table according to the first embodiment.
  • FIG. 6A is a conceptual diagram of an imaging region according to the first embodiment, and FIG. 6B is a conceptual diagram showing the inclination of the smartphone according to the first embodiment.
  • FIGS. 7A to 7C are conceptual diagrams showing a method of determining the position of the virtual operation space in the smartphone according to the first embodiment.
  • FIGS. 8A and 8B are hardware configuration diagrams of a smartphone according to a second embodiment, and FIG. 8C is a functional block diagram of the CPU according to the second embodiment.
  • FIG. 9 is a flowchart showing the operation of the smartphone according to the second embodiment.
  • FIGS. 10A and 10B are conceptual diagrams showing a method of calibrating the virtual operation space in the smartphone according to the second embodiment.
  • FIG. 1 is an external perspective view of a smartphone according to this embodiment.
  • As shown in FIG. 1, the smartphone 10 includes, for example, a power button 11, a display 12, a camera 13, a ToF (Time of Flight) sensor 14, a speaker 15, a microphone 16, and a USB (Universal Serial Bus) terminal 17.
  • The power button 11 is used to power on the smartphone 10 in response to an operation by the operator.
  • The display 12 has a touch panel function.
  • The display 12 displays various application screens and, through the touch panel function, accepts various information input by the operator, such as character input and selection operations. Information such as characters and numbers can also be input by the operator without touching the display 12, the details of which will be described later.
  • FIG. 1 shows, as an example, a case where a numeric keypad is displayed on the display 12 as a screen for inputting information from the user.
  • The camera 13 is, for example, an in-camera (front-facing camera) for imaging the operator or the like.
  • The ToF sensor 14 measures the distance to an object using, for example, infrared light. Note that the camera 13 and the ToF sensor 14 may be integrated.
  • The speaker 15 outputs voice during a call, and the microphone 16 receives the voice of the operator during a call.
  • The USB terminal 17 is used for charging the smartphone 10 and transferring information.
  • FIG. 2A is an external perspective view of the smartphone 10 similar to FIG. 1, and FIG. 2B is a cross-sectional view of the smartphone 10; both show an example in which the operator inputs numbers on the numeric keypad by a non-contact operation.
  • A virtual operation space 19 is provided above the display 12 of the smartphone 10 (that is, in the space between the display 12 and the operator).
  • The virtual operation space 19 is provided at a position separated from the display 12 by a certain distance.
  • This virtual operation space is provided at an appropriate position according to various conditions; the method of determining this position will be described in detail later.
  • The virtual operation space 19 may or may not actually be displayed in the space so that the operator can recognize it; in other words, it may or may not actually be visible to the operator.
  • FIG. 3A is a hardware configuration diagram showing the internal configuration of the smartphone 10 according to this embodiment.
  • The smartphone 10 includes a CPU (Central Processing Unit) 101, a ROM (Read Only Memory) 102, a RAM (Random Access Memory) 103, a gyro sensor 104, a gyro data processing unit (first detection unit) 105, a ToF sensor 106 (the ToF sensor 14 described in FIG. 1), a ToF data processing unit (measurement unit) 107, a camera 108 (the camera 13 described in FIG. 1), an imaging data processing unit (second detection unit) 109, a position determination unit (acquisition unit) 110, and an input determination unit 111.
  • The CPU 101 controls the operation of the smartphone 10 as a whole.
  • Various processors can be used as the CPU 101; the processor is not necessarily limited to a CPU.
  • The ROM 102 holds various programs and data, such as a program 120 and a determination table 121, for operating the smartphone 10.
  • The RAM 103 functions as a work area for the CPU 101 and holds various programs and data.
  • The gyro sensor 104 detects changes in the rotation angle and orientation of the smartphone 10 as angular velocities, and transfers the result to the gyro data processing unit 105.
  • The gyro data processing unit 105 calculates, for example, the angle of the smartphone 10 with respect to the horizontal axis based on the data obtained from the gyro sensor 104.
  • Note that the angle of the smartphone 10 is not necessarily calculated with respect to the horizontal axis; any predetermined reference axis or reference plane may be used.
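  • As one way to picture the processing of the gyro data processing unit 105, the sketch below integrates angular-velocity samples into a tilt angle. This is an illustrative assumption only; the application does not specify the algorithm, and a practical implementation would also need drift correction.

```python
# Hypothetical sketch: deriving a tilt angle from gyro angular-velocity samples
# by plain integration. A real gyro data processing unit would also correct for
# drift (for example by fusing an accelerometer or re-referencing periodically).
def integrate_tilt_deg(samples_deg_per_s, dt_s, initial_tilt_deg=0.0):
    """samples_deg_per_s: angular velocity about the axis of interest, one sample per dt_s seconds."""
    tilt = initial_tilt_deg
    for omega in samples_deg_per_s:
        tilt += omega * dt_s
    # Clamp to the 0-90 degree range used later by the determination table (FIG. 5).
    return max(0.0, min(90.0, tilt))
```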
  • The ToF sensor 106 detects the distance to an object positioned on the display 12 side of the smartphone 10 and transfers the result to the ToF data processing unit 107.
  • The ToF data processing unit 107 calculates the distance to the face, for example the eyes, of the operator of the smartphone 10 based on the data obtained from the ToF sensor 106.
  • The camera 108 captures an image of an object located on the display 12 side of the smartphone 10 at a certain angle of view, and transfers the result to the imaging data processing unit 109.
  • The imaging data processing unit 109 detects the positions of the operator's eyes captured by the camera 108 and calculates the relative positional relationship between the smartphone 10 and the operator. The imaging data processing unit 109 also detects the position and movement of the operator's finger captured by the camera 108.
  • The position determination unit 110 calculates an appropriate position (coordinates) of the virtual operation space 19 described with reference to FIGS. 2A and 2B, based on the angle obtained by the gyro data processing unit 105, the distance to the face obtained by the ToF data processing unit 107, the eye position obtained by the imaging data processing unit 109, and the like.
  • The input determination unit 111 determines whether the position and motion of the operator's finger are within the virtual operation space 19 obtained by the position determination unit 110. When they are within the virtual operation space 19, the input determination unit 111 accepts an input operation by the operator based on the finger motion.
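  • A minimal sketch of the containment test performed by such an input determination unit is shown below; the axis-aligned box model and all names are illustrative assumptions (the actual space may be tilted, in which case the finger coordinates would first be transformed into the space's local frame).

```python
# Illustrative containment check for an input determination unit.
# The virtual operation space is modeled as an axis-aligned box above the display:
# x/y in the display plane, z measured perpendicular to the display surface.
from typing import Callable, Tuple

Point3D = Tuple[float, float, float]


def finger_in_operation_space(finger: Point3D, center: Point3D,
                              size_xyz: Tuple[float, float, float]) -> bool:
    """True if the finger position lies inside the box centered at `center`."""
    return all(abs(f - c) <= s / 2.0 for f, c, s in zip(finger, center, size_xyz))


def accept_input(finger: Point3D, center: Point3D,
                 size_xyz: Tuple[float, float, float],
                 on_operation: Callable[[Point3D], None]) -> None:
    # Only finger positions inside the virtual operation space are treated as input.
    if finger_in_operation_space(finger, center, size_xyz):
        on_operation(finger)
```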
  • FIG. 3B is a hardware configuration diagram showing the internal configuration of the smartphone 10 according to a modification of FIG. 3A.
  • FIG. 3C is a functional block diagram of CPU 101 when program 120 is executed.
  • That is, the gyro data processing unit 105, the ToF data processing unit 107, the position determination unit 110, and the input determination unit 111 described in FIG. 3A may be implemented not as hardware but as processing units realized by the CPU 101 executing the program 120.
  • The same applies to the imaging data processing unit 109; however, as shown in FIG. 3B, the imaging data processing unit 109 may be realized by a processor different from the CPU (a GPU (Graphics Processing Unit) 112 in the example of FIG. 3B).
  • FIG. 4A is a flowchart showing the operation of the smartphone 10.
  • A method by which the smartphone 10 determines the position of the virtual operation space 19 and a method of detecting the operator's motion in the virtual operation space 19 will be described below.
  • The gyro sensor 104 first acquires angular velocity data of the smartphone 10 (step S10).
  • Next, the gyro data processing unit 105 detects, for example, the tilt θ1 of the smartphone 10 with respect to the horizontal axis based on the angular velocity data obtained in step S10, and stores it in the RAM 103 (step S11).
  • The ToF sensor 106 also acquires distance data to objects within the imaging range (step S12).
  • Next, the ToF data processing unit 107 calculates the distance L to the operator's face, for example the eyes, based on the distance data obtained in step S12, and stores it in the RAM 103 (step S13).
  • Further, the camera 108 takes an image of objects within the angle of view θ2 (step S14). Based on the image data obtained in step S14, the imaging data processing unit 109 detects the operator's face, for example the eyes, calculates the relative positional relationship between the smartphone 10 and the operator, and stores it in the RAM 103 (step S15).
  • The above steps S10, S12, and S14 are executed, for example, in parallel. Then, based on the tilt θ1 obtained in step S11, the distance L to the operator's face obtained in step S13, and the relative positional relationship between the smartphone 10 and the operator's face obtained in step S15, the position determination unit 110 calculates an appropriate position of the virtual operation space 19 and stores it in the RAM 103 (step S16). The calculation of the appropriate position of the virtual operation space 19 in step S16 can be performed using, for example, a determination table 121 prepared in advance. The determination table will be described later with reference to FIG. 5.
  • Next, the camera 108 captures an image of objects within the angle of view θ2 (step S17); this imaging may be performed together with step S14. Subsequently, the imaging data processing unit 109 determines whether or not the operator's finger is detected in the imaging data obtained in step S17 (step S18).
  • When the operator's finger is detected (step S19), the ToF data processing unit 107 calculates the distance to the operator's finger (step S20). Then, the camera 108 transfers the imaging data obtained in step S17 (or S14) to the input determination unit 111, and the ToF data processing unit 107 transfers the calculation result of step S20 to the input determination unit 111 (step S21).
  • Note that the calculation result obtained in step S20 may be temporarily held in the RAM 103 and read from the RAM 103 by the input determination unit 111.
  • Next, the input determination unit 111 determines whether or not the position and movement of the operator's finger are within the virtual operation space 19 (step S22). If it is determined that the operator's finger is within the virtual operation space 19 (step S23, YES), the input determination unit 111 regards the detected finger movement of the operator as an operation command to the smartphone 10. The CPU 101 then performs processing based on that operation command (step S24).
  • Note that the ToF data processing unit 107 may use the imaging data acquired by the camera 108 to calculate the distance to a specific part of the operator, such as the face or, more finely, the eyes. An example of this case will be described with reference to FIG. 4B.
  • FIG. 4B is a modification of the method described with reference to FIG. 4A, and relates to the case where the ToF data processing unit 107 measures the distance to a specific part using the imaging data.
  • In this case, the imaging data processing unit 109 detects, for example, the operator's eyes based on the imaging data.
  • The imaging data processing unit 109 then transmits data on the positions of the eyes to the ToF data processing unit 107 (step S30).
  • The ToF data processing unit 107 calculates the distance to the eyes based on the data regarding the eye positions received from the imaging data processing unit 109 (step S31).
  • In this way, the ToF data processing unit 107 may use the imaging data.
  • Alternatively, the camera 108 may be a ToF three-dimensional image sensor, and, for example, the ToF data processing unit 107 and the imaging data processing unit 109 described with reference to FIG. 3A may be integrated. The configuration is not limited as long as it can calculate the distance to a predetermined part of the operator, for example the face, and specify the position of a predetermined part of the operator, for example the eyes.
  • FIG. 5 is a conceptual diagram of the determination table 121.
  • The determination table 121 holds information about the position (spatial coordinates) of the virtual operation space 19 for each combination of the position of the face (the eyes in this example) within the imaging range obtained by the imaging data processing unit 109, the angle of the smartphone 10 with respect to, for example, the horizontal axis obtained by the gyro data processing unit 105, and the distance L to the operator's face (the eyes in this example) obtained by the ToF data processing unit 107.
  • As shown in FIG. 6A, the positions of the eyes in the imaging range are categorized into three regions A, B, and C in the height direction (for example, the longitudinal direction of the housing of the smartphone 10) within the imaging range 130 obtained by the camera 108.
  • Region A is the upper 1/3 of the imaging range 130, region B is the central 1/3, and region C is the lower 1/3 of the imaging range 130.
  • Note that FIG. 6A is only an example: the imaging range need not be divided into three in the height direction, and may be divided into two or into four or more regions; it may also be divided into a plurality of regions in the width direction instead of the height direction, or in both the height direction and the width direction.
  • The angle θ1 of the smartphone 10 is, for example, the angle of the smartphone 10 with respect to the horizontal axis, as shown in FIG. 6B. More specifically, it is the angle formed between the horizontal axis and the back surface of the smartphone 10.
  • In the example of FIG. 5, the inclination θ1 is classified into three ranges: 0° or more and less than 30°, 30° or more and less than 60°, and 60° or more and 90° or less.
  • In this example, the range of the angle θ1 is 0° or more and 90° or less, but the range may be narrower or wider than this.
  • Further, the reference axis for the angle θ1 is not limited to the horizontal axis.
  • The distance L is the distance to the operator's face measured by the ToF sensor 106, and may be, for example, the distance to the operator's eyes. In the example of FIG. 5, the distance L is classified into four ranges: 0 cm or more and less than 15 cm, 15 cm or more and less than 30 cm, 30 cm or more and less than 45 cm, and 45 cm or more.
  • The determination table 121 holds data on the position (spatial coordinates) of the virtual operation space 19 for each combination of these conditions. For example, when the position of the eyes is within region A and the angle θ1 is within the range 0° ≤ θ1 < 30°, the following position data is held according to the distance L: "Aa-1" for 0 cm ≤ L < 15 cm, "Aa-2" for 15 cm ≤ L < 30 cm, "Aa-3" for 30 cm ≤ L < 45 cm, and "Aa-4" for 45 cm ≤ L. Likewise, when θ1 is within the range 30° ≤ θ1 < 60°, corresponding position data is held according to the distance L.
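  • The lookup can be pictured as a table keyed by the three classifications above. The sketch below uses the example region, angle, and distance ranges of FIG. 5; the helper names and the placeholder table values are assumptions for illustration.

```python
# Sketch of a determination-table lookup keyed by eye region, angle range, and
# distance range, using the example boundaries described for FIG. 5.
# The values ("Aa-1", ...) stand for the stored spatial coordinates of the
# virtual operation space; only a few entries are shown.
def classify_eye_region(eye_y: float, image_height: float) -> str:
    """eye_y is measured from the top of the imaging range 130 (0 = top edge)."""
    third = image_height / 3.0
    if eye_y < third:
        return "A"      # upper third
    if eye_y < 2 * third:
        return "B"      # central third
    return "C"          # lower third


def classify_angle(theta1_deg: float) -> str:
    if theta1_deg < 30:
        return "a"      # 0 <= theta1 < 30
    if theta1_deg < 60:
        return "b"      # 30 <= theta1 < 60
    return "c"          # 60 <= theta1 <= 90


def classify_distance(distance_cm: float) -> str:
    for label, upper_cm in (("1", 15), ("2", 30), ("3", 45)):
        if distance_cm < upper_cm:
            return label
    return "4"          # 45 cm or more


DETERMINATION_TABLE = {
    ("A", "a", "1"): "Aa-1",   # placeholder for the stored space coordinates
    ("A", "a", "2"): "Aa-2",
    ("B", "b", "2"): "Bb-2",
    # ... one entry for every combination of region, angle range, and distance range
}


def look_up_position(eye_y, image_height, theta1_deg, distance_cm):
    key = (classify_eye_region(eye_y, image_height),
           classify_angle(theta1_deg),
           classify_distance(distance_cm))
    return DETERMINATION_TABLE.get(key)
```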
  • The information indicating the position of the virtual operation space 19 is, for example, the data shown in double quotation marks above, such as "Aa-1". More specifically, when the display surface of the display 12 is taken as a two-dimensional plane of the X axis and the Y axis and the direction perpendicular to the display surface as the Z axis, "Aa-1" includes information such as the distance along the Z axis between the display 12 and the virtual operation space 19, the X-axis and Y-axis coordinates of the operation surface for the operator in the virtual operation space 19, and, for example, the angle of the virtual operation space 19 with respect to the horizontal axis.
  • In other words, the above position data is coordinate information such that the operator's eyes, a predetermined position (for example, near the center) of the virtual operation space 19, and a predetermined position (for example, near the center of the numeric keypad) of the actual input screen displayed on the display 12 form a straight line.
  • For example, when the numeric keypad 18 is displayed on the display 12 as shown in FIG. 2A, the coordinate information is set so that the operator's eyes, the corresponding area of the virtual operation space 19, and the area for receiving the number "5" or "8" on the numeric keypad 18 are aligned.
  • Although FIG. 2A shows the case of the numeric keypad 18, a QWERTY-layout keyboard may also be used.
  • In that case, the coordinate information may be selected so that the operator's eyes, the area in the virtual operation space 19 that receives the keys "G", "H", "J", or "K", and the area of the keys "G", "H", "J", or "K" on the display 12 are aligned.
  • Further, the screen displayed on the display 12 may allow the operator to select between two areas such as "YES" and "NO", or to select a single area such as "OK".
  • In any case, the virtual operation space 19 may be provided so as to be aligned with the position of the operator's eyes. In other words, it is sufficient that the coordinate information realizes a positional relationship in which, when the operator taps a certain area in the virtual operation space 19, the operator can recognize that the area intended by the operator on the display screen of the display 12 has correctly received the input. By using the determination table 121 in this way, the appropriate position of the virtual operation space 19 can be easily obtained in step S16.
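  • One way to read this collinearity condition: given the eye position, the target point on the display, and a chosen offset of the operation surface from the display, the corresponding point of the virtual operation space lies on the line of sight between them. The sketch below is an illustrative assumption using that reading; the coordinate conventions and names are not from the application.

```python
# Sketch of the collinearity condition: the operator's eye, a reference point of
# the virtual operation space, and the corresponding point on the display screen
# lie on one straight line. X/Y are taken in the display plane and Z perpendicular
# to it (Z = 0 on the display surface, eye at z > 0).
from typing import Tuple

Point3D = Tuple[float, float, float]


def point_on_sightline(eye: Point3D, display_target: Point3D, space_z: float) -> Point3D:
    """Return the point at height space_z above the display that lies on the
    straight line from the eye to display_target (whose Z component is 0)."""
    ex, ey, ez = eye
    dx, dy, _ = display_target
    t = (ez - space_z) / ez          # fraction of the way from the eye toward the display
    return (ex + t * (dx - ex), ey + t * (dy - ey), space_z)


# Example: the eye is 30 cm above the display and offset 12 cm along Y from the
# center of the "5" key; with the operation surface 10 cm above the display, the
# accepted region for "5" is centered on the sightline at Z = 10 cm.
if __name__ == "__main__":
    center_of_5_key = (0.0, 0.0, 0.0)
    eye = (0.0, 12.0, 30.0)
    print(point_on_sightline(eye, center_of_5_key, 10.0))   # -> (0.0, 4.0, 10.0)
```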
  • FIG. 7A is a cross-sectional view of the smartphone 10 according to this embodiment, showing the position of the virtual operation space 19A corresponding to the angle of the smartphone 10 with respect to the horizontal axis and the position of the operator's eyes.
  • In FIG. 7A, the smartphone 10 has an angle θ1 of 30° with respect to the horizontal axis, the distance L measured by the ToF sensor 14 is 20 cm, and the operator's eye 200 is positioned within region B of the imaging range 130 of the camera 13.
  • This case corresponds to the position data "Bb-2" in the determination table 121 described with reference to FIG. 5. Therefore, the position determination unit 110 determines the position of the virtual operation space 19A based on the position data "Bb-2", and the input determination unit 111 detects the operator's finger in the virtual operation space 19A.
  • Note that the area of the operation surface in the virtual operation space 19 may be set larger than the display area of the numeric keypad 18 on the display 12 in order to facilitate key input by the operator.
  • FIG. 7B shows a case where the position of the operator's eyes 200 has moved to region C of the imaging range 130 from the state of FIG. 7A.
  • In FIG. 7B, the virtual operation space 19A determined in the case of FIG. 7A is indicated by a dashed line.
  • In this example, the distance L measured by the ToF sensor 14 is 35 cm.
  • This case corresponds to the position data "Cb-3" in the determination table 121 described with reference to FIG. 5. Therefore, the position determination unit 110 determines the position of the virtual operation space 19B based on the position data "Cb-3", and the input determination unit 111 detects the operator's finger in the virtual operation space 19B.
  • Comparing the examples of FIGS. 7A and 7B: suppose that the virtual operation space 19A remained at the position shown in FIG. 7A regardless of the position of the operator's eyes shown in FIG. 7B.
  • If the operator then moved his or her finger to the position of the virtual operation space 19A intending to tap, for example, the central portion in the Y direction of the numeric keypad 18 on the display 12, the operator's line of sight would be shifted compared to the case of FIG. 7A. As a result, in the virtual operation space 19A, it would be determined that a portion below the central portion had been tapped, causing an erroneous input.
  • For example, although the operator intended to select the number "4" on the numeric keypad, the smartphone 10 might recognize that the key "0" located below it was selected.
  • In this embodiment, therefore, the position of the virtual operation space 19 is changed according to the angle θ1, the distance L, and the position of the operator's eyes.
  • In the example of FIG. 7B, the virtual operation space 19B is provided at a position corresponding to the shift of the line of sight. This can reduce the occurrence of erroneous input by the operator.
  • FIG. 7C shows another example.
  • In FIG. 7C, the smartphone 10 has an angle θ1 of 60° with respect to the horizontal axis.
  • First, a virtual operation space 19C, indicated by broken lines in FIG. 7C, is provided.
  • Assume that the position of the operator's eye 200 then changes, the distance L measured by the ToF sensor 14 becomes 45 cm, and the eye 200 is located within region A of the imaging range 130 of the camera 13.
  • the position data "Ac-4" corresponds to the determination table 121 described with reference to FIG.
  • the position determining section 110 determines the position of the virtual operation space 19D based on the position data "Ac-4". That is, since the operator looks down on the smartphone 10 from above rather than from the front, the virtual operation space 19D is also provided above the virtual operation space 19C along the Y direction according to this positional difference. In the example of FIG. 7C, the distance between the display 12 and the virtual operation space 19D and the distance between the display 12 and the virtual operation space 19C in the Z-axis direction also change. This is determined by the determination table 121, but is not limited as long as the virtual operation space 19 is provided at a position where input is easier based on the viewpoint of the operator.
  • the virtual operation space is preferably positioned at a distance of, for example, 10 cm from the smartphone 10.
  • the angle of the virtual operation space 19D with respect to the horizontal axis may also differ from that of the virtual operation space 19C. This also applies to the example of FIG. 7B.
  • FIG. 8A is a hardware configuration diagram showing the internal configuration of the smartphone 10 according to this embodiment, and corresponds to FIG. 3A described in the first embodiment.
  • The configuration according to the present embodiment differs from FIG. 3A described in the first embodiment in that the smartphone 10 further includes a calibration control unit (adjustment unit) 140 and that calibration data 122 is held in the RAM 103.
  • The calibration control unit 140 calculates correction data for adjusting, based on the parallax of the operator, the position of the virtual operation space 19 determined by the method described in the first embodiment.
  • The calibration data 122 is adjustment data for the position of the virtual operation space 19 based on the correction data calculated by the calibration control unit 140.
  • The configuration of FIG. 8A may also be implemented by software, as in the first embodiment.
  • FIGS. 8B and 8C are, respectively, a hardware configuration diagram showing the internal configuration of the smartphone 10 according to a modification of FIG. 8A and a functional block diagram of the CPU 101 when the program 120 is executed. As illustrated, the functions of the calibration control unit 140 may also be realized by the CPU 101 executing the program 120.
  • FIG. 9 is a flowchart showing the operation of the smartphone 10 according to this embodiment. The following description focuses on the operation of adjusting the position of the virtual operation space 19 by the calibration control unit 140.
  • The operations of FIG. 9 may be performed, for example, when the smartphone 10 is powered on for the first time by the operator, every time the power is turned on, and/or in response to a setting command within the smartphone.
  • As shown in FIG. 9, the calibration control unit 140 first notifies the operator of the start of calibration by displaying a message (for example, "The input surface will now be adjusted") on the display 12 (step S30). Subsequently, the calibration control unit 140 displays data for calibration on the display 12 (step S31). Further, the calibration control unit 140 prompts the operator to adjust the angle of the smartphone 10 and the position of the operator's face to appropriate positions (step S32). Specifically, a message is displayed on the display 12 so that, for example, the operator's eye position comes onto a substantially vertical line from the center of the calibration data on the display 12, in other words, so that the operator's line of sight faces the center of the calibration data.
  • When the calibration control unit 140 determines, based on the detection results from the gyro sensor 104, the ToF sensor 106, and the camera 108, that the angle of the smartphone 10 and the position of the operator's face are appropriate, the calibration control unit 140 prompts the operator to tap a specific portion of the calibration data in the virtual operation space (step S33). At this time as well, the calibration control unit 140 displays a message on the display 12, for example a message asking the operator to tap a specific place.
  • Steps S10 to S21 described with reference to FIGS. 4A and 4B in the first embodiment are also executed.
  • Note that step S16 is completed at least before step S33 is executed.
  • After step S33, when the operator taps the virtual operation space 19 (steps S17 to S21), the calibration control unit 140 detects the difference between the position obtained in step S16 and the position actually tapped by the operator in the virtual operation space 19. This difference is stored in the RAM 103 as the calibration data 122 (step S34). After that, the calibration control unit 140 notifies the operator of the completion of the calibration (step S35). Thereafter, for example, the position determination unit 110 of the smartphone 10 corrects the position of the virtual operation space 19 obtained in step S16 using the calibration data 122, and determines the position of the virtual operation space 19 (step S36).
  • FIGS. 10A and 10B are external perspective views of the smartphone 10 during execution of the calibration operation.
  • As described above, the calibration control unit 140 displays the start of calibration on the display 12, and then displays data 20 for calibration on the display 12 as shown in FIG. 10A.
  • In the example of FIG. 10A, the letters "A" to "E" are displayed at the four corners and the center of a (3 × 3) display area, but the calibration data 20 is not limited to this example.
  • Further, the calibration control unit 140 displays on the display 12 a message urging the operator to bring the smartphone 10 and the operator into appropriate positions; in the example of FIG. 10A, a message 21 prompting the operator is displayed on the display 12.
  • Then, the virtual operation space 19E accepts the operator's finger motion. Note that the area of the operation surface in the virtual operation space 19 may be set larger than the display area of the calibration data 20 on the display 12 in order to facilitate input by the operator.
  • When the position actually tapped by the operator deviates from the virtual operation space 19E, the calibration control unit 140 recognizes that there is a deviation, caused by parallax, between the actual virtual operation space 19E and the position recognized by the operator. The calibration control unit 140 therefore detects the amount Δ and the direction of the deviation between the actually tapped position and the current virtual operation space. These data are stored in the RAM 103 as the calibration data 122. After that, when the smartphone 10 receives an input from the operator in the virtual operation space 19, the smartphone 10 sets the position of the virtual operation space 19 to a position shifted leftward by Δ from the position obtained in step S16.
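  • The calibration can be pictured as storing the measured offset once and then applying it whenever the position of the virtual operation space is recomputed. The sketch below is an illustrative assumption; the names and the vector representation are not from the application.

```python
# Illustrative sketch of the calibration in the second embodiment: the offset
# between where the operator actually tapped and where the current virtual
# operation space expected the tap is stored as calibration data (step S34)
# and applied to every position later obtained in step S16 (step S36).
def compute_calibration_offset(expected_tap, actual_tap):
    """Amount and direction of the deviation (actual minus expected), per axis."""
    return tuple(a - e for a, e in zip(actual_tap, expected_tap))


def apply_calibration(space_center, offset):
    """Shift the position obtained from the determination table by the stored offset."""
    return tuple(c + o for c, o in zip(space_center, offset))


# Example: the operator tapped 1.5 cm to the left of where the space expected it,
# so every later space position is shifted 1.5 cm to the left as well.
offset = compute_calibration_offset(expected_tap=(0.0, 0.0, 10.0),
                                    actual_tap=(-1.5, 0.0, 10.0))
print(apply_calibration((2.0, 4.0, 10.0), offset))   # -> (0.5, 4.0, 10.0)
```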
  • As described above, according to the present embodiments, erroneous input to the smartphone 10 operated in a non-contact manner can be suppressed.
  • Although the smartphone 10 was taken as an example in the above embodiments, the above configuration is widely applicable to other electronic devices.
  • For example, it can be applied to tablet PCs, televisions, automatic ticket vending machines for trains and movie theaters, automatic check-in machines at airports, cash registers at restaurants, and the like.
  • In such devices, the angle of the display is likely to be constant in many cases, and in that case the angle θ1 may not need to be taken into account in determining the position of the virtual operation space 19.
  • However, the angle of the electronic device itself may change when, for example, it is mounted on a ship or an aircraft; in such a case, the angle θ1 is preferably taken into account.
  • Further, in the above embodiments, the case where the operator's input is performed with a finger and the finger is detected by the camera 108 has been described as an example.
  • However, input by the operator is not limited to a finger; any member capable of designating a specific area, such as a touch pen (stylus pen), may be used.
  • If the imaging data processing unit 109 registers such members in advance, their detection may be treated as "a finger has been detected" in step S19 of FIGS. 4A and 4B.
  • Further, the imaging data may contain a plurality of human faces.
  • In that case, the imaging data processing unit 109 may select one face using, for example, face authentication processing. That is, the imaging data processing unit 109 causes the RAM 103 to hold face data of the operator photographed in advance by the camera 13; when a plurality of faces are recognized in step S14, authentication processing may be performed using the face data held in the RAM 103, and the position of the authenticated face may be calculated in step S15.
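  • A hedged sketch of that selection step is shown below, assuming a hypothetical face-embedding helper; the application only states that authentication is performed against previously stored face data of the operator and does not specify the method.

```python
# Hypothetical sketch of selecting the operator's face among several detected faces.
# `face_embedding`, the detected-face structure, and the threshold are assumptions
# for illustration; the application does not specify the face-authentication method.
def select_operator_face(detected_faces, stored_embedding, face_embedding, threshold=0.6):
    """detected_faces: iterable of (face_image, position) pairs.
    Returns the position of the face that best matches the stored operator data,
    or None if no face exceeds the similarity threshold."""
    best_position, best_score = None, threshold
    for face_image, position in detected_faces:
        score = cosine_similarity(face_embedding(face_image), stored_embedding)
        if score > best_score:
            best_position, best_score = position, score
    return best_position


def cosine_similarity(a, b):
    # Plain-Python cosine similarity between two embedding vectors.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sum(x * x for x in a) ** 0.5
    norm_b = sum(x * x for x in b) ** 0.5
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0
```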
  • The operations described in the first and second embodiments can be implemented, for example, by executing the program 120.
  • Further, the program 120 can be downloaded via the Internet or the like and held in the ROM 102 or the RAM 103, so that the operations described in the first and second embodiments can be realized even after the purchase of the electronic device.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The electronic device according to the embodiment is capable of receiving information input by an operator in a non-contact manner and comprises: a first detection unit that detects the inclination of the electronic device with respect to a predetermined reference axis; a measurement unit that measures the distance to the operator; a second detection unit that detects the position of the operator's eyes; and an acquisition unit that acquires, on the basis of the inclination detected by the first detection unit, the distance measured by the measurement unit, and the eye position detected by the second detection unit, the position of a virtual operation space for recognizing non-contact information input operations performed by the operator.
PCT/JP2022/005227 2021-04-08 2022-02-10 Electronic device WO2022215346A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2023512839A JPWO2022215346A1 (fr) 2021-04-08 2022-02-10

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021065575 2021-04-08
JP2021-065575 2021-04-08

Publications (1)

Publication Number Publication Date
WO2022215346A1 (fr)

Family

ID=83546306

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/005227 WO2022215346A1 (fr) 2021-04-08 2022-02-10 Electronic device

Country Status (2)

Country Link
JP (1) JPWO2022215346A1 (fr)
WO (1) WO2022215346A1 (fr)

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011175623A (ja) * 2010-01-29 2011-09-08 Shimane Prefecture 画像認識装置および操作判定方法並びにプログラム
WO2016103521A1 (fr) * 2014-12-26 2016-06-30 株式会社ニコン Dispositif de détection et programme

Also Published As

Publication number Publication date
JPWO2022215346A1 (fr) 2022-10-13

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22784331

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2023512839

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 22784331

Country of ref document: EP

Kind code of ref document: A1