WO2021192841A1 - Program and electronic device - Google Patents

Program and electronic device

Info

Publication number
WO2021192841A1
WO2021192841A1 (PCT/JP2021/007856)
Authority
WO
WIPO (PCT)
Prior art keywords
distance
executed
length
tip
computer
Application number
PCT/JP2021/007856
Other languages
French (fr)
Japanese (ja)
Inventor
Ryoichi Sato (諒一 佐藤)
Original Assignee
Brother Industries, Ltd. (ブラザー工業株式会社)
Application filed by Brother Industries, Ltd. (ブラザー工業株式会社)
Publication of WO2021192841A1

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S 13/02 Systems using reflection of radio waves, e.g. primary radar systems; Analogous systems
    • G01S 13/06 Systems determining position data of a target
    • G01S 13/42 Simultaneous measurement of distance and other co-ordinates
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range

Definitions

  • The technical field disclosed herein relates to programs and electronic devices that measure the distance between two points.
  • Patent Document 1 discloses a display input device that displays a preview image on a touch panel and measures the distance between the touch positions of two fingers touching the touch panel. When the touch position of one of the two fingers moves to another position within a predetermined period, the distance between the two fingers is measured again at the position after the movement.
  • A configuration is disclosed in which a display magnification is determined based on the distance measured first and the distance measured after the movement, and the preview image is displayed at the determined display magnification.
  • In the device of Patent Document 1, the measurement range is limited because the measurement cannot exceed the size of the touch panel. Further, the touch panel must be touched, so there is the restriction that the distance between two points cannot be measured when the touch panel cannot detect the touch position, for example when the user is wearing gloves.
  • This specification discloses a technique by which measurement is not easily restricted when an electronic device measures the distance between two points.
  • The program disclosed herein was made to solve the above-mentioned problems. It is a program executable by a computer of an electronic device equipped with an electromagnetic wave sensor, the electromagnetic wave sensor receiving electromagnetic waves having a wavelength of millimeters or smaller.
  • The program causes the computer to execute an acquisition process of acquiring a waveform signal based on the electromagnetic waves received by the electromagnetic wave sensor, and a detection process of detecting, based on the waveform signal acquired in the acquisition process, a predetermined gesture made with two fingers.
  • The predetermined gesture indicates an instructed length by the distance between the tip of one of the two fingers and the tip of the other finger.
  • For example, a user measures the length of an object with two fingers and indicates that length by the distance between the tip of one finger and the tip of the other finger.
  • By executing the program, the computer of the electronic device measures the length indicated by the predetermined gesture, that is, the distance between the two points between the tips of the two fingers, and can thereby obtain the length of the object. Since the measurement is not based on a touch position on the electronic device, the measurement points can be detected without contact, and the measurement range is not limited by the size of a touch panel, so measurement can be performed over a wide range.
  • In this way, a technique in which the measurement is not easily restricted is realized.
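The two-point measurement summarized above ultimately reduces to the distance between two fingertip positions recovered from the radar return. A minimal sketch follows; the patent does not specify a coordinate representation, so the (x, y, z) tuples in millimeters below are an assumption for illustration:

```python
import math

def fingertip_distance(tip_a, tip_b):
    """Distance between two fingertip positions.

    tip_a, tip_b: assumed (x, y, z) coordinates in millimeters, as a
    gesture-analysis step might produce from the radar waveform.
    """
    return math.dist(tip_a, tip_b)

# Fingertips 30 mm apart horizontally and 40 mm apart vertically are
# 50 mm apart in space.
print(fingertip_distance((0.0, 0.0, 100.0), (30.0, 40.0, 100.0)))  # 50.0
```

This models only the final distance computation; recovering fingertip positions from the millimeter-wave waveform is the harder, unspecified part.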
  • (A) is a flowchart showing the procedure of the calibration-mode processing within the gesture analysis processing by the gesture analysis application, and
  • (B) is an external view of the electronic device showing the display state in the calibration mode.
  • (A) is a flowchart showing the procedure of the measurement-mode processing within the gesture analysis processing by the gesture analysis application, and (B) is a flowchart showing part of the measurement-mode procedure according to a modified example.
  • (A) An external view of the electronic device showing a state in which the sheet object is selected by the user.
  • (B) An external view of a label attachment object having a label attachment area to which label paper created using the electronic device is attached.
  • (C) An external view of the electronic device in a state where a gesture is being performed by the user.
  • (D) An external view of the electronic device in a state where the length of the sheet object has been changed and displayed.
  • The electronic device 1 of this embodiment includes a controller 10 including a CPU 11 and a memory 12, and can be connected to the printer 2. The electronic device 1 further includes a display 20, an input interface (hereinafter, "input I/F") 30, a communication interface (hereinafter, "communication I/F") 40, and a millimeter-wave radar 50, which are electrically connected to the controller 10.
  • the electronic device 1 is, for example, a device capable of executing various applications for printing on the printer 2.
  • The controller 10 in FIG. 1 is a general term for the hardware and software used to control the electronic device 1 and does not necessarily represent a single piece of hardware actually existing in the electronic device 1.
  • the CPU 11 executes various processes according to the program read from the memory 12 and based on the user's operation.
  • the CPU 11 is an example of a computer.
  • the memory 12 includes a ROM and a RAM, and further includes a non-volatile memory such as an HDD and a flash memory, and stores various programs and data.
  • the display 20 includes a display surface for displaying various functions of the electronic device 1.
  • The input I/F 30 provides keys for executing each function of the electronic device 1 and is composed of a transmissive touch panel provided integrally on the surface of the display 20.
  • The electronic device 1 accepts an icon selection operation via the input I/F 30 when the user touches an icon displayed on the display 20.
  • The communication I/F 40 includes hardware for communicating with an external device such as the printer 2.
  • The communication method of the communication I/F 40 may be wireless or wired, and may be Wi-Fi (registered trademark), Bluetooth (registered trademark), USB, LAN, or the like.
  • The electronic device 1 of this embodiment may have a function of connecting to the Internet via the communication I/F 40.
  • The millimeter-wave radar 50 is configured to transmit millimeter waves of around 60 GHz toward a measurement object, for example the fingers of a hand, and to receive reflected waves from the object. The millimeter-wave radar 50 can then output a waveform output signal based on the received reflected waves.
  • The millimeter-wave radar 50 preferably has a wideband radar frequency for detecting fine movements of the measurement object.
  • In this embodiment, the radar frequency band of 57 GHz to 64 GHz is used, but the radar frequency band is not limited to 57 GHz to 64 GHz, and a wider band may be used.
  • As the radar frequency, for example, a frequency whose wavelength is in millimeters or in units smaller than millimeters may be used.
  • the millimeter wave radar is an example of an electromagnetic wave sensor.
  • In the memory 12 of the electronic device 1 of this embodiment, an operating system (hereinafter, "OS") 41, a label creation application 42, an image database (hereinafter, "image DB") 43, and a gesture analysis application 44 are incorporated.
  • The OS 41 is a multitasking OS capable of processing a plurality of tasks in parallel by managing and switching among them, and is, for example, iOS (registered trademark), Android (registered trademark), Windows (registered trademark), or macOS (registered trademark).
  • The printer 2 of this embodiment is, for example, a so-called label printer that includes a thermal-transfer print head, accommodates label paper wound in a roll, and prints while unwinding the label paper.
  • Based on a print job received from, for example, the electronic device 1, the printer 2 prints an image on the accommodated label paper, conveys the label paper, and carries the printed portion out of the machine.
  • the label paper is an example of a printing medium.
  • The label creation application 42 of this embodiment is an application for creating various labels using the printer 2, and can be activated, for example, by the user touching an icon (not shown) for starting the label creation application displayed on the display 20.
  • the label creation application 42 receives instructions for creating and editing an image to be printed by the printer 2, and displays the received image on the display 20.
  • the label creation application 42 receives an instruction to execute printing of the image displayed on the display 20, generates a print job based on the displayed image, and sends it to the printer 2.
  • The label creation application 42 of this embodiment may be a program that can be executed independently based on a user's execution instruction, or a program that is called and executed from another program during the execution of that other program.
  • the image DB 43 is a storage area for storing image data of various images for the label creation application 42.
  • the label creation application 42 causes the display 20 to display an image of image data stored in the image DB 43 based on a user's instruction.
  • the image data stored in the image DB 43 may be stored at all times, or may be acquired from a server or the like as needed.
  • the image DB 43 stores, for example, a plurality of templates that can be selected by the label creation application 42, and image data of a plurality of usage example images that correspond to each template and show each usage example.
  • the template used in the label creation application 42 is image data of a template for label creation, and includes samples such as a text string, a code image, a frame image, and an illustration.
  • The user can refer to the plurality of usage example images, select a template similar to the label to be created from the plurality of templates, edit the selected template, and print it.
  • the user can easily create a desired label paper by, for example, changing the character string of the template to a desired character and printing the template.
  • The gesture analysis application 44 is a kind of application software. As an example, it analyzes the output signal output from the millimeter-wave radar 50 to detect two fingers, and performs gesture analysis to calculate the distance between the two points between the tip of one detected finger and the tip of the other. The gesture analysis application 44 is activated together with the label creation application 42 when the latter is activated by the user.
  • By analyzing the output signal output from the millimeter-wave radar 50, the gesture analysis application 44 can also calculate the distance between the electronic device 1 and the fingers of the hand extended within the detection range of the millimeter-wave radar 50.
  • the gesture analysis application 44 includes a measurement mode for calculating the distance between two points and a calibration mode executed before the measurement mode.
  • When the label creation application 42 is activated, the screen required for its input/editing process is displayed on the display 20. That is, as shown in FIG. 2, various input buttons such as a save button 23, a print button 24, and a cancel button 25 are displayed on the display 20. Further, the display 20 displays a label image display area 26 for displaying an image such as text input by the user. The image displayed in the label image display area 26 is an image of the various labels created using the printer 2.
  • On the virtual keyboard 21 displayed on the display 20, a plurality of text keys 211 corresponding to texts such as characters, and a plurality of function keys 212 corresponding to commands such as return and backspace used when inputting text, are displayed.
  • The keyboard is not limited to the virtual keyboard 21 and may be configured by, for example, a hardware keyboard, a mouse, or a combination thereof that accepts input operations by the user.
  • When the user touches a text key 211 portion of the virtual keyboard 21 displayed on the display 20, text information such as the character corresponding to that text key 211 is input to the electronic device 1 via the input I/F 30 and stored in the memory 12.
  • The text is displayed as the input text 27 on the sheet object 28 displayed in the label image display area 26.
  • the label image display area 26 corresponds to the label paper of the printer 2.
  • In practice, the virtual keyboard 21 is not displayed on the display 20 at the point when the calibration mode by the gesture analysis application 44 ends; it is displayed when the user touches the sheet object 28 portion to select the sheet object 28. In FIG. 2, however, for convenience of explanation, the virtual keyboard 21 is shown displayed on the display 20.
  • When the user touches the print button 24 portion displayed on the display 20, the electronic device 1 generates a print job based on the text information corresponding to the input text 27 displayed on the sheet object 28 and transmits it to the printer 2.
  • The electronic device 1 can reflect the distance between the two points between the tips of the two fingers, calculated by the gesture analysis application 44, as the length in the longitudinal direction of the sheet object 28 displayed in the label image display area 26 by the label creation application 42, that is, as the length of the label paper.
  • the electronic device 1 stores the input text 27 displayed on the sheet object 28 in the image DB 43 as input text information.
  • The label creation application 42 and the gesture analysis application 44 each acquire activation instruction information (T2, T3).
  • the gesture analysis application 44 first executes the calibration mode. That is, the gesture analysis application 44 displays the calibration image required for executing the calibration mode on the display 20 (T4).
  • The gesture analysis application 44 acquires the output signal output from the millimeter-wave radar 50 (T6) and analyzes the gesture based on the acquired output signal (T7). The gesture analysis application 44 then calculates the distance between the two points between the tip of the thumb of the left hand and the tip of the index finger (T8) and stores it in the memory 12 as a reference value (T9). The gesture analysis application 44 then ends the calibration mode.
  • The label creation application 42 acquires the calibration mode end information (T10) and displays the screen required for the label paper input/editing process on the display 20 (T11).
  • When the user selects the sheet object 28 (T12), the label creation application 42 changes the display form of the sheet object 28 displayed on the display 20 (T13).
  • The gesture analysis application 44 then executes the measurement mode.
  • The gesture analysis application 44 acquires the output signal output from the millimeter-wave radar 50 (T15), analyzes the gesture based on the acquired output signal, and detects two fingers (T16).
  • the gesture analysis application 44 calculates the distance between the two points between the tip of the thumb of the left hand and the tip of the index finger, and displays the calculated distance between the two points on the display 20 (T17).
  • When the user accepts the displayed distance, the calculated distance between the two points is stored in the memory 12 as the length of the label paper to be printed (T19), and the measurement mode ends.
  • If the user does not accept the displayed distance, T14 to T18 are processed again, the recalculated distance between the tip of the thumb of the left hand and the tip of the index finger is displayed on the display 20, and the application waits for the user's selection.
  • The label creation application 42 acquires the measurement mode end information (T20), and, based on the label paper length information stored in the memory 12, changes the lengths of the label image display area 26 and the sheet object 28 and displays them on the display 20 (T21).
  • The label creation application 42 generates a print job based on the text information corresponding to the input text 27 displayed on the sheet object 28 and the length of the label paper stored in the memory 12 (T23), and transmits it to the printer 2 (T24).
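Step T23 amounts to assembling a print job from the stored text and the measured label length before sending it to the printer in T24. A hypothetical sketch follows; the patent does not define a print-job format, so the function name and dictionary fields here are invented for illustration:

```python
def build_print_job(input_text, label_length_mm):
    """Assemble a print job from the edited text and the measured label
    length, as in T23. The field names are illustrative; the actual job
    format depends on the printer's protocol."""
    return {
        "text": input_text,            # input text 27 shown on the sheet object
        "label_length_mm": label_length_mm,  # length stored in the memory (T19)
        "media": "roll_label",
    }

job = build_print_job("Sample label", 100.0)
# The assembled job would then be transmitted to the printer (T24),
# e.g. over Wi-Fi, Bluetooth, or USB via the communication I/F.
```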
  • The flowchart shown in FIG. 4(A) corresponds to the procedure of T4 to T9 in the sequence diagram shown in FIG. 3 and is executed by the CPU 11 of the electronic device 1. Each processing step in the following flowchart basically indicates processing of the CPU 11 according to the instructions described in each program.
  • The processing by the CPU 11 also includes hardware control using the API of the OS 41 of the electronic device 1. In this specification, the operation of each program is described while omitting the description of the operation of the OS 41.
  • The CPU 11 first executes the calibration mode. That is, the CPU 11 displays a calibration image (see FIG. 4(B)) required for executing the calibration on the display 20 (step 10; hereinafter, S10).
  • the calibration image includes marks A60 and marks B61 arranged at reference intervals.
  • Next, the CPU 11 acquires the output signal output from the millimeter-wave radar 50 (S11), analyzes the gesture based on the acquired output signal, and extracts the index finger and thumb of the hand extended within the detection range of the millimeter-wave radar 50 (S12).
  • the "gesture” refers to the three-dimensional movement of the two fingers of the hand extended within the detection range of the millimeter-wave radar 50.
  • The CPU 11 extracts the three-dimensional movements of the two fingers in S12, and then determines whether or not the two fingers could be detected (S13).
  • The user sets the distance between the thumb 63A and the index finger 63B of the left hand 63 so as to match the distance between the mark A60 and the mark B61 according to the calibration image displayed on the display 20 (see FIG. 4(B)).
  • the user extends the thumb 63A and the index finger 63B of the left hand 63 within the detection range of the millimeter wave radar 50 while maintaining the state (interval).
  • the positions of the thumb 63A and the index finger 63B of the left hand 63 may be in a direction that is easy for the user to instruct as long as the distance between them is maintained.
  • the position of the left hand 63 with respect to the electronic device 1 may be a position that is easy for the user to instruct as long as it is within the detection range of the millimeter wave radar 50.
  • When the CPU 11 can extract the thumb 63A and the index finger 63B of the left hand 63 extended over the millimeter-wave radar 50 in S12 and determines in the next S13 that the two fingers could be detected (S13: YES), the process proceeds to the next S14.
  • the CPU 11 calculates the distance between the tip of the thumb 63A of the extracted left hand 63 and the tip of the index finger 63B.
  • The CPU 11 stores the calculated distance in the memory 12 as a reference value indicating the reference distance between the mark A60 and the mark B61 (S15).
  • The CPU 11 then pops up a calibration end message on the display 20 for a certain period of time in place of the calibration image (S16) to notify the user of the end of the calibration mode, ends the calibration mode processing, and proceeds to the measurement mode processing described later.
  • the process of S15 is an example of the preservation process.
  • On the other hand, when the CPU 11 determines in S13 that the two fingers could not be detected (S13: NO), the process proceeds to S17.
  • In S17, the CPU 11 determines whether or not the cancel button 25 has been touched.
  • When the user touches the cancel button 25, the CPU 11 determines that the cancel button 25 has been touched (S17: YES), stops the display of the calibration image (S18), ends the calibration mode processing, and shifts to the measurement mode processing described later.
  • When the CPU 11 determines that the cancel button 25 has not been touched (S17: NO), the process returns to S11 and the CPU 11 continues to acquire the output signal output from the millimeter-wave radar 50.
  • In this way, the distance between the mark A60 and the mark B61, arranged in advance at the reference length, is measured in the calibration mode. In the determination of the actual length in the measurement mode, the length is determined based on the result measured in the calibration mode and the value measured in the subsequent measurement mode. This makes it possible to eliminate measurement errors caused by the shape of the user's hand and to measure the length more accurately.
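The calibration described here can be modeled as a linear scale factor: the known mark spacing divided by the fingertip distance measured while the user matched the marks, applied to later raw readings. The linear model is an assumption for illustration; the patent states only that the length is determined from both measurements:

```python
def calibration_scale(actual_reference_mm, measured_reference_mm):
    """Ratio of the known mark A to mark B spacing to the fingertip
    distance measured while the user matched the marks (S14, S15)."""
    return actual_reference_mm / measured_reference_mm

def corrected_length(raw_measurement_mm, scale):
    """Apply the calibration scale to a measurement-mode reading (S24)."""
    return raw_measurement_mm * scale

# If the marks are 50 mm apart but the radar measured 48 mm between the
# fingertips, a later raw reading of 96 mm corrects to 100 mm.
scale = calibration_scale(50.0, 48.0)
print(corrected_length(96.0, scale))  # 100.0
```

A per-user scale of this kind would absorb systematic bias from the shape of the user's hand, which matches the stated purpose of the calibration mode.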
  • In this embodiment, the mark A60 and the mark B61 are displayed on the calibration image to notify the user of the reference interval used in the calibration mode; alternatively, the user may be notified of a specific dimension to be used as the reference interval, or a dimension desired by the user may be used as the reference interval.
  • In that case, the user prepares by setting the distance between the thumb 63A and the index finger 63B of the left hand 63 using a ruler or the like.
  • In this embodiment, the end of the calibration mode is notified to the user using a pop-up image, but the user may instead be notified by an announcement, a notification sound, vibration, or a combination thereof.
  • The flowchart shown in FIG. 5(A) corresponds to the procedures T14 to T19 in the sequence diagram shown in FIG. 3 and is executed by the CPU 11 of the electronic device 1 when the user selects the sheet object 28. Each processing step in the following flowchart basically indicates processing of the CPU 11 according to the instructions described in each program.
  • the processing by the CPU 11 also includes hardware control using the API of the OS 41 of the electronic device 1. In this specification, the operation of the program will be described by omitting the description of the operation of the OS 41.
  • According to the procedure of T13 shown in FIG. 3, the electronic device 1 displays the sheet object 28 by changing its display form from the broken line in FIG. 2 to a solid line, thereby notifying the user that the sheet object 28 is in the selected state.
  • The display form of the sheet object 28 when it is selected by the user may instead be changed by changing the frame color or line thickness of the sheet object 28, or by blinking the entire sheet object 28.
  • The user touches the portion of the sheet object 28 to be edited, thereby selecting the sheet object 28 to be edited.
  • the user determines the distance between the tip of the thumb 63A of the left hand 63 and the tip of the index finger 63B so as to match the length L of the label sticking area 71 (see FIG. 6B).
  • the user extends the thumb 63A and the index finger 63B of the left hand 63 within the detection range of the millimeter-wave radar 50 while maintaining the state (interval) (see FIG. 6C).
  • The detection accuracy of the length L is improved when the thumb 63A and the index finger 63B of the left hand 63 are separated in the same direction as in the calibration mode, and when the left hand 63 is held in the same position with respect to the electronic device 1 as in the calibration mode.
  • the CPU 11 acquires the output signal output from the millimeter wave radar 50 (S20).
  • the output signal is an example of a waveform signal
  • the process of S20 is an example of an acquisition process.
  • The CPU 11 analyzes the gesture based on the acquired output signal and detects the index finger and thumb of the hand extended over the millimeter-wave radar 50 based on the gesture (S21).
  • the process of S21 is an example of the detection process.
  • “gesture” refers to the three-dimensional movement of the fingers of the hand extended within the detection range of the millimeter-wave radar 50.
  • In S21, the CPU 11 analyzes the three-dimensional movement of the fingers based on the output signal output from the millimeter-wave radar 50 to extract two fingers. The CPU 11 then determines whether or not the two fingers could be extracted (S22). When the CPU 11 determines that the two fingers could be extracted (S22: YES), the process proceeds to the next S23. On the other hand, when it is determined that the two fingers could not be extracted (S22: NO), the process returns to S20.
  • When the two fingers are extracted, the CPU 11 determines in S23 whether or not the state is maintained for a certain period of time.
  • The certain period of time here is preferably 1 to 2 seconds, but is not limited thereto and may be shorter or longer.
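The hold check of S23 can be sketched as testing whether the recent distance readings stay within a tolerance over the hold window. The window contents and the 2 mm tolerance below are assumptions; the patent specifies only a hold of roughly 1 to 2 seconds:

```python
def held_steady(recent_distances_mm, tolerance_mm=2.0):
    """True if all readings in the hold window (e.g. the fingertip
    distances sampled over the last 1 to 2 seconds) vary by no more than
    tolerance_mm, i.e. the user is holding the gesture still (S23)."""
    if not recent_distances_mm:
        return False
    return max(recent_distances_mm) - min(recent_distances_mm) <= tolerance_mm

print(held_steady([50.1, 50.4, 49.8]))  # True: spread is only 0.6 mm
print(held_steady([40.0, 55.0]))        # False: the fingers are still moving
```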
  • When the CPU 11 determines in S23 that the state has been maintained for the certain period of time (S23: YES), the CPU 11 calculates the distance between the two points between the tip of the extracted thumb 63A and the tip of the index finger 63B using the reference value measured in the calibration mode (S24).
  • the process of S24 is an example of the first measurement process.
  • Because the reference value is used, a more accurate distance between the two points between the tip of the thumb 63A and the tip of the index finger 63B can be calculated.
  • In this embodiment, the distance between the two points is calculated using the reference value measured in the calibration mode, but if accuracy is not a problem, the length may be calculated without using the reference value.
  • By measuring at the timing when the gesture has been held, the possibility increases that the distance desired by the user, that is, the distance between the two points of the thumb 63A and the index finger 63B, is measured accurately.
  • If a sound effect such as a shutter sound is played at the timing of the measurement, the user can recognize that the measurement is completed, which further improves convenience.
  • Next, the CPU 11 pops up an image 64 (see FIG. 6(C)) describing the calculated distance between the two points on the display 20 (S25), and then determines whether or not "OK" has been selected by the user (S26).
  • S26 is an example of the determination process.
  • Since the image 64 describing the calculated distance between the two points is displayed in a pop-up, the user can easily recognize the length of the label paper, which further improves convenience.
  • the CPU 11 determines that "OK” is selected in S26 (S26: YES), and then the CPU 11 determines between the calculated two points.
  • the distance is stored in the memory 12 as the length of the label paper selected by the user (S27), and the measurement mode is temporarily terminated. After that, the CPU 11 returns to S20 and detects a new gesture.
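Steps S20 to S27 form a loop: acquire the signal, detect two fingers, wait for a steady hold, measure, and confirm. The control flow can be sketched as follows, with each radar and UI step stubbed out as a hypothetical callable, since the patent describes the steps but not their implementation:

```python
def measurement_mode(acquire_signal, extract_two_fingers, distance_between,
                     held_for_fixed_time, user_confirms):
    """Control flow of S20 to S27. Each argument stands in for a step the
    patent describes but does not implement here:
      acquire_signal()        - S20, read the radar output signal
      extract_two_fingers(s)  - S21/S22, return a fingertip pair or None
      held_for_fixed_time()   - S23, steady-hold check
      distance_between(pair)  - S24, two-point distance (using the reference value)
      user_confirms(d)        - S25/S26, pop-up display and "OK" decision
    Returns the confirmed distance, stored as the label length (S27)."""
    while True:
        signal = acquire_signal()               # S20
        fingers = extract_two_fingers(signal)   # S21
        if fingers is None:                     # S22: NO, retry
            continue
        if not held_for_fixed_time():           # S23: NO, retry
            continue
        distance = distance_between(fingers)    # S24
        if user_confirms(distance):             # S25/S26
            return distance                     # S27

# Example run with stubbed steps: detection fails once, then succeeds.
readings = iter([None, ((0, 0), (0, 50))])
result = measurement_mode(
    acquire_signal=lambda: "signal",
    extract_two_fingers=lambda s: next(readings),
    distance_between=lambda pair: 50.0,
    held_for_fixed_time=lambda: True,
    user_confirms=lambda d: True,
)
print(result)  # 50.0
```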
  • The label creation application 42 acquires the measurement mode end information (T20) and, based on the length of the label paper stored in the memory 12, displays the label image display area 26 and the sheet object 28 on the display 20 at that same length if the stored length fits in the display 20 (see FIG. 6(D)) (T21).
  • As a result, the user can easily recognize the length of the label paper, which further improves convenience.
  • the user can easily input / edit the label paper by using the label creation application 42, which improves convenience.
  • the label image display area 26 and the sheet object 28 are displayed at the same length, but they may be enlarged or reduced at a predetermined magnification for display.
  • As shown in FIG. 5(B), a step (S28) may be provided so that the measurement mode by the gesture analysis application 44 is started based on a user's instruction.
  • the process of S28 is an example of the reception process.
  • the user's instruction in this case is not limited to the touch to the display 20, and may be, for example, an instruction by voice or an instruction by a hard switch.
  • In this embodiment, when the calibration mode by the gesture analysis application 44 ends, the measurement mode is started next; however, the calibration mode may be omitted and the gesture analysis application 44 may be configured with only the measurement mode. For example, when the length of the label paper is already stored in the memory 12, the calibration mode may be omitted.
  • As described above, the user measures the length L of the label sticking area 71 of the label sticking object 70 with the tip of the thumb 63A and the tip of the index finger 63B, and indicates the length by the distance between the two fingertips.
  • The electronic device 1 can obtain the length L of the label sticking area 71 by measuring the length indicated by the predetermined gesture, that is, the distance between the two points between the tips of the two fingers.
  • the measurement is not restricted by the size of the input I/F 30 (display 20), and the measurement points can be detected without contact; therefore, measurement over a range wider than the input I/F 30 (display 20) is possible.
  • the length L of the label sticking area 71 can be measured directly with two fingers and reflected as the length L of the sheet object 28, so a label paper reflecting the length L of the label sticking area 71 can be created easily, without specially using a tool such as a ruler.
  • the above-described form is configured to measure the distance between the two points at the tips of two fingers and reflect it as the length L of the sheet object 28. The second embodiment differs in that both the distance between the two fingertips and the distance between the electronic device 1 and the left hand 63 are measured, and, according to the distance between the electronic device 1 and the left hand 63, the two-point distance is reflected either as the longitudinal length of the sheet object 28 or as the size of the input text 27.
  • FIG. 7 is a flowchart showing the procedure of the processing in the measurement mode among the gesture analysis processes by the gesture analysis application 44 according to the second embodiment, and will be described in detail below with reference to this flowchart.
  • when the CPU 11 determines in S22 that two fingers are detected (S22: YES), in the next step S30 the CPU 11 pops up an image on the display 20 indicating that the fingers are being detected. The CPU 11 then determines in S31 whether or not that state is maintained for a certain period of time.
  • the fixed time here is preferably 2 to 3 seconds, but is not limited thereto and may be shorter or longer.
  • when the user keeps the thumb 63A and the index finger 63B of the left hand 63 extended over the millimeter wave radar 50 in the same state for the certain period of time, the CPU 11 determines in S31 that the state has been maintained, and then calculates the distance between the electronic device 1 and the tips of the extracted thumb 63A and index finger 63B (S32).
  • the process of S32 is an example of the second measurement process.
  • by measuring at that timing, the distance desired by the user, that is, the distance between the two points of the thumb 63A and the index finger 63B, is more likely to be measured accurately.
  • the CPU 11 determines whether or not the calculated distance between the electronic device 1 and the extracted thumb 63A and index finger 63B is equal to or less than a predetermined value (S33).
  • when the distance is not equal to or less than the predetermined value (S33: NO), the CPU 11 determines in S34 whether or not a selected input text 27 exists.
  • the predetermined value is, for example, 100 mm, but is not limited thereto.
  • when the CPU 11 determines in S34 that the selected input text 27 exists (S34: YES), in the next step S35 the CPU 11 calculates the distance between the two points between the extracted tip of the thumb 63A and the tip of the index finger 63B, using the reference value measured in the calibration mode and stored in the memory 12 (S35). At this time, a notification that the character size of the input text 27 is being calculated may be given.
  • the distance between the two points between the tip of the thumb 63A and the tip of the index finger 63B is calculated using the reference value measured in the calibration mode and stored in the memory 12; therefore, the distance between the two points can be calculated more accurately.
  • the CPU 11 stores the calculated distance between the two points in the memory 12 as the character size of the input text 27 selected by the user (S36), then returns to S20 and detects the gesture again.
  • the character size of the input text 27 is an example of a designated level.
  • the label creation application 42 acquires the measurement mode end information shown in FIG. 3 (T20), and, based on the character size information stored in the memory 12, changes and displays the character size of the input text 27 displayed on the sheet object 28.
  • when the CPU 11 determines in S33 that the distance is equal to or less than the predetermined value (S33: YES), next, in S37, the CPU 11 determines whether or not a selected sheet object 28 exists.
  • when the CPU 11 determines in S37 that the selected sheet object 28 exists (S37: YES), the CPU 11 then calculates the distance between the two points between the extracted tip of the thumb 63A and the tip of the index finger 63B, using the reference value measured in the calibration mode and stored in the memory 12 (S38). At this time, a notification that the length of the label paper is being calculated may be given.
  • the distance between the two points between the tip of the thumb 63A and the tip of the index finger 63B is calculated using the reference value measured in the calibration mode and stored in the memory 12; therefore, the distance between the two points can be calculated more accurately without being affected by, for example, variation in the shape of the user's fingers.
  • the CPU 11 stores the calculated distance between the two points in the memory 12 as the length of the label paper selected by the user (S39), then returns to S20 and detects the gesture again.
  • the label creation application 42 acquires the measurement mode end information shown in FIG. 3 (T20), and, based on the length information of the label paper stored in the memory 12, changes the length of the label image display area 26 and the sheet object 28 to the length L and displays them (see FIG. 6(D)).
  • the CPU 11 deselects the input text 27 and the sheet object 28 selected by the user (S41), returns to S20, and detects the gesture again.
  • the distance between the two points can be reflected as the length of the sheet object 28 or as the character size of the input text 27, according to the distance between the electronic device 1 and the thumb 63A and index finger 63B extended by the user over the millimeter wave radar 50, which improves convenience.
  • the calculated distance between two points is reflected as the character size of the input text 27 according to the distance between the electronic device 1 and the thumb 63A and the index finger 63B.
  • the color of the input text 27 may be changed or the font type of the input text 27 may be changed based on the distance between the two points.
  • in the above-described form, the calculated distance between the two points is configured to be reflected in the length of the label paper when the distance to the hand is equal to or less than the predetermined value, and as the character size of the input text 27 when it is equal to or more than the predetermined value; conversely, it may be configured so that the distance is reflected as the character size of the input text 27 in the former case and in the length of the label paper in the latter case.
  • the calculated distance between the two points is reflected in the length of the label paper, but it may instead be configured to be reflected in the width of the label paper, that is, in the widths of the label image display area 26 and the sheet object 28.
  • in the above-described form, the thumb 63A and the index finger 63B of the left hand 63 are extended within the detection range of the millimeter wave radar 50, but it may be configured so that, for example, the index finger 63B of the left hand 63 and the index finger of the right hand are extended. In other words, gestures are not limited to one hand.
  • the thumb 63A or index finger 63B of the left hand 63 and the other fingers of the left hand 63 may be configured to be extended within the detection range of the millimeter wave radar 50.
  • the feet may be used instead of the hands.
  • the label creation application 42 and the gesture analysis application 44 are configured as independent applications, but the label creation application 42 may be configured to have the functions of the gesture analysis application 44; that is, both may be configured as a single application.
  • the gesture analysis application 44 is automatically started when the label creation application 42 is started, but both applications may be started independently. In this case, an icon for starting the label creation application 42 and an icon for starting the gesture analysis application 44 may each be displayed on the display 20.
  • the gesture analysis application 44 may be started when an input / edit target, for example, a sheet object 28 or an input text 27 is selected by the user.
  • the plurality of processes in any plurality of steps can be arbitrarily changed in the execution order or executed in parallel as long as the processing contents do not conflict with each other.
  • the process disclosed in the embodiment may be executed by a single CPU, a plurality of CPUs, hardware such as an ASIC, or a combination thereof.
  • the process disclosed in the embodiment can be realized in various aspects such as a recording medium or a method in which a program for executing the process is recorded.
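The second embodiment's branching described above (S33 through S39) can be sketched in a few lines. This is a hypothetical illustration, not code from the patent: the function name and the dict return shape are assumptions, and the 100 mm threshold is the example value the description gives for the predetermined value.

```python
def apply_measurement(hand_distance_mm: float, two_point_mm: float,
                      threshold_mm: float = 100.0) -> dict:
    """Decide what the measured two-point distance controls.

    Following the second embodiment's branching: at or below the
    threshold the distance becomes the label-paper length (S38/S39);
    above it, the character size of the input text (S35/S36).
    """
    if hand_distance_mm <= threshold_mm:
        return {"target": "label_length", "value": two_point_mm}
    return {"target": "char_size", "value": two_point_mm}

# Hand close to the device (60 mm): the distance sets the label length.
assert apply_measurement(60.0, 120.0) == {"target": "label_length", "value": 120.0}
# Hand farther away (150 mm): the distance sets the character size.
assert apply_measurement(150.0, 12.0) == {"target": "char_size", "value": 12.0}
```

Keeping the threshold as a parameter mirrors the description's remark that the predetermined value is not limited to 100 mm.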


Abstract

The present invention provides technology which reduces restrictions on measurements of a two-point distance by an electronic device. An electronic device 1 comprises a millimeter wave radar 50 and acquires an output signal based on reflected waves received by the millimeter wave radar 50. On the basis of the acquired output signal, a prescribed gesture made by two fingers is detected. Thereafter, the two-point distance between a point that is represented by the tip of one of the fingers making the detected prescribed gesture and a point that is represented by the tip of the other finger is measured, and the length indicated by the prescribed gesture is determined on the basis of the measured two-point distance.

Description

Program and electronic device
The technical field disclosed herein relates to a program and an electronic device that measure a distance between two points.
In electronic devices, there is known a technique that measures a distance between two points and performs processing based on the measurement result. For example, Patent Document 1 discloses a display input device that displays a preview image on a touch panel: the device measures the two-point distance between the touch positions of two fingers touching the touch panel; if the touch position of one finger moves to another position within a predetermined period while both fingers remain touching, the device measures the two-point distance between the finger positions again after the movement, determines a display magnification based on the first measured distance and the distance measured after the movement, and displays the preview image at the determined magnification.
Japanese Unexamined Patent Application Publication No. 2014-63428
In the measurement of a two-point distance based on touch positions on a touch panel as in Patent Document 1, the measurement range is restricted because distances larger than the touch panel cannot be measured. Further, an operation of touching the touch panel is required, and there is a restriction that the two-point distance cannot be measured when the touch panel cannot detect the touch positions, for example, when the user is wearing gloves.
This specification discloses a technique in which measurement is not easily restricted when measuring a distance between two points with an electronic device.
The technique was made to solve the above-described problem, and is a program executable by a computer of an electronic device including an electromagnetic wave sensor, the electromagnetic wave sensor being a sensor that receives electromagnetic waves having a wavelength in millimeter units or units smaller than a millimeter. The program causes the computer to execute: an acquisition process of acquiring a waveform signal based on the electromagnetic waves received by the electromagnetic wave sensor; and a detection process of detecting, based on the waveform signal acquired in the acquisition process, a predetermined gesture made with two fingers, the predetermined gesture indicating a designated level by the distance between the tip of one of the two fingers and the tip of the other finger. The program further causes the computer to execute: a first measurement process of measuring the two-point distance between a point indicating the tip of the one finger and a point indicating the tip of the other finger in the predetermined gesture detected in the detection process; and a determination process of determining, based on the two-point distance measured in the first measurement process, the designated level indicated by the predetermined gesture.
According to the above configuration, as the designated level, the user can, for example, measure the length of an object with two fingers and perform, within the detection range of the electromagnetic wave sensor, a predetermined gesture indicating that length by the distance between the tip of one finger and the tip of the other finger; the computer of the electronic device then measures the length indicated by the predetermined gesture, that is, the two-point distance between the tips of the two fingers, and can thereby obtain the length of the object. Since the measurement is not based on the position of a touch on the electronic device, the measurement points can be detected without contact; furthermore, the measurement is hardly restricted by the size of a touch panel, so measurement over a wide range becomes possible.
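The claimed processing chain (acquisition, detection, first measurement, determination) can be outlined with a small sketch. The radar API and the gesture-detection logic are not specified in this document, so the injected callables below are stand-ins, and treating the designated level as the measured distance itself is a simplification for illustration.

```python
def determine_level(acquire_signal, detect_gesture, measure_two_points):
    """Run the claimed processing chain with injected stand-in steps."""
    signal = acquire_signal()                  # acquisition process
    gesture = detect_gesture(signal)           # detection process
    if gesture is None:                        # no predetermined gesture found
        return None
    distance_mm = measure_two_points(gesture)  # first measurement process
    return distance_mm                         # determination: level from distance

# Stub implementations standing in for the real radar processing:
level = determine_level(
    acquire_signal=lambda: "waveform",
    detect_gesture=lambda s: {"tips": ((0, 0, 100), (50, 0, 100))},
    measure_two_points=lambda g: 50.0,
)
assert level == 50.0
```

Passing the steps as callables keeps the sketch independent of any particular radar driver.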
An electronic device storing the above program is also novel and useful.
According to the technique disclosed herein, when measuring a distance between two points with an electronic device, a technique in which the measurement is not easily restricted is realized.
FIG. 1 is a block diagram showing the electrical configuration of an electronic device incorporating the program of the present invention. FIG. 2 is an external view showing the display state of the electronic device when creating a label. FIG. 3 is a sequence diagram showing an outline of the operation of the electronic device. FIG. 4 is (A) a flowchart showing the procedure of the calibration-mode processing in the gesture analysis processing by the gesture analysis application, and (B) an external view of the electronic device showing the display state in the calibration mode. FIG. 5 is (A) a flowchart showing the procedure of the measurement-mode processing in the gesture analysis processing by the gesture analysis application, and (B) a flowchart showing part of the procedure of the measurement-mode processing according to a modified example. FIG. 6 is (A) an external view of the electronic device showing a state in which the sheet object is selected by the user, (B) an external view of a label sticking object having a label sticking area to which a label paper created using the electronic device is attached, (C) an external view of the electronic device in a state in which a gesture is being performed by the user, and (D) an external view of the electronic device in a state in which the length of the sheet object has been changed and displayed.
FIG. 7 is a flowchart showing the procedure of the measurement-mode processing in the gesture analysis processing by the gesture analysis application according to the second embodiment.
(First Embodiment)
Hereinafter, an embodiment embodying a program incorporated in an electronic device will be described in detail with reference to the accompanying drawings. This embodiment discloses an application program (hereinafter, "application") incorporated in a portable electronic device capable of displaying images, such as a tablet or a smartphone.
As shown in FIG. 1, the electronic device 1 of this embodiment includes a controller 10 including a CPU 11 and a memory 12, and is connectable to a printer 2. The electronic device 1 further includes a display 20, an input interface (hereinafter, "input I/F") 30, a communication interface (hereinafter, "communication I/F") 40, and a millimeter wave radar 50, which are electrically connected to the controller 10. The electronic device 1 is, for example, a device capable of executing various applications for causing the printer 2 to print. Note that the controller 10 in FIG. 1 is a general term for the hardware and software used to control the electronic device 1, and does not necessarily represent a single piece of hardware actually present in the electronic device 1.
The CPU 11 executes various processes according to programs read from the memory 12 and based on user operations. The CPU 11 is an example of a computer. The memory 12 includes a ROM and a RAM, further includes a non-volatile memory such as an HDD or a flash memory, and stores various programs and data.
The display 20 includes a display surface that displays the various functions of the electronic device 1. The input I/F 30 serves as keys for executing each function of the electronic device 1, and is composed of a transmissive touch panel provided integrally on the surface of the display 20. The electronic device 1 accepts an icon selection operation when the user touches, via the input I/F 30, an icon displayed on the display 20.
The communication I/F 40 includes hardware for communicating with an external device such as the printer 2. The communication method of the communication I/F 40 may be wireless or wired, and may be Wi-Fi (registered trademark), Bluetooth (registered trademark), USB, LAN, or the like. The electronic device 1 of this embodiment may have a function of connecting to the Internet via the communication I/F 40.
The millimeter wave radar 50 is configured to be able to transmit millimeter waves of around 60 GHz toward a measurement object, for example, a finger of a hand, and to receive reflected waves from the measurement object. The millimeter wave radar 50 can then output a wave-shaped output signal based on the received reflected waves.
For detecting fine movements of the measurement object, a wideband radar frequency is desirable for the millimeter wave radar 50. In this embodiment, a radar frequency band of 57 GHz to 64 GHz is used, but the radar frequency band is not limited to 57 GHz to 64 GHz, and an even wider radar frequency band may be used. For example, a radar frequency whose wavelength is in millimeter units or units smaller than a millimeter may be used. The millimeter wave radar is an example of an electromagnetic wave sensor.
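As a quick check of the "millimeter unit" wavelength statement, the free-space wavelength for the 57 GHz to 64 GHz band can be computed; this is a simple physics sketch, not part of the disclosure.

```python
C = 299_792_458  # speed of light in vacuum, m/s

def wavelength_mm(freq_ghz: float) -> float:
    """Free-space wavelength in millimeters for a given radar frequency."""
    return C / (freq_ghz * 1e9) * 1000

# The 57-64 GHz band used in this embodiment has millimeter-unit wavelengths,
# roughly 4.7 mm to 5.3 mm:
assert 4.6 < wavelength_mm(64) < wavelength_mm(57) < 5.3
```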
As shown in FIG. 1, the memory 12 of the electronic device 1 of this embodiment incorporates an operating system (hereinafter, "OS") 41, a label creation application 42, an image database (hereinafter, "image DB") 43, and a gesture analysis application 44. The OS 41 is a multitasking OS capable of processing a plurality of tasks in parallel by managing and switching among them, and is, for example, one of iOS (registered trademark), Android (registered trademark), Windows (registered trademark), macOS (registered trademark), and Linux (registered trademark).
The printer 2 of this embodiment is, for example, a so-called label printer that includes a thermal transfer type print head, accommodates label paper wound in a roll, and prints while unwinding the label paper. Based on a print job received from the electronic device 1, for example, the printer 2 prints an image on the accommodated label paper while conveying it, and discharges the printed portion out of the machine. The label paper is an example of a print medium.
The label creation application 42 of this embodiment is an application for creating various labels using the printer 2, and can be started, for example, by the user touching an icon (not shown) for starting the label creation application displayed on the display 20. The label creation application 42 accepts instructions to create and edit an image to be printed by the printer 2, and displays the image for which instructions have been accepted on the display 20.
Further, the label creation application 42 accepts an instruction to execute printing of the image being displayed on the display 20, generates a print job based on the displayed image, and transmits it to the printer 2. The label creation application 42 of this embodiment may be a program that can be executed independently based on a user's execution instruction, or a program that is called and executed from another program while that program is running.
The image DB 43 is a storage area that stores image data of various images for the label creation application 42. The label creation application 42 causes the display 20 to display images of the image data stored in the image DB 43 based on user instructions. The image data stored in the image DB 43 may be stored permanently, or may be acquired from a server or the like as needed.
In this embodiment, the image DB 43 stores, for example, a plurality of templates selectable in the label creation application 42 and, corresponding to each template, image data of a plurality of usage example images showing respective usage examples. A template used in the label creation application 42 is image data of a model for label creation, and includes samples such as a text string, a code image, a frame image, and an illustration. The user can, for example, refer to the plurality of usage example images, select from the plurality of templates a template resembling the label to be created, and edit and print the selected template. The user can easily create the desired label paper by, for example, changing the character string of the template to the desired characters and printing it.
The gesture analysis application 44 is a kind of application software; as an example, it analyzes the output signal output from the millimeter wave radar 50 to detect two fingers, and performs gesture analysis for calculating the two-point distance between the tip of one detected finger and the tip of the other finger. The gesture analysis application 44 is started when the label creation application 42 is started by the user.
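Once fingertip coordinates have been extracted from the radar output, the two-point distance the gesture analysis application computes reduces to a Euclidean distance. A minimal sketch follows; the extraction of coordinates from the radar signal is not described here, so the 3-D fingertip points are assumed inputs.

```python
import math

def two_point_distance(tip_a, tip_b):
    """Distance between two fingertip points (x, y, z), in millimeters."""
    return math.dist(tip_a, tip_b)

# Thumb tip and index fingertip extracted from the radar signal (assumed values):
thumb = (0.0, 0.0, 100.0)
index = (30.0, 40.0, 100.0)
assert two_point_distance(thumb, index) == 50.0  # 3-4-5 triangle in the x-y plane
```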
Further, by analyzing the output signal output from the millimeter wave radar 50, the gesture analysis application 44 can calculate the distance between the electronic device 1 and the fingers of a hand extended within the detection range of the millimeter wave radar 50. As an example, the gesture analysis application 44 has a measurement mode for calculating the two-point distance, and a calibration mode executed before the measurement mode.
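The reference value stored in the calibration mode is later used when calculating two-point distances in the measurement mode. The exact correction formula is not given in this document, so the proportional correction below is an assumption for illustration only, as is the function name.

```python
def calibrated_distance(raw_mm: float, raw_reference_mm: float,
                        true_reference_mm: float) -> float:
    """Correct a raw radar two-point measurement using the calibration reference.

    The description states only that the reference value stored during the
    calibration mode makes the calculation more accurate; a simple
    proportional correction is assumed here.
    """
    return raw_mm / raw_reference_mm * true_reference_mm

# Calibration measured 95 mm for a gesture whose true span is 100 mm,
# so later raw readings are scaled up by 100/95:
assert calibrated_distance(47.5, 95.0, 100.0) == 50.0
```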
In the electronic device 1 of this embodiment, when the calibration mode by the gesture analysis application 44 ends, the display 20 shows the screen necessary for input/editing processing by the label creation application 42. That is, as shown in FIG. 2, various input buttons such as a save button 23, a print button 24, and a cancel button 25 are displayed on the display 20. The display 20 also shows a label image display area 26 for displaying images such as text input by the user. The images displayed in the label image display area 26 are also images of the various labels created using the printer 2.
The virtual keyboard 21 displayed on the display 20 shows a plurality of text keys 211 corresponding to each text item such as characters, and a plurality of function keys 212 corresponding to commands such as return and backspace used when inputting text. The keyboard is not limited to the virtual keyboard 21, and may be configured by, for example, a combination of a hardware keyboard, a mouse, or the like that accepts input operations by the user.
When the user touches a text key 211 portion of the virtual keyboard 21 displayed on the display 20, text information such as the character corresponding to that text key 211 is input to the electronic device 1 via the input I/F 30 and stored in the memory 12. At the same time, based on the input text information, the text is displayed as input text 27 on the sheet object 28 displayed in the label image display area 26. The label image display area 26 corresponds to the label paper of the printer 2.
In the electronic device 1 of this embodiment, the virtual keyboard 21 is not displayed on the display 20 at the point when the calibration mode by the gesture analysis application 44 ends. When the user touches the sheet object 28 portion to select the sheet object 28, the virtual keyboard 21 comes to be displayed on the display 20; in FIG. 2, for convenience of explanation, the virtual keyboard 21 is shown displayed on the display 20.
When the user touches the print button 24 portion displayed on the display 20, the electronic device 1 generates a print job based on the text information corresponding to the input text 27 displayed on the sheet object 28, and transmits it to the printer 2.
 The electronic device 1 can also apply the two-point distance between the tips of the two fingers, calculated by the gesture analysis application 44, as the longitudinal length of the sheet object 28 displayed in the label image display area 26 by the label creation application 42, that is, as the length of the label paper.
 In addition, when the user touches the save button 23 displayed on the display 20, the electronic device 1 stores the input text 27 displayed on the sheet object 28 in the image DB 43 as the input text information.
 Next, an outline of the operation of the electronic device 1 will be described below with reference to the sequence diagram shown in FIG. 3.
 First, when the user touches the icon for starting the label creation application (step 1, hereinafter referred to as T1), the label creation application 42 and the gesture analysis application 44 each acquire the start instruction information (T2, T3).
 The gesture analysis application 44 then first executes the calibration mode. That is, the gesture analysis application 44 displays the calibration image required for executing the calibration mode on the display 20 (T4).
 Next, when the user makes a gesture within the detection range of the millimeter-wave radar 50 (T5), the gesture analysis application 44 acquires the output signal of the millimeter-wave radar 50 (T6) and analyzes the gesture based on the acquired output signal (T7). The gesture analysis application 44 then calculates the two-point distance between the tip of the thumb and the tip of the index finger of the left hand (T8) and stores it in the memory 12 as a reference value (T9). The gesture analysis application 44 then ends the calibration mode.
 The label creation application 42 then acquires the calibration-mode end information (T10) and displays the screen required for label paper input/editing on the display 20 (T11).
 When the user touches the display 20 and thereby selects the sheet object 28 shown on it (T12), the label creation application 42 changes the display form of the sheet object 28 on the display 20 (T13).
 At this time, when the user selects the sheet object 28 (T12), the gesture analysis application 44 executes the measurement mode.
 When the user then makes a gesture within the detection range of the millimeter-wave radar 50 (T14), the gesture analysis application 44 acquires the output signal of the millimeter-wave radar 50 (T15), analyzes the gesture based on the acquired output signal, and detects the two fingers (T16). The gesture analysis application 44 then calculates the two-point distance between the tip of the thumb and the tip of the index finger of the left hand and displays the calculated two-point distance on the display 20 (T17).
 If the user then selects OK (T18: YES), the calculated two-point distance is stored in the memory 12 as the length of the label paper to be printed (T19), and the measurement mode ends.
 If, on the other hand, the user does not select OK (T18: NO), T14 to T18 are executed again, the recalculated two-point distance between the tip of the thumb and the tip of the index finger of the left hand is displayed on the display 20, and the device waits for the user's selection.
 Meanwhile, when the gesture analysis application 44 ends the measurement mode, the label creation application 42 acquires the measurement-mode end information (T20), changes the lengths of the label image display area 26 and the sheet object 28 based on the label paper length information stored in the memory 12, and displays them on the display 20 (T21).
 When the user touches the print button 24 displayed on the display 20 (T22), the label creation application 42 generates a print job based on the text information corresponding to the input text 27 displayed on the sheet object 28 and the label paper length stored in the memory 12 (T23), and transmits it to the printer 2 (T24).
 Next, of the gesture analysis processing performed by the gesture analysis application 44 of this embodiment, the procedure in the calibration mode will be described with reference to the flowchart of FIG. 4(A). The flowchart shown in FIG. 4(A) corresponds to steps T4 to T9 in the sequence diagram of FIG. 3 and is executed by the CPU 11 of the electronic device 1. Each processing step in the following flowcharts basically represents processing performed by the CPU 11 in accordance with instructions described in the respective programs. The processing by the CPU 11 also includes hardware control using the API of the OS 41 of the electronic device 1. In this specification, the operation of the programs is described with the operation of the OS 41 omitted.
 That is, the CPU 11 first executes the calibration mode: the CPU 11 displays the calibration image (see FIG. 4(B)) required for executing the calibration on the display 20 (step 10; hereinafter referred to as S10). The calibration image includes a mark A60 and a mark B61 arranged at a reference interval.
 Next, the CPU 11 acquires the output signal of the millimeter-wave radar 50 (S11), analyzes the gesture based on the acquired output signal, and, based on the analysis, extracts the index finger and thumb of the hand held within the detection range of the millimeter-wave radar 50 (S12). Here, "gesture" refers to the three-dimensional movement of the two fingers of the hand held within the detection range of the millimeter-wave radar 50. In this embodiment, the CPU 11 extracts the three-dimensional movement of the two fingers in S12. The CPU 11 then determines whether the two fingers have been detected (S13).
 Here, following the calibration image displayed on the display 20, the user sets, for example, the distance between the thumb 63A and the index finger 63B of the left hand 63 so that it matches the interval between the mark A60 and the mark B61 (see FIG. 4(B)). The user then holds the thumb 63A and the index finger 63B of the left hand 63 within the detection range of the millimeter-wave radar 50 while maintaining that state (interval). At this time, the thumb 63A and the index finger 63B of the left hand 63 may be oriented in whatever direction is easiest for the user, as long as the interval is maintained. Likewise, the position of the left hand 63 relative to the electronic device 1 may be any position that is convenient for the user, as long as it is within the detection range of the millimeter-wave radar 50.
 The CPU 11 can then extract, in S12, the thumb 63A and the index finger 63B of the left hand 63 held over the millimeter-wave radar 50, determines in the subsequent S13 that the two fingers have been detected (S13: YES), and proceeds to S14.
 Next, in S14, the CPU 11 calculates the distance between the tip of the extracted thumb 63A and the tip of the index finger 63B of the left hand 63.
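 The calculation in S14 reduces to the Euclidean distance between the two detected fingertip positions. A minimal sketch in Python, assuming hypothetical 3-D fingertip coordinates in millimeters (the document does not specify the coordinate format output by the millimeter-wave radar 50):

```python
import math

def fingertip_distance(tip_a, tip_b):
    """Euclidean distance between two fingertip points (x, y, z) in mm."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(tip_a, tip_b)))

# Example: thumb tip and index-finger tip 30 mm apart along one axis.
thumb = (10.0, 20.0, 100.0)
index = (40.0, 20.0, 100.0)
print(fingertip_distance(thumb, index))  # → 30.0
```

 On Python 3.8 and later, `math.dist` performs the same computation; the explicit form is kept here to show the calculation.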
 Next, the CPU 11 stores the calculated distance in the memory 12 as a reference value representing the reference interval between the mark A60 and the mark B61 (S15). The CPU 11 then pops up a calibration-end message on the display 20 for a certain period of time in place of the calibration image (S16) to notify the user that the calibration mode has ended. The calibration-mode processing then ends, and processing moves on to the measurement mode described later. The processing of S15 is an example of a saving process.
 If, on the other hand, the user has not held a hand within the detection range of the millimeter-wave radar 50, the CPU 11 determines in S13 that the two fingers could not be detected (S13: NO) and proceeds to S17.
 Next, in S17, the CPU 11 detects whether the cancel button 25 has been touched. If the user touches the cancel button 25, the CPU 11 determines that the cancel button 25 has been touched (S17: YES) and stops displaying the calibration image (S18); the calibration-mode processing then ends, and processing moves on to the measurement mode described later.
 If, on the other hand, the user has not touched the cancel button 25, the CPU 11 determines that the cancel button 25 has not been touched (S17: NO), returns to S11, and continues to acquire the output signal of the millimeter-wave radar 50.
 As described above, in this embodiment, prior to measuring the actual length, the calibration mode measures the distance between the mark A60 and the mark B61, which are arranged at the reference length in advance. When the actual length is subsequently determined in the measurement mode, the length is determined based on the result measured in the calibration mode and the value actually measured in the measurement mode. This eliminates measurement errors caused by the shape of the user's hand and the like, allowing the length to be measured more accurately.
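 One simple way to realize the correction described above is a scale factor: the known mark interval divided by the raw distance measured during calibration, applied to later raw measurements. The following sketch assumes that reading; the 50 mm mark interval and the variable names are illustrative, not taken from the document:

```python
KNOWN_MARK_INTERVAL_MM = 50.0  # hypothetical interval between mark A60 and mark B61

def calibration_scale(raw_calibration_distance_mm):
    """Scale factor mapping raw radar distances to true lengths (S15)."""
    return KNOWN_MARK_INTERVAL_MM / raw_calibration_distance_mm

def corrected_length(raw_distance_mm, scale):
    """Apply the calibration scale to a raw measurement (S24)."""
    return raw_distance_mm * scale

# Calibration gesture measured as 46 mm against the 50 mm reference;
# a later raw reading of 92 mm is corrected to 100 mm.
scale = calibration_scale(46.0)
print(round(corrected_length(92.0, scale), 1))  # → 100.0
```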
 In this embodiment, the user is informed of the reference interval used in the calibration mode by displaying the mark A60 and the mark B61 on the calibration image, but a specific dimension may instead be presented to the user as the reference interval, or a dimension of the user's choosing may be used as the reference interval. In the latter case, the user prepares by spreading the thumb 63A and the index finger 63B of the left hand 63 to that interval using a ruler or the like.
 Also, in this embodiment, the end of the calibration mode is reported to the user with a pop-up image, but the user may instead be notified by an announcement, a notification sound, vibration, or the like, or by a combination of these.
 Next, of the gesture analysis processing performed by the gesture analysis application 44 of this embodiment, the procedure in the measurement mode will be described with reference to the flowchart of FIG. 5(A). The flowchart shown in FIG. 5(A) corresponds to steps T14 to T19 in the sequence diagram of FIG. 3 and is executed by the CPU 11 of the electronic device 1 when the user selects the sheet object 28. Each processing step in the following flowchart basically represents processing performed by the CPU 11 in accordance with instructions described in the respective programs. The processing by the CPU 11 also includes hardware control using the API of the OS 41 of the electronic device 1. In this specification, the operation of the programs is described with the operation of the OS 41 omitted.
 Here, as shown in FIG. 6(A), the electronic device 1 displays the sheet object 28 with its display form changed from the broken line of FIG. 2 to a solid line, in accordance with step T13 of FIG. 3. The electronic device 1 thereby informs the user that the sheet object 28 is in the selected state. The display form of the sheet object 28 when it is selected by the user may be changed in other ways, for example by changing the color or line width of the frame of the sheet object 28, or by blinking the entire sheet object 28.
 That is, when the user wants to use the electronic device 1 to create label paper matching the length L of the label sticking area 71 of the label sticking object 70 (see FIG. 6(B)), the user first touches the sheet object 28 to be edited, thereby selecting it.
 Next, the user sets the distance between the tip of the thumb 63A and the tip of the index finger 63B of the left hand 63 so that it matches the length L of the label sticking area 71 (see FIG. 6(B)). The user then holds the thumb 63A and the index finger 63B of the left hand 63 within the detection range of the millimeter-wave radar 50 while maintaining that state (interval) (see FIG. 6(C)). At this time, if the thumb 63A and the index finger 63B of the left hand 63 are spread in the same orientation as in the calibration mode, and the left hand 63 is also in the same position relative to the electronic device 1 as in the calibration mode, the detection accuracy of the length L improves.
 In this state, the CPU 11 acquires the output signal of the millimeter-wave radar 50 (S20). The output signal is an example of a waveform signal, and the processing of S20 is an example of an acquisition process.
 Next, the CPU 11 analyzes the gesture based on the acquired output signal and, based on the analysis, detects the index finger and thumb held over the millimeter-wave radar 50 (S21). The processing of S21 is an example of a detection process.
 Here, "gesture" refers to the three-dimensional movement of the fingers of the hand held within the detection range of the millimeter-wave radar 50. In this embodiment, in S21 the CPU 11 analyzes the three-dimensional movement of the fingers based on the output signal of the millimeter-wave radar 50 and extracts the two fingers. The CPU 11 then determines whether the two fingers have been extracted (S22). If the CPU 11 determines that the two fingers have been extracted (S22: YES), it proceeds to S23. If it determines that the two fingers could not be extracted (S22: NO), it returns to S20.
 Next, in S23, the CPU 11 determines whether that state has been maintained for a certain period of time. A period of one to two seconds is desirable here, but the period is not limited to this and may be shorter or longer.
 If the user keeps the thumb 63A and the index finger 63B held over the millimeter-wave radar 50 in the same state for the certain period of time, the CPU 11 determines in S23 that the state has been maintained for the certain period of time (S23: YES), and then calculates the two-point distance between the tip of the extracted thumb 63A and the tip of the index finger 63B using the reference value measured in the calibration mode (S24). The processing of S24 is an example of a first measurement process.
 In this way, in this embodiment, the two-point distance between the tip of the thumb 63A and the tip of the index finger 63B is calculated using the reference value measured in the calibration mode and stored in the memory 12, so a more accurate distance can be calculated. Although this embodiment calculates the two-point distance between the tips of the extracted thumb 63A and index finger 63B using the reference value measured in the calibration mode, the distance may instead be calculated without that reference value if the resulting accuracy is acceptable.
 Also, in this embodiment, when the thumb 63A and the index finger 63B are detected continuously in the same state, measuring at that timing increases the likelihood of accurately measuring the distance the user intends, that is, the two-point distance between the thumb 63A and the index finger 63B. Here, "the same state" does not require the fingers to be in exactly the same position in the strict sense; some positional deviation is acceptable. Furthermore, if a sound effect such as a shutter sound is played at the moment of measurement, the user can recognize that the measurement is complete, which further improves convenience.
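 The hold check of S23 can be sketched as follows: successive distance samples count toward the hold only while each stays within a small tolerance of the value at the start of the current run, and the run restarts whenever movement is detected. This is one plausible reading of the behavior described; the 5 mm tolerance and the sampling scheme are assumptions, not taken from the document:

```python
def hold_elapsed(samples, tolerance_mm=5.0):
    """Given (timestamp_s, distance_mm) samples in time order, return how many
    seconds the distance has stayed within tolerance_mm of the run's start."""
    if not samples:
        return 0.0
    run_start = samples[0]
    for t, d in samples[1:]:
        if abs(d - run_start[1]) > tolerance_mm:
            run_start = (t, d)  # movement detected: restart the hold
    return samples[-1][0] - run_start[0]

# Fingers steady at ~80 mm for 1.5 s satisfies a 1-second hold requirement.
samples = [(0.0, 80.0), (0.5, 81.0), (1.0, 79.5), (1.5, 80.2)]
print(hold_elapsed(samples) >= 1.0)  # → True
```

 The CPU 11 would trigger the S24 measurement once the returned hold time reaches the required duration (one to two seconds in this embodiment).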
 Next, the CPU 11 pops up an image 64 (see FIG. 6(C)) showing the calculated two-point distance on the display 20 (S25), and then determines whether the user has selected "OK" (S26). The processing of S26 is an example of a decision process.
 In this way, in this embodiment, the image 64 showing the calculated two-point distance is displayed as a pop-up, so the user can easily recognize the length of the label paper, which further improves convenience.
 If the user touches the "OK" portion of the image 64, the CPU 11 determines in S26 that "OK" has been selected (S26: YES), then stores the calculated two-point distance in the memory 12 as the length of the label paper selected by the user (S27), and ends the measurement mode for the time being. The CPU 11 then returns to S20 and detects a new gesture.
 Then, as shown in FIG. 3, when the length of the label paper is stored in the memory 12, the label creation application 42 acquires the measurement-mode end information (T20) and, based on the label paper length stored in the memory 12, displays the label image display area 26 and the sheet object 28 on the display 20 at full scale, provided the stored label paper length fits on the display 20 (see FIG. 6(D)) (T21).
 In this way, in this embodiment, the label image display area 26 and the sheet object 28 are displayed at full scale on the display 20, so the user can easily recognize the length of the label paper, which further improves convenience. It also allows the user to easily input and edit the label paper using the label creation application 42, improving convenience.
 In this embodiment, the label image display area 26 and the sheet object 28 are displayed at full scale, but they may instead be displayed enlarged or reduced at a predetermined scale factor.
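 The display rule above (full scale when the label fits on the display, otherwise scaled) can be sketched as a small helper. The pixels-per-millimeter value and the fit-by-reduction fallback are assumptions for illustration; the document only states that full scale is used when the stored length fits:

```python
def display_scale(label_length_mm, display_width_px, px_per_mm=10.0):
    """Return the drawing scale for the sheet object 28: 1.0 (full scale)
    if the label fits on the display, otherwise the largest reduction
    that makes it fit."""
    label_px = label_length_mm * px_per_mm
    if label_px <= display_width_px:
        return 1.0
    return display_width_px / label_px

# 60 mm label on a 1000 px display at 10 px/mm fits → full scale.
print(display_scale(60.0, 1000))   # → 1.0
# 150 mm label (1500 px) does not fit → reduced (to roughly 0.67).
print(display_scale(150.0, 1000))
```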
 On the other hand, if the CPU 11 determines in S22 that the two fingers could not be detected (S22: NO), determines in S23 that the state was not maintained for the certain period of time (S23: NO), or determines in S26 that "OK" was not selected (S26: NO), it returns to S20 and detects a gesture again.
 In this embodiment, when the calibration mode of the gesture analysis application 44 ends, the measurement mode of the gesture analysis application 44 is then started automatically. However, as shown in FIG. 5(B), a processing step (S28) may be added before S20 that determines whether the user has touched the display 20 somewhere, or has touched the sheet object 28 to select it, so that the measurement mode of the gesture analysis application 44 is started based on the user's instruction. The processing of S28 is an example of a reception process.
 The user's instruction in this case is not limited to touching the display 20 and may instead be, for example, a voice instruction or an instruction via a hardware switch.
 Also, in this embodiment, the measurement mode of the gesture analysis application 44 is started after its calibration mode ends, but the calibration mode may be omitted, and the gesture analysis application 44 may consist of the measurement mode alone. For example, when the length of the label paper is already stored in the memory 12, the calibration mode may be omitted.
 As described in detail above, in this embodiment, the user measures the length L of the label sticking area 71 of the label sticking object 70 with the tip of the thumb 63A and the tip of the index finger 63B, and then performs, within the detection range of the millimeter-wave radar 50, a predetermined gesture that indicates the length by the two-point distance between the tip of the thumb 63A and the tip of the index finger 63B. The electronic device 1 can then measure the length indicated by that predetermined gesture, that is, the two-point distance between the tips of the two fingers, and thereby obtain the length L of the label sticking area 71.
 Therefore, in this embodiment, because the measurement does not depend on the position of a touch on the electronic device 1, it is largely unconstrained by the size of the input I/F 30 (display 20), and because measurement points can be detected without contact, measurement is possible over a range wider than the input I/F 30 (display 20). Moreover, in this embodiment, the length L of the label sticking area 71 can be measured directly with two fingers and reflected as the length L of the sheet object 28, so label paper reflecting the length L of the label sticking area 71 can be created easily without any special tool such as a ruler.
(Second Embodiment)
 Hereinafter, a second embodiment embodying the gesture analysis application incorporated in an electronic device will be described in detail with reference to the attached drawings. In this description, elements having the same functions and effects as in the first embodiment are given the same reference numerals.
 That is, the embodiment described above measures the two-point distance between the tips of the two fingers and reflects it as the length L of the sheet object 28. The present embodiment differs in that both the two-point distance between the tips of the two fingers and the distance between the electronic device 1 and the left hand 63 are measured, and, depending on the distance between the electronic device 1 and the left hand 63, the measured two-point distance can be reflected either as the longitudinal length of the sheet object 28 or as the size of the input text 27.
 FIG. 7 is a flowchart showing the procedure in the measurement mode of the gesture analysis processing performed by the gesture analysis application 44 according to the second embodiment, and is described in detail below with reference to this flowchart.
 That is, if the CPU 11 determines in S22 that the two fingers have been detected (S22: YES), it pops up an image on the display 20 in the next step S30 indicating that the fingers are being detected, and then determines in S31 whether that state has been maintained for a certain period of time. A period of two to three seconds is desirable here, but the period is not limited to this and may be shorter or longer.
 If the user keeps the thumb 63A and the index finger 63B of the left hand 63 held over the millimeter-wave radar 50 in the same state for the certain period of time, the CPU 11 determines in S31 that the state has been maintained for the certain period of time (S31: YES) and then calculates the distance between the electronic device 1 and the tips of the extracted thumb 63A and index finger 63B (S32). The processing of S32 is an example of a second measurement process.
 In this embodiment, measuring at the moment when the thumb 63A and index finger 63B have been detected continuously in the same state increases the likelihood of accurately measuring the distance the user intends, that is, the two-point distance between the thumb 63A and index finger 63B. "The same state" does not require the fingers to be in exactly the same position in a strict sense; some positional deviation is acceptable. Further, playing a sound effect such as a shutter sound at the moment of measurement lets the user know that the measurement is complete, which further improves convenience.
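 The hold-to-measure behavior of S30 to S32 (detect two fingers, require the pose to be held for a fixed period, then measure) can be sketched as follows. The sketch is purely illustrative and is not the disclosed implementation: the per-frame fingertip source `next_fingertips`, the 3-D point format, and the 5 mm drift tolerance are all assumptions introduced here.

```python
import time

HOLD_SECONDS = 2.0        # fixed period; 2-3 seconds is suggested in the text
POSITION_TOLERANCE = 5.0  # assumed fingertip drift allowed for "the same state" (mm)

def _distance(p, q):
    """Euclidean distance between two 3-D points (mm)."""
    return sum((a - b) ** 2 for a, b in zip(p, q)) ** 0.5

def wait_for_steady_gesture(next_fingertips, hold_seconds=HOLD_SECONDS,
                            tolerance=POSITION_TOLERANCE, clock=time.monotonic):
    """Return (thumb_tip, index_tip) once both fingertips have stayed within
    `tolerance` of a reference position for `hold_seconds` (S31: YES);
    return None if the fingers leave the detection range (S22: NO)."""
    reference = None
    start = None
    for tips in next_fingertips():       # hypothetical per-frame fingertip source
        if tips is None:                 # fingers lost -> abort
            return None
        thumb, index = tips
        if reference is None:
            reference, start = (thumb, index), clock()
            continue
        if (_distance(thumb, reference[0]) > tolerance or
                _distance(index, reference[1]) > tolerance):
            reference, start = (thumb, index), clock()   # moved: restart the timer
            continue
        if clock() - start >= hold_seconds:
            return thumb, index          # pose held long enough -> measure (S32)
    return None
```

Injecting the clock makes the hold period testable without real-time waits; a production version would simply use the default `time.monotonic`.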
 Next, the CPU 11 determines whether the calculated distance between the electronic device 1 and the extracted thumb 63A and index finger 63B is equal to or less than a predetermined value (S33).
 If the distance between the electronic device 1 and the thumb 63A and index finger 63B extended over the millimeter-wave radar 50 is not equal to or less than the predetermined value, the CPU 11 makes a negative determination in S33 (S33: NO) and then determines in S34 whether a selected input text 27 exists. A predetermined value of, for example, 100 mm is suitable, but the value is not limited to this.
 If the user has touched and selected the input text 27 displayed on the sheet object 28, the CPU 11 determines in S34 that a selected input text 27 exists (S34: YES), and then, in S35, calculates the two-point distance between the tip of the extracted thumb 63A and the tip of the index finger 63B using the reference value measured in the calibration mode and stored in the memory 12 (S35). At this time, a notification may be given that the character size of the input text 27 is being calculated.
 Because this embodiment calculates the two-point distance between the tip of the thumb 63A and the tip of the index finger 63B using the reference value measured in the calibration mode and stored in the memory 12, the two-point distance can be calculated more accurately.
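 One simple way to realize the calibration described here — not necessarily the method of the embodiment — is to store a linear scale factor obtained when the user spans an object of known length in the calibration mode, and to apply that factor to subsequent raw measurements. The function names and the linear correction model are assumptions for illustration only.

```python
def calibration_scale(raw_two_point_mm, known_length_mm):
    """Calibration mode: the user spans an object of known length with the
    thumb and index finger; the ratio of true length to the raw radar
    measurement is stored in memory as the reference value."""
    return known_length_mm / raw_two_point_mm

def corrected_two_point_mm(raw_two_point_mm, scale):
    """Measurement mode (S35/S38): apply the stored reference so the result
    is insensitive to a per-user bias in how fingertips appear to the radar."""
    return raw_two_point_mm * scale
```

For example, if a 54 mm reference is measured as 48 mm raw, the stored factor rescales every later raw reading by 54/48.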
 Next, the CPU 11 stores the calculated two-point distance in the memory 12 as the character size of the input text 27 selected by the user (S36), then returns to S20 and detects a gesture again. The character size of the input text 27 is an example of a designated level.
 In the electronic device 1, the label creation application 42 then acquires the measurement-mode-end information shown in FIG. 3 (T20), and changes the character size of the input text 27 displayed on the sheet object 28 based on the character size information stored in the memory 12.
 On the other hand, if the CPU 11 determines in S31 that the state was not maintained for the fixed period (S31: NO), or determines in S34 that no selected input text 27 exists (S34: NO), the processing returns to S20 and a gesture is detected again.
 On the other hand, if the distance between the electronic device 1 and the thumb 63A and index finger 63B extended over the millimeter-wave radar 50 is equal to or less than the predetermined value, the CPU 11 makes an affirmative determination in S33 (S33: YES), and then determines in S37 whether a selected sheet object 28 exists.
 If the user has touched and selected the sheet object 28, the CPU 11 determines in S37 that a selected sheet object 28 exists (S37: YES), and then calculates the two-point distance between the tip of the extracted thumb 63A and the tip of the index finger 63B using the reference value measured in the calibration mode and stored in the memory 12 (S38). At this time, a notification may be given that the length of the label paper is being calculated.
 Because this embodiment calculates the two-point distance between the tip of the thumb 63A and the tip of the index finger 63B using the reference value measured in the calibration mode and stored in the memory 12, the two-point distance can be calculated more accurately without, for example, being affected by variation in the shape of the user's fingers.
 Next, the CPU 11 stores the calculated two-point distance in the memory 12 as the length of the label paper selected by the user (S39), then returns to S20 and detects a gesture again.
 In the electronic device 1, the label creation application 42 then acquires the measurement-mode-end information shown in FIG. 3 (T20), and changes the length of the label image display area and the sheet object 28 to the length L, based on the label paper length information stored in the memory 12, and displays them (see FIG. 6(D)).
 On the other hand, if the CPU 11 determines in S22 that two fingers could not be detected (S22: NO), it stops displaying the image on the display 20 indicating that fingers are being detected in the next step S40.
 Next, the CPU 11 deselects the input text 27 and the sheet object 28 selected by the user (S41), returns to S20, and detects a gesture again.
 As described above in detail, in this embodiment the calculated two-point distance can be reflected as the length of the sheet object 28 or as the character size of the input text 27 according to the distance between the electronic device 1 and the thumb 63A and index finger 63B extended over the millimeter-wave radar 50, which improves convenience.
 In this embodiment, the calculated two-point distance is reflected as the character size of the input text 27 according to the distance between the electronic device 1 and the thumb 63A and index finger 63B; however, the color of the input text 27 or the font type of the input text 27 may instead be changed based on the calculated two-point distance.
 Also, in this embodiment, the calculated two-point distance is reflected as the length of the label paper when the distance between the electronic device 1 and the thumb 63A and index finger 63B extended over the millimeter-wave radar 50 is equal to or less than the predetermined value, and as the character size of the input text 27 when it exceeds the predetermined value. The reverse mapping is also acceptable: for example, the two-point distance may be reflected as the character size of the input text 27 when the distance is equal to or less than the predetermined value, and as the length of the label paper when it exceeds the predetermined value.
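 The branching on the hand-to-device distance (S33) and the two mappings described above can be illustrated as follows. The `doc` dictionary, its keys, and the return values are hypothetical stand-ins for the editor state, and the 100 mm threshold is only the example value given in the text.

```python
DEVICE_DISTANCE_THRESHOLD_MM = 100.0  # the "predetermined value" (example: 100 mm)

def apply_measurement(two_point_mm, device_distance_mm, doc,
                      threshold=DEVICE_DISTANCE_THRESHOLD_MM):
    """Reflect the measured two-point distance into the editor state
    according to how far the hand is from the device (S33 -> S36/S39).
    Returns which attribute was updated, or None if nothing was selected."""
    if device_distance_mm <= threshold:
        if doc.get("selected_sheet_object"):      # S37: YES
            doc["label_length_mm"] = two_point_mm  # S39: label paper length
            return "label_length"
    else:
        if doc.get("selected_input_text"):        # S34: YES
            doc["char_size_mm"] = two_point_mm     # S36: character size
            return "char_size"
    return None  # nothing selected -> back to gesture detection (S20)
```

Swapping the two branches would give the reverse mapping mentioned at the end of the paragraph.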
 The embodiments described above are merely illustrative and do not limit the present invention in any way. The present invention can, of course, be improved and modified in various ways without departing from its gist.
 That is, in the present embodiment the calculated two-point distance is reflected in the length of the label paper, but it may instead be reflected in the width of the label paper, that is, the width of the label image display area 26, or in the length or width of the sheet object 28.
 Also, in the present embodiment the thumb 63A and index finger 63B of the left hand 63 are extended into the detection range of the millimeter-wave radar 50, but, for example, the index finger 63B of the left hand 63 and the index finger of the right hand may be extended instead. In other words, the gesture is not limited to one performed with a single hand.
 The thumb 63A or index finger 63B of the left hand 63 together with another finger of the left hand 63 may also be extended into the detection range of the millimeter-wave radar 50. Moreover, feet, for example, may be used instead of hands.
 In the present embodiment the label creation application 42 and the gesture analysis application 44 are configured as independent applications, but the label creation application 42 may incorporate the functions of the gesture analysis application 44; that is, both applications may be implemented as a single application.
 In the present embodiment the gesture analysis application 44 is started automatically when the label creation application 42 is started, but the two applications may be started independently. In that case, an icon for starting the label creation application 42 and an icon for starting the gesture analysis application 44 may each be displayed on the display 20.
 The gesture analysis application 44 may also be started when an input/editing target, for example the sheet object 28 or the input text 27, is selected by the user.
 In any flowchart disclosed in the embodiments, the order of execution of the processes in any plural steps may be changed arbitrarily, or the processes may be executed in parallel, as long as no contradiction arises in the processing content.
 The processing disclosed in the embodiments may be executed by a single CPU, a plurality of CPUs, hardware such as an ASIC, or a combination thereof. The processing disclosed in the embodiments may also be realized in various forms, such as a recording medium on which a program for executing the processing is recorded, or a method.
 1 Electronic device
 2 Printer
 10 Controller
 12 Memory
 20 Display
 21 Virtual keyboard
 26 Label image display area
 27 Input text
 28 Sheet object
 30 Input I/F
 41 OS
 42 Label creation application
 44 Gesture analysis application
 50 Millimeter-wave radar

Claims (13)

  1. A program executable by a computer of an electronic device comprising an electromagnetic wave sensor, the electromagnetic wave sensor being a sensor that receives electromagnetic waves whose wavelength is on the order of millimeters or smaller,
     the program causing the computer to execute:
      an acquisition process of acquiring a waveform signal based on the electromagnetic waves received by the electromagnetic wave sensor; and
      a detection process of detecting, based on the waveform signal acquired in the acquisition process, a predetermined gesture made with two fingers, the predetermined gesture indicating a length by the distance between the tip of one of the two fingers and the tip of the other of the two fingers,
     the program further causing the computer to execute:
      a first measurement process of measuring a two-point distance between a point indicating the tip of the one finger and a point indicating the tip of the other finger in the predetermined gesture detected in the detection process; and
      a determination process of determining, based on the two-point distance measured in the first measurement process, the length indicated by the predetermined gesture.
  2. The program according to claim 1, wherein, in the first measurement process, the two-point distance is measured when the predetermined gesture is detected continuously at the same position for a fixed period or longer in the detection process.
  3. The program according to claim 1, causing the computer to execute a reception process of receiving a measurement instruction via an input interface of the electronic device, and to execute the first measurement process in response to receiving the measurement instruction in the reception process.
  4. The program according to any one of claims 1 to 3, causing the computer to execute a print process of generating a print job for printing an image on an elongated print medium, the attributes of the print job including the longitudinal length of the print medium,
     wherein, in the determination process, the longitudinal length of the print medium is determined as the indicated length.
  5. The program according to claim 4, causing the computer to execute a preview display process of displaying a preview of the print medium on a display of the electronic device,
     wherein, when the longitudinal length of the print medium is determined in the determination process, the preview of the print medium displayed in the preview display process is also changed based on the longitudinal length of the print medium determined in the determination process.
  6. The program according to any one of claims 1 to 5, wherein, in the determination process, the two-point distance measured in the first measurement process is determined, at a 1:1 scale, as the indicated length.
  7. The program according to any one of claims 1 to 6, causing the computer, after executing the first measurement process and before executing the determination process, to execute a confirmation process of receiving, via an input interface of the electronic device, whether or not to determine the two-point distance measured in the first measurement process as the indicated length,
     wherein the determination process is executed when the confirmation process receives an instruction to determine the two-point distance measured in the first measurement process as the indicated length, and
     the determination process is not executed when the confirmation process receives an instruction not to determine the two-point distance measured in the first measurement process as the indicated length.
  8. The program according to any one of claims 1 to 7, causing the computer to execute a second measurement process of measuring, based on the waveform signal acquired in the acquisition process, the distance between the predetermined gesture and the electronic device,
     wherein the determination process can determine the indicated length for two types of objects, a first object and a second object; when the distance measured in the second measurement process is shorter than a threshold, the indicated length of the first object is determined based on the two-point distance measured in the first measurement process, and when the distance measured in the second measurement process is not shorter than the threshold, the indicated length of the second object is determined based on the two-point distance measured in the first measurement process.
  9. The program according to any one of claims 1 to 8, causing the computer to execute:
      when the first measurement process is executed in a first state in which a reference length has been specified, a storage process of storing the two-point distance measured in the first measurement process in association with the reference length, without executing the determination process; and
      when the first measurement process is executed in a second state in which no reference length has been specified, the determination process, wherein the determination process determines the indicated length based on the two-point distance measured in the first measurement process and the two-point distance stored in association with the reference length in the storage process.
  10. The program according to claim 9, causing the computer to execute, in the first state, a reference display process of displaying an image showing the reference length on a display of the electronic device.
  11. An electronic device comprising:
      an electromagnetic wave sensor; and
      a computer,
     wherein the electromagnetic wave sensor is a sensor that receives electromagnetic waves whose wavelength is on the order of millimeters or smaller, and
     the computer executes:
      an acquisition process of acquiring a waveform signal based on the electromagnetic waves received by the electromagnetic wave sensor;
      a detection process of detecting, based on the waveform signal acquired in the acquisition process, a predetermined gesture made with two fingers, the predetermined gesture indicating a length by the distance between the tip of one of the two fingers and the tip of the other of the two fingers;
      a first measurement process of measuring a two-point distance between a point indicating the tip of the one finger and a point indicating the tip of the other finger in the predetermined gesture detected in the detection process; and
      a determination process of determining, based on the two-point distance measured in the first measurement process, the length indicated by the predetermined gesture.
  12. A program executable by a computer of an electronic device comprising an electromagnetic wave sensor, the electromagnetic wave sensor being a sensor that receives electromagnetic waves whose wavelength is on the order of millimeters or smaller,
     the program causing the computer to execute:
      an acquisition process of acquiring a waveform signal based on the electromagnetic waves received by the electromagnetic wave sensor; and
      a detection process of detecting, based on the waveform signal acquired in the acquisition process, a predetermined gesture made with two fingers, the predetermined gesture indicating a designated level by the distance between the tip of one of the two fingers and the tip of the other of the two fingers,
     the program further causing the computer to execute:
      a first measurement process of measuring a two-point distance between a point indicating the tip of the one finger and a point indicating the tip of the other finger in the predetermined gesture detected in the detection process; and
      a determination process of determining, based on the two-point distance measured in the first measurement process, the designated level indicated by the predetermined gesture.
  13. An electronic device comprising:
      an electromagnetic wave sensor; and
      a computer,
     wherein the electromagnetic wave sensor is a sensor that receives electromagnetic waves whose wavelength is on the order of millimeters or smaller, and
     the computer executes:
      an acquisition process of acquiring a waveform signal based on the electromagnetic waves received by the electromagnetic wave sensor;
      a detection process of detecting, based on the waveform signal acquired in the acquisition process, a predetermined gesture made with two fingers, the predetermined gesture indicating a designated level by the distance between the tip of one of the two fingers and the tip of the other of the two fingers;
      a first measurement process of measuring a two-point distance between a point indicating the tip of the one finger and a point indicating the tip of the other finger in the predetermined gesture detected in the detection process; and
      a determination process of determining, based on the two-point distance measured in the first measurement process, the designated level indicated by the predetermined gesture.
PCT/JP2021/007856 2020-03-24 2021-03-02 Program and electronic device WO2021192841A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020-052666 2020-03-24
JP2020052666A JP2021152727A (en) 2020-03-24 2020-03-24 Program and electronic device

Publications (1)

Publication Number Publication Date
WO2021192841A1 true WO2021192841A1 (en) 2021-09-30

Family

ID=77886598

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/007856 WO2021192841A1 (en) 2020-03-24 2021-03-02 Program and electronic device

Country Status (2)

Country Link
JP (1) JP2021152727A (en)
WO (1) WO2021192841A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2013077215A (en) * 2011-09-30 2013-04-25 Rakuten Inc Retrieval device, retrieval method, recording medium, and program
JP2014211769A (en) * 2013-04-18 2014-11-13 キヤノン株式会社 Information processing apparatus and control method of the same
JP2017505900A (en) * 2013-12-26 2017-02-23 インターナショナル・ビジネス・マシーンズ・コーポレーションInternational Business Machines Corporation Radar integration with handheld electronic devices
JP2017524170A (en) * 2014-08-07 2017-08-24 グーグル インコーポレイテッド Radar-based gesture recognition

Family Cites Families (6)

Publication number Priority date Publication date Assignee Title
CN108604733B (en) * 2016-01-26 2021-07-30 纽威莱克公司 Millimeter wave sensor system for gesture and motion analysis
US10466772B2 (en) * 2017-01-09 2019-11-05 Infineon Technologies Ag System and method of gesture detection for a remote device
US20200026360A1 (en) * 2018-07-19 2020-01-23 Infineon Technologies Ag Gesture Detection System and Method Using Radar Sensors
US20200073480A1 (en) * 2018-08-31 2020-03-05 Qualcomm Incorporated GESTURE CLASSIFICATION AND CONTROL USING mm WAVE RADAR
CN110658516B (en) * 2019-10-14 2022-11-25 重庆邮电大学 Gesture target extraction method based on FMCW radar variance frequency statistics
CN110765974B (en) * 2019-10-31 2023-05-02 复旦大学 Micro gesture recognition method based on millimeter wave radar and convolutional neural network


Also Published As

Publication number Publication date
JP2021152727A (en) 2021-09-30


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21775475

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21775475

Country of ref document: EP

Kind code of ref document: A1