WO2014175395A1 - Apparatus, Method, and Program - Google Patents
Apparatus, Method, and Program
- Publication number
- WO2014175395A1 (PCT/JP2014/061606)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- touch screen
- displayed
- smartphone
- screen
- controller
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/02—Input arrangements using manually operated switches, e.g. using keyboards or dials
- G06F3/023—Arrangements for converting discrete items of information into a coded form, e.g. arrangements for interpreting keyboard generated codes as alphanumeric codes, operand codes or instruction codes
- G06F3/0233—Character input methods
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/0412—Digitisers structurally integrated in a display
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/0486—Drag-and-drop
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
Definitions
- This application relates to an apparatus, a method, and a program.
- the present application relates to a device having a touch screen, a method for controlling the device, and a program for controlling the device.
- a touch screen device equipped with a touch screen is known.
- Touch screen devices include, for example, smartphones and tablets.
- the touch screen device detects a finger, pen, or stylus pen gesture via the touch screen.
- the touch screen device operates according to the detected gesture.
- An example of the operation according to the detected gesture is described in Patent Document 1, for example.
- the basic operation of the touch screen device is realized by an OS (Operating System) installed in the device.
- the OS installed in the touch screen device is, for example, Android (registered trademark), BlackBerry (registered trademark) OS, iOS, Symbian (registered trademark) OS, or Windows (registered trademark) Phone.
- An apparatus according to one aspect includes a touch screen display and a controller that, when the touch screen display detects that an object has moved in a predetermined direction while in contact with the touch screen display, displays an object capable of character input on the screen displayed on the touch screen display along the trajectory of the movement.
- An apparatus according to one aspect includes a touch screen display and a controller that, when a swipe is performed on the touch screen display, displays an object capable of character input on the screen displayed on the touch screen display along the swipe trajectory.
- An apparatus according to one aspect includes a touch screen display and a controller that, when the touch screen display is traced, displays an object capable of character input, in the traced area, on the screen displayed on the touch screen display.
- A method according to one aspect is a method of controlling an apparatus including a touch screen display, and includes the step of, when the touch screen display detects that an object has moved in a predetermined direction while in contact with the touch screen display, displaying an object capable of character input on the screen displayed on the touch screen display along the trajectory of the movement.
- A program according to one aspect causes an apparatus including a touch screen display to execute the step of, when the touch screen display detects that an object has moved in a predetermined direction while in contact with the touch screen display, displaying an object capable of character input on the screen displayed on the touch screen display along the trajectory of the movement.
- FIG. 1 is a perspective view illustrating an appearance of a smartphone according to the embodiment.
- FIG. 2 is a front view illustrating an appearance of the smartphone according to the embodiment.
- FIG. 3 is a rear view illustrating an appearance of the smartphone according to the embodiment.
- FIG. 4 is a diagram illustrating an example of the home screen.
- FIG. 5 is a block diagram illustrating functions of the smartphone according to the embodiment.
- FIG. 6 is a diagram illustrating a first example of control performed by the smartphone according to the embodiment.
- FIG. 7 is a flowchart illustrating a control processing procedure performed by the smartphone according to the embodiment.
- FIG. 8 is a diagram illustrating processing for inputting characters into the memo object.
- FIG. 9A is a diagram illustrating a second example of control performed by the smartphone according to the embodiment.
- FIG. 9B is a diagram illustrating a second example of control performed by the smartphone according to the embodiment.
- FIG. 10 is a diagram illustrating a third example of control performed by the smartphone according to the embodiment.
- FIG. 11A is a diagram illustrating a fourth example of control performed by the smartphone according to the embodiment.
- FIG. 11B is a diagram illustrating a fourth example of control performed by the smartphone according to the embodiment.
- FIG. 12 is a diagram illustrating a fifth example of control performed by the smartphone according to the embodiment.
- FIG. 13 is a diagram illustrating a sixth example of control performed by the smartphone according to the embodiment.
- the smartphone 1 has a housing 20.
- the housing 20 includes a front face 1A, a back face 1B, and side faces 1C1 to 1C4.
- The front face 1A is the front surface of the housing 20.
- The back face 1B is the back surface of the housing 20.
- the side faces 1C1 to 1C4 are side faces that connect the front face 1A and the back face 1B.
- the side faces 1C1 to 1C4 may be collectively referred to as the side face 1C without specifying which face.
- the smartphone 1 has a touch screen display 2, buttons 3A to 3C, an illuminance sensor 4, a proximity sensor 5, a receiver 7, a microphone 8, and a camera 12 on the front face 1A.
- the smartphone 1 has a speaker 11 and a camera 13 on the back face 1B.
- the smartphone 1 has buttons 3D to 3F and a connector 14 on the side face 1C.
- the buttons 3A to 3F may be collectively referred to as the button 3 without specifying which button.
- the touch screen display 2 has a display 2A and a touch screen 2B.
- the display 2A includes a display device such as a liquid crystal display (Liquid Crystal Display), an organic EL panel (Organic Electro-Luminescence panel), or an inorganic EL panel (Inorganic Electro-Luminescence panel).
- the display 2A displays characters, images, symbols, graphics, and the like.
- the touch screen 2B detects contact of a finger, a pen, a stylus pen, or the like with the touch screen 2B.
- the touch screen 2B can detect a position where a plurality of fingers, a pen, a stylus pen, or the like is in contact with the touch screen 2B.
- the detection method of the touch screen 2B may be any method such as a capacitance method, a resistive film method, a surface acoustic wave method (or an ultrasonic method), an infrared method, an electromagnetic induction method, and a load detection method.
- a finger, pen, stylus pen, or the like that the touch screen 2B detects contact may be simply referred to as a “finger”.
- the smartphone 1 determines the type of gesture based on at least one of the contact detected by the touch screen 2B, the position at which the contact is detected, the interval at which the contact is detected, and the number of times the contact is detected.
- the gesture is an operation performed on the touch screen 2B.
- the gesture discriminated by the smartphone 1 includes touch, long touch, release, swipe, tap, double tap, long tap, drag, flick, pinch in, pinch out, and the like.
- “Touch” is a gesture in which a finger touches the touch screen 2B.
- the smartphone 1 determines a gesture in which a finger contacts the touch screen 2B as a touch.
- the smartphone 1 determines a gesture in which a single finger contacts the touch screen 2B as a single touch.
- the smartphone 1 determines a gesture in which a plurality of fingers come into contact with the touch screen 2B as multi-touch. In the case of multi-touch, the smartphone 1 detects the number of fingers whose contact has been detected.
- “Long touch” is a gesture in which a finger touches the touch screen 2B for a predetermined time or more.
- the smartphone 1 determines a gesture in which a finger is in contact with the touch screen 2B for a predetermined time or more as a long touch.
- “Release” is a gesture in which a finger leaves the touch screen 2B.
- the smartphone 1 determines that a gesture in which a finger leaves the touch screen 2B is a release.
- “Swipe” is a gesture in which a finger moves while touching the touch screen 2B.
- the smartphone 1 determines a gesture that moves while the finger is in contact with the touch screen 2B as a swipe.
- “Tap” is a gesture for releasing following a touch.
- the smartphone 1 determines a gesture that is released following a touch as a tap.
- the “double tap” is a gesture in which a gesture for releasing following a touch is continued twice.
- the smartphone 1 determines a gesture in which a gesture for releasing following a touch is continued twice as a double tap.
- “Long tap” is a gesture to release following a long touch.
- the smartphone 1 determines a gesture for releasing following a long touch as a long tap.
- “Drag” is a gesture for performing a swipe starting from an area where a movable object is displayed.
- the smartphone 1 determines, as a drag, a gesture for performing a swipe starting from an area where a movable object is displayed.
- “Flick” is a gesture that is released while the finger moves in one direction following the touch.
- the smartphone 1 determines, as a flick, a gesture that is released while the finger moves in one direction following the touch.
- the movement speed of the flick is higher than the movement speed of the swipe and the drag.
- Flicks are distinguished by direction: an "upper flick" in which the finger moves upward on the screen, a "lower flick" in which the finger moves downward, a "right flick" in which the finger moves rightward, a "left flick" in which the finger moves leftward, an "oblique upper-left flick" in which the finger moves diagonally toward the upper left, an "oblique lower-left flick" toward the lower left, an "oblique upper-right flick" toward the upper right, an "oblique lower-right flick" toward the lower right, and the like.
- “Pinch-in” is a gesture that swipes in the direction in which multiple fingers approach.
- the smartphone 1 determines, as a pinch-in, a gesture that swipes in a direction in which a plurality of fingers approach.
- “Pinch out” is a gesture of swiping in a direction in which a plurality of fingers move away.
- the smartphone 1 determines a gesture of swiping in a direction in which a plurality of fingers move away as a pinch out.
- The smartphone 1 operates according to these gestures determined via the touch screen 2B, realizing operability that is intuitive and easy to use for the user.
- the operation performed by the smartphone 1 according to the determined gesture differs depending on the screen displayed on the display 2A.
- In the following description, "the touch screen 2B detects contact, and the smartphone 1 determines that the gesture type is X based on the detected contact" may be abbreviated as "the smartphone 1 detects X" or "the controller detects X".
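The gesture discrimination described above can be sketched as a small classifier over a contact's start point, end point, and duration. A minimal illustration in Python; the threshold values and function names are assumptions for illustration, not values specified in this document:

```python
# Minimal sketch of gesture discrimination from a single-finger contact.
# Thresholds are illustrative assumptions, not values from the document.
LONG_TOUCH_MS = 500      # minimum contact duration for a "long touch"
MOVE_THRESHOLD = 10.0    # pixels; below this the contact counts as stationary
FLICK_SPEED = 1.0        # pixels/ms; a faster moving release is a "flick"

def classify_gesture(start, end, duration_ms):
    """Classify a gesture from its start/end points (x, y) and
    contact duration in milliseconds."""
    dx = end[0] - start[0]
    dy = end[1] - start[1]
    distance = (dx * dx + dy * dy) ** 0.5
    if distance < MOVE_THRESHOLD:
        # Finger did not move: tap or long tap depending on duration.
        return "long tap" if duration_ms >= LONG_TOUCH_MS else "tap"
    # Finger moved: a fast movement is a flick, a slower one a swipe.
    speed = distance / max(duration_ms, 1)
    return "flick" if speed > FLICK_SPEED else "swipe"
```

For example, a 200-pixel movement completed in 100 ms classifies as a flick, while the same movement over 1000 ms classifies as a swipe, matching the description that a flick's movement speed is higher than a swipe's.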
- FIG. 4 shows an example of the home screen.
- the home screen may be called a desktop or a standby screen.
- the home screen is displayed on the display 2A.
- the home screen is a screen that allows the user to select which application to execute from among the applications installed in the smartphone 1.
- the smartphone 1 executes the application selected on the home screen in the foreground.
- the screen of the application executed in the foreground is displayed on the display 2A.
- The smartphone 1 can place icons 50 on the home screen 40.
- a plurality of icons 50 are arranged on the home screen 40 shown in FIG.
- Each icon 50 is associated with an application installed in the smartphone 1 in advance.
- the smartphone 1 detects a gesture for the icon 50, the smartphone 1 executes an application associated with the icon 50.
- the smartphone 1 executes the mail application when a tap on the icon 50 associated with the mail application is detected.
- the icon 50 includes an image and a character string.
- the icon 50 may include a symbol or a graphic instead of the image.
- the icon 50 may not include either an image or a character string.
- the icons 50 are arranged based on the arrangement pattern.
- a wallpaper 41 is displayed behind the icon 50.
- the wallpaper is sometimes called a photo screen or a back screen.
- the smartphone 1 can use any image as the wallpaper 41.
- An arbitrary image may be determined as the wallpaper 41 according to the setting of the user.
- The smartphone 1 can increase or decrease the number of home screens 40. For example, the smartphone 1 determines the number of home screens 40 according to the setting by the user. Even when there are a plurality of home screens 40, the smartphone 1 displays only the selected one on the display 2A.
- the smartphone 1 displays a locator 51 including one or more symbols on the home screen 40.
- the number of symbols matches the number of home screens 40.
- the locator 51 indicates which home screen 40 is currently displayed. In the locator 51, the symbol corresponding to the currently displayed home screen 40 is displayed in a manner different from other symbols.
- a locator 51 including three symbols is displayed. This indicates that the number of home screens 40 is three. Each of the three symbols is circular.
- the leftmost symbol is displayed in a manner different from other symbols. That is, symbols other than the leftmost symbol are displayed as a circular frame, but the leftmost symbol is displayed in a state where the inside of the circular frame is filled. This indicates that the leftmost home screen 40 is currently displayed on the display 2A.
- When the smartphone 1 detects a horizontal gesture while displaying the home screen 40, the smartphone 1 switches the home screen 40 displayed on the display 2A according to the gesture. For example, when detecting a right flick, the smartphone 1 switches to the home screen 40 on the left. When detecting a left flick, the smartphone 1 switches to the home screen 40 on the right.
- the process in which the smartphone 1 switches the home screen 40 in the horizontal direction according to the left / right flick gesture will be described in detail later.
- the smartphone 1 switches the home screen 40 in the horizontal direction, the display of the locator 51 is updated according to the position of the home screen 40 after switching.
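The switching and locator-update behavior can be modeled with a small sketch. The class and its names are hypothetical; it only assumes the semantics described above (a right flick reveals the home screen on the left and vice versa, and the locator marks the current screen differently from the others):

```python
class HomeScreenPager:
    """Toy model of horizontal home-screen switching with a locator.
    Illustrative only; names and structure are not from the document."""

    def __init__(self, num_screens):
        self.num_screens = num_screens
        self.current = 0  # index of the currently displayed home screen

    def on_flick(self, direction):
        # A right flick reveals the home screen to the left, and a left
        # flick the one to the right; flicks past either end do nothing.
        if direction == "right" and self.current > 0:
            self.current -= 1
        elif direction == "left" and self.current < self.num_screens - 1:
            self.current += 1

    def locator(self):
        # One symbol per home screen; the filled circle marks the
        # currently displayed screen, hollow circles the others.
        return "".join("●" if i == self.current else "○"
                       for i in range(self.num_screens))
```

With three home screens the locator starts as "●○○"; after a left flick it reads "○●○", mirroring the locator 51 update described above.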
- a region 42 is provided at the upper end of the display 2A.
- In the area 42, a remaining amount mark 43 indicating the remaining charge of the rechargeable battery and a radio wave level mark 44 indicating the field strength of the radio wave for communication are displayed.
- the smartphone 1 may display the time, weather, running application, communication system type, telephone status, device mode, event occurring in the device, and the like in the area 42.
- the area 42 is used for various notifications to the user.
- the area 42 may be provided on a screen other than the home screen 40. The position where the region 42 is provided is not limited to the upper end of the display 2A.
- the vertical direction of the home screen 40 is a direction based on the vertical direction of characters or images displayed on the display 2A. Therefore, in the home screen 40, the side closer to the region 42 in the longitudinal direction of the touch screen display 2 is the upper side of the home screen 40, and the side far from the region 42 is the lower side of the home screen 40.
- the side where the radio wave level mark 44 is displayed in the area 42 is the right side of the home screen 40, and the side where the remaining amount mark 43 is displayed in the area 42 is the left side of the home screen 40.
- the smartphone 1 determines, for example, the upper left direction, the lower right direction, the left direction, and the right direction of the home screen 40 based on the vertical direction of the characters or images displayed on the home screen 40.
- The home screen 40 shown in FIG. 4 is an example; the form and arrangement of the various elements, the number of home screens 40, the manner of performing various operations on the home screen 40, and the like do not have to be as described above.
- FIG. 5 is a block diagram showing the configuration of the smartphone 1.
- The smartphone 1 includes a touch screen display 2, buttons 3, an illuminance sensor 4, a proximity sensor 5, a communication unit 6, a receiver 7, a microphone 8, a storage 9, a controller 10, a speaker 11, cameras 12 and 13, a connector 14, an acceleration sensor 15, an azimuth sensor 16, and a gyroscope 17.
- the touch screen display 2 has the display 2A and the touch screen 2B as described above.
- the display 2A displays characters, images, symbols, graphics, or the like.
- the touch screen 2B detects contact.
- the controller 10 detects a gesture for the smartphone 1.
- the controller 10 detects an operation (gesture) on the touch screen 2B (touch screen display 2) by cooperating with the touch screen 2B.
- the button 3 is operated by the user.
- the button 3 has buttons 3A to 3F.
- the controller 10 detects an operation on the button 3 by cooperating with the button 3.
- the operations on the button 3 are, for example, click, double click, triple click, push, and multi-push.
- the buttons 3A to 3C are, for example, a home button, a back button, or a menu button.
- the button 3D is, for example, a power on / off button of the smartphone 1.
- the button 3D may also serve as a sleep / sleep release button.
- the buttons 3E and 3F are volume buttons, for example.
- the illuminance sensor 4 detects illuminance. Illuminance indicates light intensity, brightness, or luminance. The illuminance sensor 4 is used for adjusting the luminance of the display 2A, for example.
- the proximity sensor 5 detects the presence of a nearby object without contact. The proximity sensor 5 detects that the touch screen display 2 is brought close to the face, for example.
- the illuminance sensor 4 and the proximity sensor 5 may be configured as one sensor.
- the communication unit 6 communicates wirelessly.
- the communication method performed by the communication unit 6 is a wireless communication standard.
- wireless communication standards include cellular phone communication standards such as 2G, 3G, and 4G.
- Cellular phone communication standards include, for example, LTE (Long Term Evolution), W-CDMA (Wideband Code Division Multiple Access), CDMA2000, PDC (Personal Digital Cellular), GSM (registered trademark) (Global System for Mobile Communications), and PHS (Personal Handy-phone System).
- Further examples of wireless communication standards include WiMAX (Worldwide Interoperability for Microwave Access), IEEE 802.11, Bluetooth (registered trademark), IrDA (Infrared Data Association), and NFC (Near Field Communication).
- the communication unit 6 may support one or more of the communication standards described above.
- the receiver 7 and the speaker 11 output the sound signal transmitted from the controller 10 as sound.
- the receiver 7 is used, for example, to output the other party's voice during a call.
- the speaker 11 is used for outputting a ring tone and music, for example.
- One of the receiver 7 and the speaker 11 may also function as the other.
- the microphone 8 converts the voice of the user or the like into a voice signal and transmits it to the controller 10.
- the storage 9 stores programs and data.
- the storage 9 is also used as a work area for temporarily storing the processing result of the controller 10.
- the storage 9 may include any non-transitory storage device such as a semiconductor storage device and a magnetic storage device.
- the storage 9 may include a plurality of types of storage devices.
- the storage 9 may include a combination of a portable storage medium such as a memory card, an optical disk, or a magneto-optical disk and a storage medium reader.
- the program stored in the storage 9 includes an application executed in the foreground or the background and a control program that supports the operation of the application.
- the application displays a screen on the display 2A, and causes the controller 10 to execute processing according to a gesture detected via the touch screen 2B.
- the control program is, for example, an OS.
- the application and the control program may be installed in the storage 9 via wireless communication by the communication unit 6 or a non-transitory storage medium.
- the storage 9 stores, for example, a control program 9A, a mail application 9B, a browser application 9C, and setting data 9Z.
- the mail application 9B provides an electronic mail function.
- the e-mail function enables, for example, creation, transmission, reception, and display of an e-mail.
- the browser application 9C provides a WEB browsing function.
- the WEB browsing function enables, for example, display of a WEB page and editing of a bookmark.
- The setting data 9Z includes information on various settings related to the operation of the smartphone 1.
- the control program 9A provides functions related to various controls for operating the smartphone 1.
- the control program 9A realizes a call by controlling the communication unit 6, the receiver 7, the microphone 8, and the like, for example.
- the function provided by the control program 9A includes a function of changing the home screen 40 to be displayed according to the gesture.
- the function provided by the control program 9A may be used in combination with a function provided by another program such as the mail application 9B.
- the controller 10 is an arithmetic circuit.
- the arithmetic circuit is, for example, a CPU (Central Processing Unit), an SoC (System-on-a-chip), an MCU (Micro Control Unit), or an FPGA (Field-Programmable Gate Array).
- the controller 10 controls various operations of the smartphone 1 to realize various functions.
- the controller 10 executes instructions included in the program stored in the storage 9 while referring to the data stored in the storage 9 as necessary. Then, the controller 10 controls the functional units such as the display 2A and the communication unit 6 in accordance with data and commands, thereby realizing various functions.
- the controller 10 may change the control according to the detection result of the detection unit.
- the functional unit includes the display 2A, the communication unit 6, the microphone 8, and the speaker 11, but is not limited thereto.
- the detection unit includes, but is not limited to, the touch screen 2B, the button 3, the illuminance sensor 4, the proximity sensor 5, the receiver 7, the camera 12, the camera 13, the acceleration sensor 15, the azimuth sensor 16, and the gyroscope 17.
- the controller 10 changes the home screen 40 displayed according to the gesture, for example, by executing the control program 9A.
- the camera 12 is an in-camera that captures an object facing the front face 1A.
- the camera 13 is an out camera that captures an object facing the back face 1B.
- the connector 14 is a terminal to which other devices are connected.
- the connector 14 may be a general-purpose terminal such as a USB (Universal Serial Bus), HDMI (registered trademark) (High-Definition Multimedia Interface), Light Peak (Thunderbolt (registered trademark)), or an earphone microphone connector.
- the connector 14 may be a dedicated terminal such as a dock connector.
- Devices connected to the connector 14 are, for example, an external storage, a speaker, and a communication device.
- the acceleration sensor 15 detects the direction and magnitude of acceleration acting on the smartphone 1.
- The azimuth sensor 16 detects the direction of geomagnetism.
- the gyroscope 17 detects the angle and angular velocity of the smartphone 1. The detection results of the acceleration sensor 15, the azimuth sensor 16, and the gyroscope 17 are used in combination in order to detect changes in the position and orientation of the smartphone 1.
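The document does not specify how the three sensor outputs are combined. One common approach to fusing a gyroscope with an accelerometer for orientation estimation is a complementary filter, sketched here as an illustration; the function shape and the coefficient value are assumptions, not the smartphone 1's actual method:

```python
def complementary_filter(angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """Fuse a gyroscope rate (deg/s) with an accelerometer-derived tilt
    angle (deg) to estimate device orientation. The gyro integration is
    smooth but drifts over time; the accelerometer reading is noisy but
    drift-free. alpha (an illustrative value) weights the two sources."""
    return alpha * (angle + gyro_rate * dt) + (1 - alpha) * accel_angle
```

Called once per sensor sample, the filter tracks fast rotations via the gyroscope term while the accelerometer term slowly corrects accumulated drift; the azimuth sensor can anchor heading in the same way.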
- a part or all of the program stored in the storage 9 may be downloaded from another device by wireless communication by the communication unit 6.
- A part or all of the programs shown in FIG. 5 may be stored in a non-transitory storage medium that can be read by a reading device included in the storage 9.
- A part or all of the programs shown in FIG. 5 may be stored in a non-transitory storage medium that can be read by a reading device connected to the connector 14.
- the non-transitory storage medium is, for example, an optical disc such as a CD (registered trademark), a DVD (registered trademark), or a Blu-ray (registered trademark), a magneto-optical disk, or a memory card.
- the configuration of the smartphone 1 shown in FIG. 5 is an example, and may be changed as appropriate without departing from the gist of the present application.
- the number and type of buttons 3 are not limited to the example of FIG.
- The smartphone 1 may include buttons in a numeric keypad layout or a QWERTY layout, instead of the buttons 3A to 3C, as buttons for operations related to the screen.
- the smartphone 1 may include only one button or may not include a button for operations related to the screen.
- the smartphone 1 includes two cameras, but the smartphone 1 may include only one camera or may not include a camera.
- the smartphone 1 includes three types of sensors in order to detect the position and orientation, but the smartphone 1 may not include some of these sensors.
- the smartphone 1 may include another type of sensor for detecting at least one of a position and a posture.
- FIG. 6 is a diagram illustrating a first example of control performed by the smartphone according to the embodiment.
- the user moves the finger in the right direction with the finger in contact with a predetermined area near the left end of the home screen 40.
- the user is tracing the surface of the touch screen display 2.
- the controller 10 of the smartphone 1 detects a swipe. Specifically, the controller 10 detects a right swipe starting from a predetermined area near the left end of the home screen 40.
- the controller 10 displays the memo object 55 on the home screen 40 along the movement trajectory.
- the memo object 55 is an object imitating the shape of a sticky note.
- the memo object 55 is displayed such that the part located on the left end side of the home screen 40 that is the start point of the swipe is glued to the home screen 40.
- the memo object 55 is displayed corresponding to the area traced by the user.
- The predetermined area near the left end of the home screen 40 is, for example, the area between the left end of the home screen 40 and a position separated from the left end by a length equal to 1/20 of the length of the home screen 40 in the short direction. The position and size of the predetermined area can be set as appropriate. For example, the predetermined area may be an area near the right end, the lower end, or the upper end of the home screen 40.
- while the swipe is being performed, the controller 10 may display the memo object 55 on the home screen 40 so that it follows the user's finger and gradually becomes longer (grows gradually).
- alternatively, the controller 10 may not display the memo object 55 while the swipe is in progress, and may display it for the first time when the swipe ends. The controller 10 may make the length of the memo object 55 equal to the length of the swipe trajectory, or shorter or longer than that length.
- FIG. 7 is a flowchart illustrating a processing procedure of a first example of control performed by the smartphone according to the embodiment.
- in step S2, the controller 10 determines whether the start point of the detected swipe is within the end region of the home screen 40.
- the end region of the home screen 40 is, for example, the area between the left end of the home screen 40 and a position separated from that end by 1/20 of the length of the home screen 40 in the short direction. If the controller 10 determines that the start point of the detected swipe is within the end region of the home screen 40 (Yes in step S2), it proceeds to step S3. If the controller 10 determines that the start point of the detected swipe is not within the end region of the home screen 40 (No in step S2), it proceeds to step S4.
- in step S3, the controller 10 displays the memo object 55 along the swipe trajectory on the home screen 40.
- in step S4, the controller 10 changes the home screen 40 to another home screen.
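The branching in steps S2 to S4 of FIG. 7 can be sketched as a small routine. This is an illustrative reconstruction, not code from the patent; the function name `classify_swipe` and the use of the horizontal start coordinate are assumptions, while the 1/20 ratio comes from the embodiment's example.

```python
def classify_swipe(start_x: float, screen_short_side: float) -> str:
    """Decide what a detected swipe on the home screen does (cf. FIG. 7).

    The end region is taken as the strip between the left edge and 1/20
    of the screen's short-side length, as in the embodiment's example.
    """
    end_region_width = screen_short_side / 20.0
    if start_x <= end_region_width:
        # Step S3: draw the memo object along the swipe trajectory.
        return "display_memo_object"
    # Step S4: an ordinary swipe switches to another home screen.
    return "change_home_screen"


# On a screen whose short side is 1080 px, the end region is 54 px wide.
print(classify_swipe(30, 1080))   # swipe starting inside the end region
print(classify_swipe(500, 1080))  # swipe starting near the center
```

A swipe starting at 30 px falls inside the 54 px end region and produces the memo object; one starting at 500 px switches the home screen instead.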
- FIG. 8 is a diagram illustrating processing for inputting characters into the memo object.
- when the controller 10 of the smartphone 1 displays the memo object 55 by the process shown in FIG. 6, it displays the character input screen shown in step S11 of FIG. 8.
- the character input screen includes a character input area 60 and a keyboard 61.
- Step S11 in FIG. 8 shows a state in which a character string “14:00 Shibuya” is input to the character input area 60 by the user.
- after displaying the memo object 55, the controller 10 may transition to the character input screen when it detects a tap on the memo object 55. Alternatively, when the swipe is completed, that is, when it is detected that the user's finger has left the touch screen 2B, the controller 10 may display the memo object 55 and automatically transition to the character input screen.
- the importance level of the memo object 55 can be set as shown in step S12 of FIG. 8. The transition to the screen for setting the importance may of course be omitted.
- when the input is complete, the controller 10 displays the memo object 55 on the home screen 40 as shown in step S13 of FIG. 8.
- in the memo object 55, the characters input in the character input area 60 in step S11 of FIG. 8 are displayed.
- the memo object 55 is displayed so as to overlap some of the plurality of icons 50 displayed on the home screen 40.
- FIGS. 9A and 9B are diagrams illustrating a second example of control performed by the smartphone according to the embodiment.
- the home screen 40 displays an icon 50 corresponding to the memo application.
- the user moves a finger rightward while keeping it in contact with the icon 50.
- the controller 10 of the smartphone 1 detects a right swipe starting from the icon 50.
- the controller 10 displays a memo object 55 that is long in the horizontal direction on the home screen 40 along the movement trajectory.
- the memo object 55 is displayed as if drawn out from the icon 50.
- FIG. 9A shows an example in which the user performs a right swipe from the icon 50 corresponding to the memo application, but the swipe direction is not limited to this.
- FIG. 9B the user is performing a swipe downward from the icon 50.
- the controller 10 detects a downward swipe starting from the icon 50, the controller 10 causes the home screen 40 to display a memo object 55 that is long in the vertical direction along the swipe trajectory.
- the controller 10 may determine, according to the swipe direction, whether characters are input into the memo object 55 vertically or horizontally. That is, when a horizontally long memo object 55 is displayed as shown in FIG. 9A, characters may be input horizontally in the character input area 60 of step S11 of FIG. 8, and when a vertically long memo object 55 is displayed as shown in FIG. 9B, characters may be input vertically in the character input area 60.
- FIG. 10 is a diagram illustrating a third example of control performed by the smartphone according to the embodiment.
- the user performs a left swipe on the home screen 40a on which the memo object 55a is displayed.
- the start point of the swipe is not in the predetermined area at the end of the home screen 40a described above, but near the center of the home screen 40a.
- following the flowchart of FIG. 7, the controller 10 determines that the start point of the detected swipe is not within the end region of the home screen 40a and, as shown in step S22 of FIG. 10, changes the display to the home screen 40b, which is the home screen to the right of the home screen 40a.
- on the home screen 40b, the controller 10 displays a memo object 55b that is different from the memo object 55a.
- that is, the controller 10 associates the memo object 55a with the home screen 40a, and when a gesture for changing (switching) the home screen is performed, it displays the memo object 55b associated with the home screen 40b now displayed on the display 2A. Alternatively, when a gesture for changing (switching) the home screen is performed while a tap on the memo object 55a is detected, the controller 10 may display the memo object 55a on the home screen 40b; that is, the controller 10 may change the home screen while keeping the memo object 55a displayed.
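The per-screen association described for FIG. 10 can be sketched as a mapping from home screens to their memo objects, with a held (tapped) memo carried along when the screen is switched. The class name, the dictionary representation, and the `held_memo` parameter are all illustrative assumptions, not structures from the patent.

```python
class HomeScreens:
    """Minimal sketch: each home screen keeps its own memo objects.

    Switching screens shows the memos associated with the newly
    displayed screen; a memo being held during the switch is moved to
    the new screen instead, as in the variation described for FIG. 10.
    """

    def __init__(self, memos_by_screen):
        self.memos = {k: list(v) for k, v in memos_by_screen.items()}
        self.current = next(iter(self.memos))

    def visible_memos(self):
        return list(self.memos[self.current])

    def switch_to(self, screen, held_memo=None):
        # Carry the held memo from the old screen to the new one.
        if held_memo is not None and held_memo in self.memos[self.current]:
            self.memos[self.current].remove(held_memo)
            self.memos[screen].append(held_memo)
        self.current = screen
        return self.visible_memos()


screens = HomeScreens({"40a": ["memo_55a"], "40b": ["memo_55b"]})
print(screens.switch_to("40b"))                        # 40b shows its own memo
print(screens.switch_to("40a"))
print(screens.switch_to("40b", held_memo="memo_55a"))  # memo 55a carried along
```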
- FIG. 11A and FIG. 11B are diagrams illustrating a fourth example of control performed by the smartphone according to the embodiment.
- the user swipes the area 42 downward.
- when the controller 10 detects a swipe on the area 42, it displays the drawer 420 so as to follow the swipe.
- the drawer 420 is an area where various information such as the presence of unread mail and the update of an application are displayed.
- when the memo object 55 has been covered by the drawer 420 pulled out from the area 42 in response to the user's swipe, and the user then performs a swipe on the drawer 420 toward the area 42, that is, a swipe in the direction that returns the drawer 420 to the area 42, the memo object 55 is no longer displayed. This ensures the visibility of the home screen 40.
- FIG. 12 is a diagram illustrating a fifth example of control performed by the smartphone according to the embodiment.
- the user flicks the memo object 55 displayed on the home screen 40 in the left direction.
- when the controller 10 detects a leftward flick on the memo object 55, it displays the memo object 55 on the home screen 40 so that only a part of the memo object 55 protrudes from the end of the home screen 40, as shown in step S32 of FIG. 12.
- for example, when "14:00 Shibuya" has been input to the memo object 55 in step S31 of FIG. 12, as in step S13 of FIG. 8, and the home screen 40 is then put into the state shown in step S32 of FIG. 12, a part of the character string may be displayed on the part of the memo object 55 located at the end. That is, when the controller 10 puts a memo object 55 to which a predetermined character string has been input into a state where only a part of it is displayed, it may display a part of the character string on the visible part. Thereby, while the visibility of the home screen 40 is ensured, the user can still recognize the presence of the memo object, which improves usability.
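One possible way to keep part of the memo's text readable on the visible stub is sketched below. The average glyph width and the leading-characters truncation rule are illustrative assumptions; the embodiment only requires that some portion of the input string remain visible at the screen edge.

```python
def visible_text(text: str, visible_px: int, char_px: int = 20) -> str:
    """Return the leading part of the memo's text that fits on the stub
    left visible at the screen edge (cf. step S32 of FIG. 12).

    `char_px` is an assumed average glyph width used only for this
    sketch; a real renderer would measure actual glyph widths.
    """
    n = max(0, visible_px // char_px)
    return text[:n]


print(visible_text("14:00 Shibuya", 100))  # five 20-px characters fit
```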
- FIG. 13 is a diagram illustrating a sixth example of control performed by the smartphone according to the embodiment.
- the lock screen 45 is displayed on the display 2A.
- the lock screen 45 is a screen provided to prevent unauthorized use of the smartphone 1.
- the user is swiping the lock screen 45.
- when the controller 10 detects the swipe, it displays the memo object 55 on the lock screen 45 along the swipe trajectory.
- when the home screen 40 is displayed, the controller 10 distinguishes a swipe for switching the home screen 40 from a swipe for displaying the memo object 55.
- since the lock screen 45 has no other screen to switch to by swiping, when the lock screen 45 is displayed as shown in FIG. 13, the memo object 55 may be displayed even if the swipe starts near the center of the lock screen. That is, the swipe does not have to start from the end region of the screen.
- when the controller 10 detects a tap on the memo object 55 displayed on the lock screen 45 and then detects that an operation for releasing the lock has been performed, it may display the memo object 55 that was displayed on the lock screen 45 on the home screen 40.
- in the smartphone 1 according to the present embodiment, when a swipe is performed on the home screen 40 or the lock screen 45, a memo object 55 into which characters can be input is displayed along the swipe trajectory.
- a memo object that can present information without starting a specific application can thus be placed on the home screen by an intuitive operation, which improves usability.
- compared with a technique in which a memo application is started by tapping an icon arranged on the home screen and a memo object is then arranged on the home screen by a predetermined operation, the control of the smartphone 1 according to the present embodiment shown in FIG. 6 requires fewer steps to arrange the memo object on the home screen, so user convenience is improved.
- a memo object can also be generated starting from the memo application icon arranged on the home screen 40, so the user can easily arrange a memo object on the screen by an intuitive operation.
- the smartphone 1 is not limited to the above-described control example, and various embodiments can be considered.
- the controller 10 of the smartphone 1 may automatically hide a memo object 55 to which a time has been input once that time has passed.
- alternatively, the controller 10 may display such a memo object 55 in a manner in which it appears to be peeling off the screen, in a manner in which its color is dulled, in a manner in which it appears scratched, or in a manner in which it appears folded.
- by changing the display mode of a memo object 55 whose scheduled time has passed in this way, the smartphone 1 can notify the user that the scheduled time has passed.
- the user who has received the notification can then, for example, delete the memo object 55 whose time has passed.
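The time-based change of display mode can be sketched as a simple comparison against the current time. Returning a style name is an illustrative choice; the patent describes several alternative degraded appearances (peeling off, dulled, scratched, folded) as well as hiding the memo entirely.

```python
from datetime import datetime


def display_mode(memo_time: datetime, now: datetime) -> str:
    """Pick a display mode for a memo whose text contains a time:
    once the input time has passed, render the memo in a degraded
    style to notify the user (cf. the variations on FIG. 6)."""
    if now >= memo_time:
        # Could equally be "dulled", "scratched", "folded", or "hidden".
        return "peeling_off"
    return "normal"


appointment = datetime(2014, 4, 24, 14, 0)  # "14:00" from the memo text
print(display_mode(appointment, datetime(2014, 4, 24, 15, 0)))  # time passed
print(display_mode(appointment, datetime(2014, 4, 24, 13, 0)))  # still pending
```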
- the controller 10 may compare the information input to a plurality of memo objects 55 and, if the same or similar information has been input, display those memo objects 55 so that they at least partially overlap. This ensures the visibility of the home screen.
- the controller 10 may display the memo object 55 when detecting taps at two different points on the home screen 40 or the lock screen 45 at the same time.
- the length of the memo object 55 may be the same as the distance between the two points where the tap is detected.
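Making the memo's length equal the distance between the two simultaneous tap points is a straightforward Euclidean-distance computation; the sketch below is illustrative and assumes pixel coordinates.

```python
import math


def memo_length_from_taps(p1, p2):
    """Length of a memo object created by tapping two points at once:
    the embodiment allows making it equal to the distance between the
    two detected tap points."""
    (x1, y1), (x2, y2) = p1, p2
    return math.hypot(x2 - x1, y2 - y1)


print(memo_length_from_taps((0, 0), (300, 400)))  # 500.0
```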
- the controller 10 may delete the memo object 55 when the memo object 55 is swiped in the direction opposite to the swipe that generated it, as shown in FIG. 6.
Abstract
Description
(Embodiment)
The overall configuration of the smartphone 1 according to the present embodiment will be described with reference to FIGS. 1 to 3. As shown in FIGS. 1 to 3, the smartphone 1 has a housing 20. The housing 20 has a front face 1A, a back face 1B, and side faces 1C1 to 1C4. The front face 1A is the front surface of the housing 20. The back face 1B is the rear surface of the housing 20. The side faces 1C1 to 1C4 are side surfaces connecting the front face 1A and the back face 1B. Hereinafter, the side faces 1C1 to 1C4 may be collectively referred to as the side face 1C without specifying which face is meant.
2 Touch screen display
2A Display
2B Touch screen
3 Button
4 Illuminance sensor
5 Proximity sensor
6 Communication unit
7 Receiver
8 Microphone
9 Storage
9A Control program
9B Mail application
9C Browser application
9Z Setting data
10 Controller
11 Speaker
12, 13 Camera
14 Connector
15 Acceleration sensor
16 Azimuth sensor
17 Gyroscope
20 Housing
40 Home screen
45 Lock screen
50 Icon
55 Memo object
Claims (8)
1. An apparatus comprising: a touch screen display; and a controller that, when the touch screen display detects that an object has moved in a predetermined direction while remaining in contact with the touch screen display, causes an object into which characters can be input to be displayed, along the trajectory of the movement, on a screen displayed on the touch screen display.
2. The apparatus according to claim 1, wherein the controller, while displaying a home screen on the touch screen display, displays the object on the home screen when the movement starting from within an end region of the home screen is detected, and changes the displayed home screen to another home screen when the movement starting from outside the end region of the home screen is detected.
3. The apparatus according to claim 1, wherein the controller displays on the touch screen display a home screen on which a predetermined icon is displayed, and, when it is detected that the object has moved in a predetermined direction while remaining in contact after contact of the object with the predetermined icon is detected, displays the object on the home screen along the trajectory of the movement with the icon as a start point.
4. The apparatus according to any one of claims 1 to 3, wherein the controller determines, according to the direction of the movement, whether characters can be input into the object vertically or horizontally.
5. An apparatus comprising: a touch screen display; and a controller that, when a swipe is performed on the touch screen display, causes an object into which characters can be input to be displayed, along the trajectory of the swipe, on a screen displayed on the touch screen display.
6. An apparatus comprising: a touch screen display; and a controller that, when the touch screen display is traced, causes an object into which characters can be input to be displayed, in the traced area, on a screen displayed on the touch screen display.
7. A method for controlling an apparatus including a touch screen display, the method comprising: detecting, by the touch screen display, that an object has moved in a predetermined direction while remaining in contact with the touch screen display; and displaying, along the detected trajectory of the movement, an object into which characters can be input on a screen displayed on the touch screen display.
8. A program for causing an apparatus including a touch screen display to execute: detecting, by the touch screen display, that an object has moved in a predetermined direction while remaining in contact with the touch screen display; and displaying, along the detected trajectory of the movement, an object into which characters can be input on a screen displayed on the touch screen display.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/787,252 US9875017B2 (en) | 2013-04-26 | 2014-04-24 | Device, method, and program |
JP2015513835A JP6058790B2 (ja) | 2013-04-26 | 2014-04-24 | 装置、方法、及びプログラム |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2013-093778 | 2013-04-26 | ||
JP2013093778 | 2013-04-26 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2014175395A1 true WO2014175395A1 (ja) | 2014-10-30 |
Family
ID=51791961
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2014/061606 WO2014175395A1 (ja) | 2013-04-26 | 2014-04-24 | 装置、方法、及びプログラム |
Country Status (3)
Country | Link |
---|---|
US (1) | US9875017B2 (ja) |
JP (1) | JP6058790B2 (ja) |
WO (1) | WO2014175395A1 (ja) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2018511867A (ja) * | 2015-03-23 | 2018-04-26 | ネイバー コーポレーションNAVER Corporation | モバイル機器のアプリケーション実行装置およびその方法 |
KR20180098681A (ko) * | 2016-02-08 | 2018-09-04 | 미쓰비시덴키 가부시키가이샤 | 입력 표시 제어 장치, 입력 표시 제어 방법 및 입력 표시 시스템 |
WO2020196561A1 (ja) * | 2019-03-26 | 2020-10-01 | 株式会社東海理化電機製作所 | 操作装置 |
KR20210073591A (ko) * | 2018-12-07 | 2021-06-18 | 미쓰비시덴키 가부시키가이샤 | 입력 표시 제어 장치, 입력 표시 제어 방법 및 입력 표시 시스템 |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR102148809B1 (ko) * | 2013-04-22 | 2020-08-27 | 삼성전자주식회사 | 단축 아이콘 윈도우 표시 장치, 방법 및 컴퓨터 판독 가능한 기록 매체 |
JP6676913B2 (ja) * | 2015-09-30 | 2020-04-08 | ブラザー工業株式会社 | 情報処理装置、および制御プログラム |
US20170195736A1 (en) | 2015-12-31 | 2017-07-06 | Opentv, Inc. | Systems and methods for enabling transitions between items of content |
RU2658803C2 (ru) * | 2016-10-28 | 2018-06-22 | Общество с ограниченной ответственностью "ПИРФ" (ООО "ПИРФ") | Способ и система вызова интерфейса пользователя на экране электронного устройства |
JP6951940B2 (ja) | 2017-10-26 | 2021-10-20 | 東芝テック株式会社 | コンテンツ表示制御装置及びその情報処理プログラム |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20130024805A1 (en) * | 2011-07-19 | 2013-01-24 | Seunghee In | Mobile terminal and control method of mobile terminal |
JP2013041512A (ja) * | 2011-08-18 | 2013-02-28 | Kyocera Corp | 携帯電子機器、制御方法、および、制御プログラム |
WO2013039023A1 (ja) * | 2011-09-15 | 2013-03-21 | Necカシオモバイルコミュニケーションズ株式会社 | 電子付箋の書込情報処理装置及び方法 |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
TW200805131A (en) * | 2006-05-24 | 2008-01-16 | Lg Electronics Inc | Touch screen device and method of selecting files thereon |
US7864163B2 (en) * | 2006-09-06 | 2011-01-04 | Apple Inc. | Portable electronic device, method, and graphical user interface for displaying structured electronic documents |
US8519963B2 (en) | 2007-01-07 | 2013-08-27 | Apple Inc. | Portable multifunction device, method, and graphical user interface for interpreting a finger gesture on a touch screen display |
JP5549588B2 (ja) * | 2008-07-25 | 2014-07-16 | 日本電気株式会社 | 電子付箋システム |
US9285980B2 (en) * | 2012-03-19 | 2016-03-15 | Htc Corporation | Method, apparatus and computer program product for operating items with multiple fingers |
-
2014
- 2014-04-24 WO PCT/JP2014/061606 patent/WO2014175395A1/ja active Application Filing
- 2014-04-24 JP JP2015513835A patent/JP6058790B2/ja not_active Expired - Fee Related
- 2014-04-24 US US14/787,252 patent/US9875017B2/en not_active Expired - Fee Related
Non-Patent Citations (1)
Title |
---|
SANZUI: "Sticky! (Fusen Memo), d-Market Appli & Review", 21 April 2013 (2013-04-21), Retrieved from the Internet <URL:http://app.dcm-gate.com/appreview/001f7sc> [retrieved on 20140703] * |
Cited By (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10425521B2 (en) | 2015-03-23 | 2019-09-24 | Naver Corporation | Apparatus and method for executing application for mobile device |
JP2018511867A (ja) * | 2015-03-23 | 2018-04-26 | ネイバー コーポレーションNAVER Corporation | モバイル機器のアプリケーション実行装置およびその方法 |
KR102174565B1 (ko) | 2016-02-08 | 2020-11-05 | 미쓰비시덴키 가부시키가이샤 | 표시 제어 장치, 입력 표시 시스템, 표시 제어 방법 및 프로그램 |
KR101981439B1 (ko) * | 2016-02-08 | 2019-05-22 | 미쓰비시덴키 가부시키가이샤 | 입력 표시 제어 장치, 입력 표시 제어 방법 및 입력 표시 시스템 |
KR20190055273A (ko) * | 2016-02-08 | 2019-05-22 | 미쓰비시덴키 가부시키가이샤 | 표시 제어 장치, 입력 표시 시스템, 표시 제어 방법 및 프로그램 |
KR20180098681A (ko) * | 2016-02-08 | 2018-09-04 | 미쓰비시덴키 가부시키가이샤 | 입력 표시 제어 장치, 입력 표시 제어 방법 및 입력 표시 시스템 |
US10884612B2 (en) | 2016-02-08 | 2021-01-05 | Mitsubishi Electric Corporation | Input display control device, input display control method, and input display system |
KR20210073591A (ko) * | 2018-12-07 | 2021-06-18 | 미쓰비시덴키 가부시키가이샤 | 입력 표시 제어 장치, 입력 표시 제어 방법 및 입력 표시 시스템 |
KR20220031770A (ko) * | 2018-12-07 | 2022-03-11 | 미쓰비시덴키 가부시키가이샤 | 입력 표시 제어 장치, 입력 표시 제어 방법 및 입력 표시 시스템 |
KR102413747B1 (ko) * | 2018-12-07 | 2022-06-27 | 미쓰비시덴키 가부시키가이샤 | 입력 표시 제어 장치, 입력 표시 제어 방법 및 입력 표시 시스템 |
US11393230B2 (en) | 2018-12-07 | 2022-07-19 | Mitsubishi Electric Corporation | Input display control device, input display control method, and input display system |
KR102603900B1 (ko) * | 2018-12-07 | 2023-11-17 | 미쓰비시덴키 가부시키가이샤 | 입력 표시 제어 장치, 입력 표시 제어 방법 및 입력 표시 시스템 |
WO2020196561A1 (ja) * | 2019-03-26 | 2020-10-01 | 株式会社東海理化電機製作所 | 操作装置 |
Also Published As
Publication number | Publication date |
---|---|
US9875017B2 (en) | 2018-01-23 |
US20160077702A1 (en) | 2016-03-17 |
JP6058790B2 (ja) | 2017-01-11 |
JPWO2014175395A1 (ja) | 2017-02-23 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5775445B2 (ja) | 装置、方法、及びプログラム | |
JP5715042B2 (ja) | 装置、方法、及びプログラム | |
JP6017891B2 (ja) | 装置、方法、及びプログラム | |
JP6002012B2 (ja) | 装置、方法、及びプログラム | |
JP5891083B2 (ja) | 装置、方法、及びプログラム | |
JP5809963B2 (ja) | 装置、方法、及びプログラム | |
JP6110654B2 (ja) | 装置、方法、及びプログラム | |
JP6058790B2 (ja) | 装置、方法、及びプログラム | |
JP2013131185A (ja) | 装置、方法、及びプログラム | |
JP2013200681A (ja) | 装置、方法、及びプログラム | |
JP2013134694A (ja) | 装置、方法、及びプログラム | |
WO2014054801A1 (ja) | 電子機器、制御方法及び制御プログラム | |
JP2013200680A (ja) | 装置、方法、及びプログラム | |
JP5859932B2 (ja) | 装置、方法、及びプログラム | |
JP5775432B2 (ja) | 装置、方法、及びプログラム | |
JP5762885B2 (ja) | 装置、方法、及びプログラム | |
JP2013065290A (ja) | 装置、方法、及びプログラム | |
JP2013092891A (ja) | 装置、方法、及びプログラム | |
JP5959372B2 (ja) | 装置、方法、及びプログラム | |
JP6080355B2 (ja) | 装置、方法及びプログラム | |
JP5848971B2 (ja) | 装置、方法、及びプログラム | |
JP2013131186A (ja) | 装置、方法、及びプログラム | |
JP2013182596A (ja) | 装置、方法、及びプログラム | |
JP2013092974A (ja) | 装置、方法及びプログラム | |
JP5740366B2 (ja) | 装置、方法、及びプログラム |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 14787815 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2015513835 Country of ref document: JP Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 14787252 Country of ref document: US |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 14787815 Country of ref document: EP Kind code of ref document: A1 |