US20140078110A1 - Sensing apparatus for user terminal using camera, sensing method for the same and controlling method for the same - Google Patents


Info

Publication number
US20140078110A1
Authority
US
United States
Prior art keywords
camera
user terminal
image
variable
touch
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US13/932,072
Inventor
Sung Jae Hwang
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Korea Advanced Institute of Science and Technology KAIST
Original Assignee
Korea Advanced Institute of Science and Technology KAIST
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Korea Advanced Institute of Science and Technology KAIST filed Critical Korea Advanced Institute of Science and Technology KAIST
Assigned to KOREA ADVANCED INSTITUTE OF SCIENCE AND TECHNOLOGY reassignment KOREA ADVANCED INSTITUTE OF SCIENCE AND TECHNOLOGY ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: HWANG, SUNG JAE
Publication of US20140078110A1 publication Critical patent/US20140078110A1/en

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 - Image analysis
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 - Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/0304 - Detection arrangements using opto-electronic means
    • G06F 3/041 - Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/042 - Digitisers characterised by opto-electronic transducing means
    • G06F 3/0425 - Digitisers using a single imaging device, e.g. a video camera, for tracking the absolute position of one or more objects with respect to an imaged reference surface (e.g. a display, table, or wall on which a computer-generated image is displayed or projected)
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 - Cameras or camera modules comprising electronic image sensors; control thereof
    • H04N 23/60 - Control of cameras or camera modules
    • H04N 23/62 - Control of parameters via user interfaces


Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Studio Devices (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Provided is a sensing apparatus for a user terminal including a camera, the apparatus including: an image analyzing unit analyzing an image variable of an object taken by the camera; and a touch determining unit determining whether the object touches the user terminal and an intensity of the touch according to the image variable analyzed by the image analyzing unit. According to the present disclosure, without adding or changing hardware devices, the touch pressure, rotational angle, and the like of the object with respect to the user terminal may be sensed by using the camera already provided in the user terminal.

Description

    CROSS-REFERENCE TO RELATED APPLICATION
  • This application claims priority to Korean Patent Application No. 10-2012-0103450, filed on Sep. 18, 2012, in the Korean Intellectual Property Office (KIPO), the disclosure of which is incorporated herein by reference in its entirety.
  • BACKGROUND OF THE INVENTION
  • 1. Field of the Invention
  • The present disclosure relates to a sensing apparatus for a user terminal using a camera, a sensing method for the same, and a controlling method for the same. In particular, the disclosure uses the camera not merely as an imaging unit for outside objects but also for various operations and controls of the user terminal: by pressing the camera, which is generally provided on the rear surface opposite the touchscreen, at an arbitrary strength, a user can operate and control the terminal without the touchscreen occlusion problem.
  • 2. Description of the Related Art
  • A touchscreen is a screen configured to, when a person's hand or an object comes into contact with a letter or a specific position shown on the screen without using a keyboard, detect the position and directly receive the input so that stored software can perform a specific process. The touchscreen is widely applied as the input screen of user terminals such as cellphones and computers.
  • Touchscreens are generally classified into a resistive type and an electrostatic (capacitive) type. The resistive type detects a pressure change on the screen and has the advantage of imposing no limitation on the touch device. The capacitive type detects a touch using the minute current that flows through the human body, and has the advantages of multi-touch support and a fast response.
  • However, because such touchscreens present information and receive input in the same place, the screen is occluded during operation. In addition, the capacitive touchscreens commercialized to date cannot effectively distinguish between touch pressures (strengths), so only on-off touch control is possible. Furthermore, in the related art that measures input touch intensity by the contact between a touch pad and an electrode, there is a problem of image quality degradation caused by the space separating the touch pad from the electrode layer.
  • On the other hand, with the development of electronic technologies, user terminals such as cellphones are provided with imaging units such as cameras. The camera is the resource through which a mobile device such as a cellphone obtains visual information, and is an electronic component essentially provided in the user terminal and used for video calls, augmented reality, and the like. However, the application range of the camera has been limited to picture and video capture, QR code recognition, and the like. Because a user terminal such as a cellphone must perform ever more functions within a gradually reduced size, broader application of components such as the already-included camera is required. Furthermore, since it is difficult to provide more physical buttons on a touchscreen-based user terminal such as a cellphone, there is an urgent need to use the existing components of the cellphone in various ways.
  • SUMMARY OF THE INVENTION
  • The present disclosure is directed to providing a new application method of a component such as a camera.
  • In one aspect, a sensing apparatus for a user terminal including a camera, includes: an image analyzing unit analyzing an image variable of an object taken by the camera; and a touch determining unit determining whether the object touches the user terminal and an intensity of the touch according to the image variable analyzed by the image analyzing unit.
  • In an embodiment, the object may be an input unit of the user terminal, and the input unit of the user terminal may come into contact with the camera.
  • In an embodiment, the image variable may be a variable which varies with the intensity of the touch between the object and the camera.
  • In an embodiment, the image variable may be a variable which varies with a transmittance of light being transmitted through the object and received by the camera.
  • In an embodiment, the variable may be one of luminance, a luminance distribution, a color tone, an amount of a specific color, a distribution of a specific color, illuminance, and light intensity of an image of the object taken by the camera.
  • In an embodiment, in a case where the image variable analyzed by the image analyzing unit shows a pattern in which the light transmitted through the object and received by the camera increases, the touch determining unit may determine that the intensity of the touch of the object on the user terminal is high.
  • In an embodiment, the variable may be the luminance of the image, and the touch determining unit may determine that the intensity of the touch of the object on the user terminal is higher as the luminance of the image is higher.
  • In an embodiment, the sensing apparatus for a user terminal including a camera may further include: a rotation determining unit determining a degree of rotation of the object taken by the camera according to the image variable analyzed by the image analyzing unit.
  • In an embodiment, the rotation determining unit (130) may determine a rotational angle of the object according to a direction pattern of the image variable analyzed by the image analyzing unit.
  • In an embodiment, the image analyzing unit may analyze the image variables for the entire image or a part of pixels thereof.
  • In another aspect, a user terminal includes: the sensing apparatus described above.
  • In still another aspect, a sensing method for a user terminal including a camera, includes: analyzing an image variable of an object taken by the camera; and determining an intensity of a touch of the object on the user terminal according to the analyzed image variable.
  • In an embodiment, the object may come into contact with the camera, and the image variable may be a variable which varies with a transmittance of light transmitted through the object.
  • In an embodiment, the image variable may be one of luminance, a luminance distribution, a color tone, an amount of a specific color, a distribution of a specific color, illuminance, and light intensity of the taken image of the object.
  • In an embodiment, the sensing method for a user terminal including a camera may further include: determining a rotational angle of the object according to a direction pattern of the analyzed image variable.
  • In still another aspect, there is provided a computer readable recording medium having recorded thereon a program instruction for implementing the sensing method for a user terminal including a camera described above.
  • In still another aspect, a controlling method for a user terminal including a camera, includes: analyzing an image variable of an object taken by the camera; determining an intensity of a touch of the object on the user terminal according to the analyzed image variable; and executing a control command for the user terminal corresponding to the analyzed intensity of the touch.
  • In an embodiment, the object may come into contact with the camera, the image variable may be a variable which varies with a transmittance of light transmitted through the object, and the image variable may be one of luminance, a luminance distribution, a color tone, an amount of a specific color, a distribution of a specific color, illuminance, and light intensity of the taken image of the object.
  • In still another aspect, there is provided a computer readable recording medium having recorded thereon a program instruction for implementing the controlling method for a user terminal including a camera described above.
  • According to the present disclosure, without adding additional hardware devices or changing hardware devices, the touch pressure, rotational angle, and the like of the object regarding the user terminal may be sensed by using the camera already provided in the user terminal. Accordingly, the function of the camera is not only simply used as an imaging unit of an outside object but also extended to various operations and controls of the user terminal. Furthermore, there is an advantage that by touching the camera generally provided in the rear surface of a touchscreen at an arbitrary strength by a user, the terminal can be operated and controlled without a touchscreen occlusion problem.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The above and other features and advantages will become more apparent to those of ordinary skill in the art by describing in detail exemplary embodiments with reference to the attached drawings, in which:
  • FIGS. 1 to 4 are diagrams illustrating a method of sensing a touch intensity according to an embodiment of the present disclosure;
  • FIG. 5 is an image taken by a camera as the intensity of touch on the camera is gradually increased in the same environment;
  • FIG. 6 is a block diagram of a sensing apparatus for a user terminal including a camera according to an embodiment of the present disclosure;
  • FIG. 7 is a block diagram of the sensing apparatus for a user terminal including a camera according to an embodiment of the present disclosure;
  • FIGS. 8 and 9 are photographs illustrating luminance changing patterns which vary depending on the direction of a finger according to an embodiment of the present disclosure;
  • FIG. 10 is a flowchart of a sensing method for a user terminal according to an embodiment of the present disclosure; and
  • FIG. 11 is a flowchart of a controlling method for a user terminal according to an embodiment of the present disclosure.
  • In the following description, the same or similar elements are labeled with the same or similar reference numbers.
  • DETAILED DESCRIPTION
  • The present disclosure now will be described more fully hereinafter with reference to the accompanying drawings, in which embodiments of the invention are shown. This invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the invention to those skilled in the art. In addition, where the elements of the figures are denoted by reference numerals, like elements are denoted by like reference numerals even when illustrated in different figures. In the description of the present disclosure, details of well-known configurations or functions may be omitted to avoid unnecessarily obscuring the gist of the present disclosure.
  • The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used herein, the singular forms “a”, “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms “includes”, “comprises” and/or “comprising,” when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. In addition, in the description of the elements of the present disclosure, the terms first, second, A, B, (a), and (b) can be used. These terms are used to distinguish one element from another, and are not intended to limit the nature, the order, or the sequence of the corresponding elements. Where an element is described as “connected”, “joined”, or “linked” to another component, it will be understood that the element may be directly connected or linked to that component, or that a further component may be “connected”, “joined”, or “linked” between them.
  • Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the relevant art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
  • Preferred embodiments will now be described more fully hereinafter with reference to the accompanying drawings. However, they may be embodied in different forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art.
  • In order to solve the problems according to the related art, the present disclosure has focused on the fact that, in a case where a camera comes into contact with an object, a degree of transmission of light is changed by the touch intensity of the object, and accordingly image variables (for example, luminance, illuminance, and the like) of an image taken by the camera are changed. That is, in a case where the camera is touched by a user input unit such as a finger, the transmission amount of light that reaches the camera is determined by the touch intensity of the user input unit, and in the present disclosure, image variables that are changed by the transmission amount of light are determined to recognize the touch intensity of the user input unit.
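  • As a concrete illustration of this principle (not part of the original disclosure), the short Python sketch below grades a press on the lens from the mean luminance of a captured frame. It assumes an OpenCV-accessible camera, and the function names and threshold values are illustrative choices rather than values from the patent.

```python
# Minimal sketch: grade touch intensity from the mean luminance of the
# occluded camera frame. Thresholds are illustrative assumptions.
import cv2
import numpy as np

def mean_luminance(frame_bgr: np.ndarray) -> float:
    """Average luminance (0-255) of one BGR camera frame."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    return float(gray.mean())

def touch_intensity(frame_bgr: np.ndarray,
                    weak_lum: float = 60.0,
                    strong_lum: float = 120.0) -> str:
    """A harder press flattens the fingertip, transmits more light through
    it, and raises the luminance the camera captures."""
    lum = mean_luminance(frame_bgr)
    if lum < weak_lum:
        return "light-or-no-press"
    return "strong" if lum >= strong_lum else "weak"

if __name__ == "__main__":
    cap = cv2.VideoCapture(0)  # rear camera on a phone, webcam on a PC
    ok, frame = cap.read()
    if ok:
        print(touch_intensity(frame))
    cap.release()
```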
  • FIGS. 1 to 4 are diagrams illustrating a method of sensing a touch intensity according to an embodiment of the present disclosure.
  • FIG. 1 shows a user touching the camera 11 provided in a user terminal 10 with a finger (image on the left) and the image of the finger taken by the camera at that moment (image on the right); the captured image shows a color according to the transmittance of light recognized when the camera is touched indoors.
  • FIGS. 2 to 4 are images of the finger taken by the camera as the pushing strength of the finger is gradually increased in the same environment as that of FIG. 1.
  • Referring to FIGS. 1 to 4, it can be seen that the luminance or a specific color (red) of the image increases as the pushing strength of the finger is gradually increased. Therefore, the image variables in the present disclosure are variables that vary with the touch pressure strength between the object and the camera, and they vary with the light transmitted between the camera and the object. Particularly, for a user input unit such as a finger, whose effective thickness changes with pressing strength, the amount of light transmitted through the object increases as the pushing strength increases. Accordingly, as illustrated in FIGS. 2 to 4, when the finger strongly pushes the camera, the luminance of the image taken by the camera greatly increases.
  • Although the touch intensity is graded by the luminance in FIGS. 2 to 4, the touch intensity may also be graded by a degree or distribution of red color. Therefore, any arbitrary image variables that vary with the touch intensity between an object and a camera which come in contact with each other may be used in the present disclosure, and the image variables may be used individually or in a combination thereof.
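  • As a hedged illustration of grading by a specific color rather than (or together with) luminance, the sketch below measures how dominant the red channel is, since light transmitted through a fingertip is predominantly red. The blending weight and helper names are assumptions for illustration only.

```python
# Sketch: red-channel dominance as an alternative image variable, optionally
# blended with a crude luminance proxy into a single 0-1 touch score.
import numpy as np

def red_fraction(frame_bgr: np.ndarray) -> float:
    """Fraction of total pixel energy carried by the red channel (BGR order)."""
    chan = frame_bgr.astype(np.float64)
    total = chan.sum()
    return float(chan[..., 2].sum() / total) if total > 0 else 0.0

def combined_score(frame_bgr: np.ndarray, w_lum: float = 0.5) -> float:
    """Blend normalized mean brightness with red dominance."""
    lum = frame_bgr.mean() / 255.0  # crude luminance proxy over all channels
    return w_lum * lum + (1.0 - w_lum) * red_fraction(frame_bgr)
```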
  • FIG. 5 is an image taken by the camera as the intensity of a touch against the camera is gradually increased in the same environment.
  • Referring to FIG. 5, it can be seen that, as the touch intensity increases, the luminance gradually increases, or the red image elements increase, in the image taken by the camera. This change results from the light transmittance, which varies with the increase in touch pressure.
  • FIG. 6 is a block diagram of a sensing apparatus for a user terminal including a camera according to an embodiment of the present disclosure.
  • Referring to FIG. 6, the sensing apparatus 100 for the user terminal includes an image analyzing unit 110 which analyzes the image variables of an object taken by the camera, and a touch determining unit 120 which determines whether or not the object touches the user terminal and determines the touch intensity according to the image variables analyzed by the image analyzing unit 110.
  • As described with reference to FIGS. 1 to 5, the object is the input unit of the user terminal, and the input unit comes into contact with the camera. In the present disclosure, the touch intensity is computed inversely from the image variables, which change with the transmittance of the light transmitted through the object.
  • Therefore, in a case where the image variable analyzed by the image analyzing unit 110 shows a pattern formed as the light transmitted through the object increases (for example, where the luminance increases, or the distribution or amount of a specific color increases), the touch determining unit 120 of the sensing apparatus according to the present disclosure determines that the touch intensity of the object against the user terminal is high.
  • According to another embodiment of the present disclosure, the direction of the object that comes into contact with the camera provided in the user terminal is determined according to the pattern of the analyzed image.
  • FIG. 7 is a block diagram of the sensing apparatus for a user terminal including a camera according to an embodiment of the present disclosure.
  • Referring to FIG. 7, the sensing apparatus according to the embodiment of the present disclosure further includes a rotation determining unit 130 which determines a degree of rotation of the object taken by the camera according to the image variables analyzed by the image analyzing unit 110. The rotation determining unit 130 determines the degree of rotation of the object according to a direction pattern of the image variable analyzed by the image analyzing unit 110. For example, the direction pattern of the image variable may be a changing pattern of luminance, a direction pattern of luminance in a specific range, a color changing pattern, a direction pattern of a specific color element, or the like.
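  • One plausible (hypothetical) realization of such a direction pattern, sketched below, locates the luminance-weighted centroid of the frame: when a fingertip covers the lens at an angle, the transmitted light is brightest toward one side, and the centroid's offset from the image center encodes that direction. This simple centroid method is one possible reading, not the patent's exact algorithm.

```python
# Sketch: estimate the rotation direction of the object from the direction
# in which luminance increases across the frame (image coordinates, y down).
import cv2
import numpy as np

def luminance_direction_deg(frame_bgr: np.ndarray) -> float:
    """Angle (degrees) from the frame center toward the brightness centroid."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY).astype(np.float64)
    h, w = gray.shape
    ys, xs = np.mgrid[0:h, 0:w]
    total = gray.sum()
    if total == 0:
        return 0.0  # fully dark frame: no direction information
    cx = (xs * gray).sum() / total - (w - 1) / 2.0
    cy = (ys * gray).sum() / total - (h - 1) / 2.0
    return float(np.degrees(np.arctan2(cy, cx)))
```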
  • FIGS. 8 and 9 are photographs illustrating luminance changing patterns which vary depending on the direction of a finger according to an embodiment of the present disclosure.
  • It can be seen that in FIG. 8 the luminance increases from right to left, and in FIG. 9 it increases from left to right. The rotational angle of the user input unit (finger) in contact with the camera may be determined from such a pattern of luminance change (or of where the bright color appears).
  • In the present disclosure, the touch intensity or the rotational angle is determined from the image variables of the object taken by the camera, and the image variables may include luminance and the distribution, amount, or pattern of a specific color. The image variables are not limited to these: any variable that changes with the transmittance of outside light or with the rotational angle of a user input unit such as a finger (for example, the color tone of the recognized image, the distribution of a black area, the luminance distribution, illuminance, or the light ratio) may be employed as an image variable according to the present disclosure.
  • Furthermore, when there is no external light, the image variables may be extracted using the terminal's own flash or the light from the touchscreen. In addition, in an embodiment of the present disclosure, the image analyzing unit 110 may analyze the image variables not only for the entire image but also for only a part of its pixels; in the latter case, less time and fewer resources are spent than when analyzing all pixels, as in the sketch that follows.
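  • A minimal sketch of that partial-pixel analysis, under the assumption that a coarse grid is representative of the whole frame (the stride value is an illustrative choice):

```python
# Sketch: approximate the mean luminance from every stride-th pixel in both
# axes, doing roughly 1/stride**2 of the full-frame work.
import numpy as np

def sampled_luminance(gray: np.ndarray, stride: int = 8) -> float:
    """Mean of a sparse pixel grid of a grayscale frame."""
    return float(gray[::stride, ::stride].mean())
```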
  • FIG. 10 is a flowchart of a sensing method for a user terminal according to an embodiment of the present disclosure.
  • Referring to FIG. 10, in the sensing method for a user terminal according to an embodiment of the present disclosure, the image variables of the object taken by the camera of the user terminal are analyzed (S110). As described above, the image variables change with the degree of light transmitted through the user input unit; the luminance of the entire image or a part of its pixels, the distribution or degree of a specific color element, the color tone, the distribution of black or white areas, the luminance distribution, illuminance, the light intensity pattern, and the like may be employed as the image variables.
  • Thereafter, whether or not the object touches the user terminal, and the touch intensity, are determined according to the analyzed image variables (S120). That is, where the image variables show a pattern corresponding to a strong touch intensity, a strong touch is determined, and where they show a pattern corresponding to a weak touch intensity, a weak touch is determined. The user terminal according to an embodiment of the present disclosure may further include a memory unit (not shown), as an element of the sensing apparatus, that stores the interrelation between touch intensity and image variables; a sketch of such a stored mapping follows this paragraph.
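  • A minimal sketch of such a stored interrelation, assuming luminance as the image variable; the grade labels and boundary values are illustrative, not taken from the disclosure:

```python
# Sketch: a per-device calibration table mapping luminance ranges to
# touch-intensity grades, as a stand-in for the memory unit described above.
from bisect import bisect_right

class TouchCalibration:
    """Maps a measured luminance to a stored touch-intensity grade."""

    def __init__(self,
                 boundaries=(60.0, 100.0, 140.0),
                 grades=("none", "light", "medium", "strong")):
        assert len(grades) == len(boundaries) + 1
        self.boundaries = list(boundaries)  # ascending luminance cut points
        self.grades = list(grades)

    def grade(self, luminance: float) -> str:
        return self.grades[bisect_right(self.boundaries, luminance)]

# Usage: TouchCalibration().grade(125.0) returns "medium".
```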
  • In addition, the sensing method for a user terminal including a camera may further include a stage of determining the rotational angle of the object according to the direction pattern of the analyzed image variables (S130).
  • According to the sensing method described above, the touch intensity of the object (for example, a finger) that comes into contact with the camera may be determined. In particular, during a call, or when it is difficult to operate the terminal through the touchscreen, the user may control the user terminal by touching the camera provided on the rear.
  • FIG. 11 is a flowchart of a controlling method for a user terminal according to an embodiment of the present disclosure.
  • Referring to FIG. 11, the image variables of the object taken by the camera of the user terminal are analyzed (S210). Since the image variables are the same as described above, description thereof will be omitted.
  • Thereafter, whether or not the object touches the user terminal and the touch intensity are determined according to the analyzed image variables (S220). For example, in a case of a high luminance, it may be determined that the finger strongly pushes the camera, and in a case of a low luminance, the contrary may be determined.
  • Thereafter, the controlling method and the controlling apparatus according to the present disclosure execute a control command for the user terminal corresponding to the analyzed touch intensity (S230). For example, when another call is received while the line is engaged, a message indicating that the line is engaged may be transmitted to the caller by touching the camera provided on the rear. Unlike the related art, in which the touchscreen must be operated by sight, the present disclosure senses whether the camera on the rear is touched and how strongly, so the terminal may be controlled and operated effectively. Also unlike a general touch, whether the user presses the camera with intentional force is critical to these functions; in the present disclosure, not only the presence of a touch but also a touch at or above a specific strength may be sensed and used to control the user terminal.
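  • A hedged sketch of step S230: a command is dispatched only when the sensed press is strong enough to be intentional. The threshold and the on_strong_press callback (for example, one that replies to a second caller with a busy message) are hypothetical names, not part of the disclosure.

```python
# Sketch: execute a terminal control command for a deliberate strong press,
# while ignoring incidental light contact with the rear camera.
def handle_camera_press(luminance: float,
                        strong_lum: float = 130.0,
                        on_strong_press=lambda: print("busy message sent")):
    """Trigger the bound command when luminance implies a strong press."""
    if luminance >= strong_lum:
        on_strong_press()
```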
  • The sensing apparatus for a user terminal, the controlling apparatus, and the like according to the present disclosure may be used in a user terminal such as a cellphone, and thus the scope of the present disclosure also includes such user terminals. In addition, the above-described sensing method and the controlling method may be implemented as program instructions to perform the methods. Examples of the computer readable recording medium having the program instructions recorded thereon include ROM, RAM, CD-ROM, magnetic tapes, floppy disks, and optical media storage devices.
  • Furthermore, the computer readable recording medium having the above-described program recorded thereon can be distributed over network-coupled computer systems so that the computer readable code is stored and executed in a distributed fashion. In this case, one or more of the distributed computers can execute part of the functions suggested above and transmit the execution results to one or more of the other distributed computers, and the computers that receive the results can likewise execute part of the functions and provide their results to the others.
  • The computers that can read a recording medium on which applications are recorded as programs for executing the sensing method, the sensing apparatus, and the controlling method according to the embodiments of the present disclosure include not only general PCs such as desktops and notebooks but also mobile terminals such as smartphones, tablet PCs, personal digital assistants (PDAs), and mobile communication terminals, and should be construed as any devices that can perform computing.
  • While all the elements constituting the embodiments of the present disclosure have been described as conjoined into one, or as conjoined so as to operate together, the present disclosure is not limited to these embodiments. That is, the elements may be selectively conjoined in any combination to operate without departing from the spirit and scope of the present disclosure. In addition, each element can be implemented as independent hardware, or some or all of the elements may be selectively combined into a computer program having a program module that executes some or all of their functions on one or more units of hardware. The codes and code segments constituting such a computer program will be easily construed by those skilled in the art to which the present disclosure pertains. The computer program can be stored in computer readable storage media and read and executed by a computer to implement the embodiments of the present disclosure. The storage media of the computer program may include magnetic storage media, optical storage media, and the like.
  • It should be understood that the terms “includes”, “configures”, “has”, and the like mean that corresponding elements can be included unless the context clearly indicates otherwise, and can further include other elements rather than excluding other elements. Unless otherwise defined, all terms including technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which the present disclosure pertains. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the relevant art and the present disclosure will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
  • The above description only exemplifies the technical spirit of the present disclosure, and it will be understood by those skilled in the art to which the present disclosure pertains that various modifications and changes may be made without departing from the essential characteristics of the present disclosure. Accordingly, the embodiments disclosed in the present disclosure are not intended to limit but describe the technical spirit of the present disclosure, and the range of the technical spirit of the present disclosure is not limited by the embodiments. The protection range of the present disclosure should be interpreted by the appended claims, and all technical spirits within the equivalent scope should be interpreted as being included in the scope of the present disclosure.
  • While the present disclosure has been described with reference to the embodiments illustrated in the figures, the embodiments are merely examples, and it will be understood by those skilled in the art that various changes in form and other embodiments equivalent thereto can be performed. Therefore, the technical scope of the disclosure is defined by the technical idea of the appended claims.
  • The drawings and the foregoing description give examples of the present invention. The scope of the present invention, however, is by no means limited by these specific examples. Numerous variations, whether explicitly given in the specification or not, such as differences in structure, dimension, and use of material, are possible. The scope of the invention is at least as broad as given by the following claims.

Claims (20)

What is claimed is:
1. A sensing apparatus for a user terminal including a camera, comprising:
an image analyzing unit analyzing an image variable of an object taken by the camera; and
a touch determining unit determining whether the object touches the user terminal and an intensity of the touch according to the image variable analyzed by the image analyzing unit.
2. The sensing apparatus for a user terminal including a camera of claim 1, wherein the object is an input unit of the user terminal, and the input unit of the user terminal comes into contact with the camera.
3. The sensing apparatus for a user terminal including a camera of claim 1, wherein the image variable is a variable which varies with the intensity of the touch between the object and the camera.
4. The sensing apparatus for a user terminal including a camera of claim 1, wherein the image variable is a variable which varies with a transmittance of light being transmitted through the object and received by the camera.
5. The sensing apparatus for a user terminal including a camera of claim 4, wherein the variable is one of luminance, a luminance distribution, a color tone, an amount of a specific color, a distribution of a specific color, illuminance, and light intensity of an image of the object taken by the camera.
6. The sensing apparatus for a user terminal including a camera of claim 5, wherein, in a case where the image variable analyzed by the image analyzing unit shows a pattern in which the light transmitted through the object and received by the camera is increased, the touch determining unit determines that the intensity of the touch of the object on the user terminal is high.
7. The sensing apparatus for a user terminal including a camera of claim 6, wherein the variable is the luminance of the image, and the touch determining unit determines that the intensity of the touch of the object on the user terminal is higher as the luminance of the image is higher.
8. The sensing apparatus for a user terminal including a camera of claim 1, further comprising:
a rotation determining unit determining a degree of rotation of the object taken by the camera according to the image variable analyzed by the image analyzing unit.
9. The sensing apparatus for a user terminal including a camera of claim 8, wherein the rotation determining unit determines a rotational angle of the object according to a direction pattern of the image variable analyzed by the image analyzing unit.
10. The sensing apparatus for a user terminal including a camera of claim 1, wherein the image analyzing unit analyzes the image variable for the entire image or for a part of the pixels thereof.
11. The sensing apparatus for a user terminal including a camera of claim 8, wherein the image variable is a variable which varies with a transmittance of light being transmitted through the object and received by the camera.
12. A sensing method for a user terminal including a camera, comprising:
analyzing an image variable of an object taken by the camera; and
determining an intensity of a touch of the object on the user terminal according to the analyzed image variable.
13. The sensing method for a user terminal including a camera of claim 12, wherein the object comes into contact with the camera, and the image variable is a variable which varies with a transmittance of light transmitted through the object.
14. The sensing method for a user terminal including a camera of claim 13, wherein the image variable is one of luminance, a luminance distribution, a color tone, an amount of a specific color, a distribution of a specific color, illuminance, and light intensity of the image taken of the object.
15. The sensing method for a user terminal including a camera of claim 14, further comprising:
determining a rotational angle of the object according to a direction pattern of the analyzed image variable.
16. The sensing method for a user terminal including a camera of claim 12, wherein the image variable is a variable which varies with the intensity of the touch between the object and the camera.
17. A controlling method for a user terminal including a camera, comprising:
analyzing an image variable of an object taken by the camera;
determining an intensity of a touch of the object on the user terminal according to the analyzed image variable; and
executing a control command for the user terminal corresponding to the analyzed intensity of the touch.
18. The controlling method for a user terminal including a camera of claim 17, wherein the object comes into contact with the camera, and the image variable is a variable which varies with a transmittance of light transmitted through the object.
19. The controlling method for a user terminal including a camera of claim 18, wherein the image variable is one of luminance, a luminance distribution, a color tone, an amount of a specific color, a distribution of a specific color, illuminance, and light intensity of the image taken of the object.
20. The controlling method for a user terminal including a camera of claim 17, wherein the image variable is a variable which varies with the intensity of the touch between the object and the camera.
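
The claims above describe determining touch intensity from an image variable (e.g., luminance) of frames captured while an object such as a fingertip covers the camera (claims 1-7, 12-14, 17-20), recovering a rotation angle from the directional pattern of that variable (claims 8-9, 15), and dispatching a control command keyed to the estimated intensity (claim 17). The following is a minimal, hypothetical Python/OpenCV sketch of that idea; the function names, calibration constants, linear luminance-to-intensity mapping, moment-based orientation estimate, and two-level command mapping are all illustrative assumptions, not part of the claimed apparatus.

```python
# Hypothetical sketch (not the patented implementation): when a fingertip
# presses the lens, harder pressure tends to let more light pass through
# the tissue, so mean frame luminance rises with touch intensity.
import cv2
import numpy as np

LUMA_TOUCH_MIN = 10.0  # assumed mean luminance at a light touch
LUMA_TOUCH_MAX = 80.0  # assumed mean luminance at a hard press

def touch_intensity(frame_bgr: np.ndarray) -> float:
    """Map mean luminance of a fingertip-covered frame to [0, 1]."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    luma = float(gray.mean())  # one candidate "image variable" (claim 5)
    t = (luma - LUMA_TOUCH_MIN) / (LUMA_TOUCH_MAX - LUMA_TOUCH_MIN)
    return float(np.clip(t, 0.0, 1.0))

def rotation_angle(frame_bgr: np.ndarray) -> float:
    """Estimate the dominant direction of the luminance pattern
    (cf. claim 9) from image moments; returns degrees."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    m = cv2.moments(gray)
    # Orientation of the intensity distribution via central moments.
    theta = 0.5 * np.arctan2(2.0 * m["mu11"], m["mu20"] - m["mu02"])
    return float(np.degrees(theta))

def control_command(intensity: float) -> str:
    """Dispatch a control command by touch intensity (cf. claim 17);
    the two-level threshold is an illustrative assumption."""
    return "hard_press_action" if intensity > 0.6 else "soft_press_action"

cap = cv2.VideoCapture(0)  # camera assumed at device index 0
ok, frame = cap.read()
if ok:
    i = touch_intensity(frame)
    print(i, rotation_angle(frame), control_command(i))
cap.release()
```

In practice the calibration constants would have to be learned per device, since lens, sensor gain, ambient light, and skin tone all shift the usable luminance range.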
US13/932,072 2012-09-18 2013-07-01 Sensing apparatus for user terminal using camera, sensing method for the same and controlling method for the same Abandoned US20140078110A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR10-2012-0103450 2012-09-18
KR1020120103450A KR101390090B1 (en) 2012-09-18 2012-09-18 Sensing apparatus for user terminal using camera, method for the same and controlling method for the same

Publications (1)

Publication Number Publication Date
US20140078110A1 true US20140078110A1 (en) 2014-03-20

Family

ID=50273935

Family Applications (2)

Application Number Title Priority Date Filing Date
US13/854,244 Expired - Fee Related US9348432B2 (en) 2012-09-18 2013-04-01 Transmittance based sensor
US13/932,072 Abandoned US20140078110A1 (en) 2012-09-18 2013-07-01 Sensing apparatus for user terminal using camera, sensing method for the same and controlling method for the same

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US13/854,244 Expired - Fee Related US9348432B2 (en) 2012-09-18 2013-04-01 Transmittance based sensor

Country Status (2)

Country Link
US (2) US9348432B2 (en)
KR (1) KR101390090B1 (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101616450B1 (en) * 2014-06-09 2016-05-11 (주) 펀매직 Device and method for embodying virtual button using camera and computer readable recording medium therefor
WO2016114536A1 (en) * 2015-01-13 2016-07-21 Samsung Electronics Co., Ltd. Camera activation and illuminance
CN110986917B (en) * 2016-06-13 2021-10-01 原相科技股份有限公司 Track sensing system and track sensing method thereof
KR20210070528A (en) * 2019-12-05 2021-06-15 삼성전자주식회사 Method of performing half-shutter function using optical object recognition and method of capturing image using the same

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6289230B1 (en) * 1998-07-07 2001-09-11 Lightouch Medical, Inc. Tissue modulation process for quantitative noninvasive in vivo spectroscopic analysis of tissues
US20090143688A1 (en) * 2007-12-03 2009-06-04 Junichi Rekimoto Information processing apparatus, information processing method and program
US20090312654A1 (en) * 2008-06-12 2009-12-17 Fujitsu Limited Guidance method, apparatus thereof, recording medium storing program thereof, and device

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8730169B2 (en) * 2009-10-29 2014-05-20 Pixart Imaging Inc. Hybrid pointing device
KR100995130B1 (en) 2008-06-09 2010-11-18 한국과학기술원 The system for recogniging of user touch pattern using touch sensor and accelerometer sensor
JP4796104B2 (en) 2008-08-29 2011-10-19 シャープ株式会社 Imaging apparatus, image analysis apparatus, external light intensity calculation method, image analysis method, imaging program, image analysis program, and recording medium
KR20110066364A (en) * 2009-12-11 2011-06-17 양홍천 A touching force recognition system of multi point touch screen
KR101071864B1 (en) * 2010-03-10 2011-10-10 전남대학교산학협력단 Touch and Touch Gesture Recognition System
KR20110107143A (en) * 2010-03-24 2011-09-30 삼성전자주식회사 Method and apparatus for controlling function of a portable terminal using multi-input
KR101117289B1 (en) 2010-08-23 2012-03-20 한국과학기술원 Method and device for estimating touch pressure using image, recording medium for the same, and mobile device comprising the same
KR101148233B1 (en) * 2010-11-11 2012-05-23 한국과학기술원 Method and device for estimating touch pressure using image, recording medium for the same, and mobile device comprising the same
US9176539B2 (en) * 2012-11-10 2015-11-03 Ebay Inc. Key input using an active pixel camera

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6289230B1 (en) * 1998-07-07 2001-09-11 Lightouch Medical, Inc. Tissue modulation process for quantitative noninvasive in vivo spectroscopic analysis of tissues
US20090143688A1 (en) * 2007-12-03 2009-06-04 Junichi Rekimoto Information processing apparatus, information processing method and program
US20090312654A1 (en) * 2008-06-12 2009-12-17 Fujitsu Limited Guidance method, apparatus thereof, recording medium storing program thereof, and device

Also Published As

Publication number Publication date
KR101390090B1 (en) 2014-05-27
US20140078041A1 (en) 2014-03-20
KR20140036832A (en) 2014-03-26
US9348432B2 (en) 2016-05-24

Similar Documents

Publication Publication Date Title
US11747958B2 (en) Information processing apparatus for responding to finger and hand operation inputs
EP2942705B1 (en) Method and device for controlling multiple displays
EP2783472B1 (en) Apparatus and method for providing dynamic fiducial markers for devices
EP3056982B1 (en) Terminal apparatus, display control method and recording medium
US8803820B2 (en) Touch detection device, electronic device and recording medium
US9164779B2 (en) Apparatus and method for providing for remote user interaction
KR101116442B1 (en) Apparatus, method and computer program product for manipulating a device using dual side input devices
RU2582854C2 (en) Method and device for fast access to device functions
US9939958B2 (en) Touch sensitive surface with recessed surface feature for an electronic device
WO2019041779A1 (en) Terminal interface switching, moving and gesture processing method and device and terminal
US9519365B2 (en) Display control apparatus and control method for the same
EP2613247B1 (en) Method and apparatus for displaying a keypad on a terminal having a touch screen
US9298305B2 (en) Display control apparatus and method
CN105094419B (en) Gloves touch detection
JP2019128961A (en) Method for recognizing fingerprint, and electronic device, and storage medium
KR20150080842A (en) Method for processing input and an electronic device thereof
US20140078110A1 (en) Sensing apparatus for user terminal using camera, sensing method for the same and controlling method for the same
US20190339858A1 (en) Method and apparatus for adjusting virtual key of mobile terminal
CN105579945A (en) Digital device and control method thereof
CN105739817A (en) Icon hiding method and device and mobile terminal
US10303333B2 (en) Mobile device, method for camera user interface, and non-transitory computer readable storage medium
US20210117080A1 (en) Method and apparatus for adjusting virtual key of mobile terminal
US9384574B2 (en) Image processing method and apparatus therefor
JP7197007B2 (en) Touch panel type information terminal device and its information input processing method
CN117008780A (en) Card position switching method and terminal

Legal Events

Date Code Title Description
AS Assignment

Owner name: KOREA ADVANCED INSTITUTE OF SCIENCE AND TECHNOLOGY

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:HWANG, SUNG JAE;REEL/FRAME:030717/0440

Effective date: 20130529

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION