KR20100107787A - Apparatus for processing command and method thereof - Google Patents

Apparatus for processing command and method thereof

Info

Publication number
KR20100107787A
Authority
KR
South Korea
Prior art keywords
unit
edges
input
command
stored
Prior art date
Application number
KR1020090026048A
Other languages
Korean (ko)
Inventor
정민규
Original Assignee
엘지전자 주식회사 (LG Electronics Inc.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 엘지전자 주식회사 (LG Electronics Inc.)
Priority to KR1020090026048A
Publication of KR20100107787A


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 1/00 Details not covered by groups G06F 3/00 - G06F 13/00 and G06F 21/00
    • G06F 1/16 Constructional details or arrangements
    • G06F 1/1613 Constructional details or arrangements for portable computers
    • G06F 1/1626 Constructional details or arrangements for portable computers with a single-body enclosure integrating a flat display, e.g. Personal Digital Assistants [PDAs]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 1/00 Details not covered by groups G06F 3/00 - G06F 13/00 and G06F 21/00
    • G06F 1/16 Constructional details or arrangements
    • G06F 1/1613 Constructional details or arrangements for portable computers
    • G06F 1/1633 Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F 1/1615 - G06F 1/1626
    • G06F 1/1637 Details related to the display arrangement, including those related to the mounting of the display in the housing
    • G06F 1/1643 Details related to the display arrangement, including those related to the mounting of the display in the housing, the display being associated to a digitizer, e.g. laptops that can be used as penpads
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/0482 Interaction with lists of selectable items, e.g. menus
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/14 Digital output to display device; Cooperation and interconnection of the display device with other functional units

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • Telephone Function (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

PURPOSE: An apparatus for processing a command and a method thereof are provided to make command input easy, thereby increasing user convenience. CONSTITUTION: When a user inputs an edge through a display unit, a control unit checks the number of edges (S110). The control unit executes a function stored in a storage unit based on the checked number of edges (S120). The command processing device comprises an input unit, a sensor unit, the storage unit, the display unit, a voice output unit, the control unit, and a communication unit. The input unit receives a command or a control signal through a touch or scroll of the screen or the manipulation of a button.

Description

Command processing apparatus and method therefor {APPARATUS FOR PROCESSING COMMAND AND METHOD THEREOF}

The present invention relates to a command processing apparatus and a method thereof.

In general, a command processing device is a device that receives a command input by a user's operation and processes it.

An object of the present invention is to provide a command processing apparatus and method that offer an easy and convenient way to input commands.

Another object of the present invention is to provide a command processing apparatus and method for receiving a command by a simple operation and performing a pre-stored function corresponding to the input command.

Another object of the present invention is to provide a command processing apparatus and method for receiving a command using a touch input method and performing a pre-stored function corresponding to the input command.

The command processing method according to the present invention for achieving the above objects comprises the steps of: checking the number of edges or the number of points from the input content; and performing a pre-stored function corresponding to the identified number of edges or number of points.

In order to achieve the above objects, a command processing method according to another aspect of the present invention includes: checking the number and direction of vibrations of a terminal; and performing a pre-stored function corresponding to the identified number and direction of vibrations.

In accordance with another aspect of the present invention, a command processing method includes: detecting a movement of a user input through a camera; checking the number of edges according to the detected movement of the user; and performing a pre-stored function corresponding to the identified number of edges.

According to an aspect of the present invention, there is provided a command processing apparatus including: a display unit configured to receive content by a user's manipulation; and a controller which checks the number of edges in the input content and performs a pre-stored function corresponding to the identified number of edges.

The command processing apparatus and method thereof according to an embodiment of the present invention provide an easy and convenient command input method, thereby increasing user convenience and enabling intuitive menu selection.

In addition, the command processing apparatus and method according to an embodiment of the present invention receive a command through a touch input or a simple operation and perform a pre-stored function corresponding to the input command, so that a frequently used menu or command can be executed by a simple operation.

Hereinafter, exemplary embodiments of the present invention will be described in detail with reference to the accompanying drawings. The same or corresponding components are denoted by the same reference numerals regardless of the figure numbers, and redundant descriptions thereof are omitted.

FIG. 1 is a block diagram showing the configuration of a mobile terminal 100 for explaining a command processing apparatus according to an embodiment of the present invention.

The mobile terminal 100 may be implemented in various forms. For example, the mobile terminal 100 may be a mobile phone, a smart phone, a notebook computer, a digital broadcasting terminal, a personal digital assistant (PDA), a portable multimedia player (PMP), a navigation (vehicle navigation) device, and the like.

As shown in FIG. 1, the mobile terminal 100 includes a wireless communication unit 110, an A/V input unit 120, a user input unit 130, a sensing unit 140, an output unit 150, a memory 160, an interface unit 170, a controller 180, a power supply unit 190, and the like. Not all of the components shown in FIG. 1 are essential; the mobile terminal 100 may be implemented with more or fewer components than those shown in FIG. 1.

The wireless communication unit 110 may include one or more components for performing wireless communication between the mobile terminal 100 and a wireless communication system, or between the mobile terminal 100 and the network in which the mobile terminal 100 is located. For example, the wireless communication unit 110 may include a broadcast receiving module 111, a mobile communication module 112, a wireless Internet module 113, a short range communication module 114, a location information module 115, and the like.

The broadcast receiving module 111 receives a broadcast signal and/or broadcast related information from an external broadcast management server through a broadcast channel. The broadcast channel may include a satellite channel, a terrestrial channel, and the like. The broadcast management server may mean a server that generates and transmits a broadcast signal and/or broadcast related information, or a server that receives a pre-generated broadcast signal and/or broadcast related information and transmits it to the mobile terminal 100. The broadcast related information may mean information related to a broadcast channel, a broadcast program, or a broadcast service provider. The broadcast signal may include not only a TV broadcast signal, a radio broadcast signal, and a data broadcast signal, but also a broadcast signal in which a data broadcast signal is combined with a TV broadcast signal or a radio broadcast signal.

Meanwhile, the broadcast related information may be provided through a mobile communication network, and in this case, may be received by the mobile communication module 112. The broadcast related information may exist in various forms. For example, it may exist in the form of Electronic Program Guide (EPG) of Digital Multimedia Broadcasting (DMB) or Electronic Service Guide (ESG) of Digital Video Broadcast-Handheld (DVB-H).

The broadcast receiving module 111 receives broadcast signals using various broadcasting systems; in particular, it can receive digital broadcast signals using digital broadcasting systems such as Digital Multimedia Broadcasting-Terrestrial (DMB-T), Digital Multimedia Broadcasting-Satellite (DMB-S), Media Forward Link Only (MediaFLO), Digital Video Broadcast-Handheld (DVB-H), and Integrated Services Digital Broadcast-Terrestrial (ISDB-T). Of course, the broadcast receiving module 111 may be configured to be suitable not only for the digital broadcasting systems described above but for any broadcasting system that provides broadcast signals. The broadcast signal and/or broadcast related information received through the broadcast receiving module 111 may be stored in the memory 160.

The mobile communication module 112 transmits and receives radio signals to and from at least one of a base station, an external terminal, and a server over a mobile communication network. The radio signals may include various types of data according to the transmission and reception of voice call signals, video call signals, and/or text/multimedia messages.

The wireless Internet module 113 refers to a module for wireless Internet access, and may be internal or external to the mobile terminal 100. Wireless Internet technologies such as Wireless LAN (WLAN), Wi-Fi, WiBro, WiMAX (World Interoperability for Microwave Access), and High Speed Downlink Packet Access (HSDPA) can be used.

The short range communication module 114 means a module for short range communication. As short range communication technologies, Bluetooth, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), ZigBee, and the like may be used.

The location information module 115 is a module for checking or obtaining the location of the mobile terminal; a typical example is a GPS module. The GPS module receives location information from a plurality of satellites, and the location information may include coordinate information represented by latitude and longitude. For example, the GPS module can measure the exact time and distance to three or more satellites and calculate the current location by triangulation from the three distances; a method of obtaining distance and time information from three satellites and correcting the error with one additional satellite may be used. In particular, the GPS module can obtain not only the latitude, longitude, and altitude but also the accurate time together with three-dimensional speed information from the location information received from the satellites. A Wi-Fi Positioning System and/or a Hybrid Positioning System may also be applied as the location information module 115.
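
The triangulation idea can be illustrated with a deliberately simplified two-dimensional sketch: the distances to three known reference points determine the position by subtracting the circle equations pairwise, which yields a small linear system. Real GPS solves the three-dimensional analogue and uses the fourth satellite to correct the receiver clock bias; the coordinates and distances below are made-up illustration values (Python is used for all illustrative sketches in this document).

    def trilaterate_2d(p1, r1, p2, r2, p3, r3):
        """Solve for the 2D point at distances r1, r2, r3 from p1, p2, p3."""
        (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
        # Subtracting the circle equations pairwise gives two linear equations.
        a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
        c1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
        a2, b2 = 2 * (x3 - x2), 2 * (y3 - y2)
        c2 = r2**2 - r3**2 + x3**2 - x2**2 + y3**2 - y2**2
        det = a1 * b2 - a2 * b1
        return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)

    # A receiver at (3, 4) is recovered from distances to three reference points.
    print(trilaterate_2d((0, 0), 5.0, (10, 0), 8.0623, (0, 10), 6.7082))
    # approximately (3.0, 4.0)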

The A/V input unit 120 is for inputting an audio signal or a video signal, and may include a camera 121 and a microphone 122. The camera 121 processes image frames, such as still images or video, obtained by an image sensor in a video call mode or a photographing mode. The processed image frames may be displayed on the display unit 151.

The image frames processed by the camera 121 may be stored in the memory 160 or transmitted to the outside through the wireless communication unit 110. Two or more cameras 121 may be provided according to the configuration type and/or usage environment of the mobile terminal.

The microphone 122 receives an external sound signal in a call mode, a recording mode, a voice recognition mode, and the like, and processes it into electrical voice data. In the call mode, the processed voice data may be converted into a form transmittable to a mobile communication base station through the mobile communication module 112 and then output. The microphone 122 may implement various noise removal algorithms for removing noise generated in the process of receiving an external sound signal.

The user input unit 130 generates input data for the user to control the operation of the mobile terminal. The user input unit 130 may include a keypad, a dome switch, a touch pad (static pressure/capacitance), a jog wheel, a jog switch, and the like.

The sensing unit 140 detects the current state of the mobile terminal 100, such as its open/closed state, its location, the presence or absence of user contact, its orientation, and its acceleration/deceleration, and generates a sensing signal for controlling the operation of the mobile terminal 100. For example, when the mobile terminal 100 is a slide phone type, the sensing unit 140 may sense whether the slide phone is opened or closed. The sensing unit 140 also performs sensing functions related to whether the power supply unit 190 is supplying power, whether the interface unit 170 is coupled to an external device, and the like, and may include a proximity sensor 141.

The output unit 150 is for outputting an audio signal (a signal related to hearing), a video signal (a signal related to sight), an alarm signal, or a signal related to the tactile sense, and may include a display unit 151, a sound output module 152, an alarm unit 153, and a haptic module 154.

The display unit 151 displays (or outputs) information processed by the mobile terminal 100. For example, when the mobile terminal 100 is in a call mode, the mobile terminal 100 displays a user interface (UI) or a graphic user interface (GUI) related to the call. When the mobile terminal 100 is in a video call mode or a photographing mode, the mobile terminal 100 displays a photographed and / or received image, a UI, or a GUI.

The display unit 151 may include at least one of a liquid crystal display (LCD), a thin film transistor liquid crystal display (TFT LCD), an organic light-emitting diode (OLED) display, a flexible display, and a 3D display.

Some of these displays may be configured to be transparent or light-transmissive so that the outside can be seen through them; these may be referred to as transparent displays. A representative example of a transparent display is the TOLED (Transparent OLED). The rear structure of the display unit 151 may also be configured as a light-transmissive structure. With this structure, the user can see an object located behind the terminal body through the area occupied by the display unit 151.

According to an implementation form of the mobile terminal 100, two or more display units 151 may exist. For example, a plurality of display units may be spaced apart or integrally disposed on one surface (same surface) in the mobile terminal 100, or may be disposed on different surfaces.

On the other hand, when the display unit 151 and a sensor for detecting a touch operation (hereinafter referred to as a touch sensor) form a mutual layer structure (hereinafter referred to as a touch screen), the display unit 151 can be used as an input device in addition to an output device. The touch sensor may take the form of, for example, a touch film, a touch sheet, or a touch pad.

The touch sensor may be configured to convert a change in the pressure applied to a specific portion of the display unit 151, or in the capacitance generated at a specific portion of the display unit 151, into an electrical input signal. The touch sensor may be configured to detect not only the position and area of a touch but also the pressure of the touch. When there is a touch input to the touch sensor, the corresponding signal(s) are sent to a touch controller (not shown). The touch controller processes the signal(s) and then transmits the corresponding data to the controller 180. As a result, the controller 180 can determine which area of the display unit 151 has been touched.
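
As a minimal sketch of the last step above, resolving a reported touch coordinate to a display area; the region names and pixel bounds are invented for illustration and are not from the patent.

    # Hypothetical screen regions: (x_min, y_min, x_max, y_max) in pixels.
    REGIONS = {
        "menu_bar":  (0, 0, 480, 60),
        "content":   (0, 60, 480, 760),
        "soft_keys": (0, 760, 480, 800),
    }

    def touched_region(x, y):
        """Return the name of the region containing (x, y), if any."""
        for name, (x0, y0, x1, y1) in REGIONS.items():
            if x0 <= x < x1 and y0 <= y < y1:
                return name
        return None

    print(touched_region(240, 400))  # content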

The proximity sensor 141 may be disposed in an inner region of the mobile terminal 100 surrounded by the touch screen, or near the touch screen. The proximity sensor 141 refers to a sensor that detects, without mechanical contact, the presence or absence of an object approaching a predetermined detection surface or an object present in the vicinity, using electromagnetic force or infrared rays. The proximity sensor 141 has a longer lifespan and higher utility than a contact sensor.

Examples of the proximity sensor 141 include a transmission photoelectric sensor, a direct reflection photoelectric sensor, a mirror reflection photoelectric sensor, a high frequency oscillation proximity sensor, a capacitive proximity sensor, a magnetic proximity sensor, and an infrared proximity sensor. When the touch screen is capacitive, the touch screen is configured to detect the proximity of the pointer by a change in an electric field according to the proximity of the pointer. In this case, the touch screen (touch sensor) may be classified as a proximity sensor.

Hereinafter, for convenience of explanation, the act of bringing a pointer close to the touch screen without contact so that the pointer is recognized as being located on the touch screen is referred to as a "proximity touch", and the act of actually bringing the pointer into contact with the touch screen is referred to as a "contact touch". The position of a proximity touch on the touch screen means the position at which the pointer is perpendicular to the touch screen when the pointer makes the proximity touch.

In addition, the proximity sensor 141 detects a proximity touch and a proximity touch pattern (for example, a proximity touch distance, a proximity touch direction, a proximity touch speed, a proximity touch time, a proximity touch position, and a proximity touch movement state). Information corresponding to the sensed proximity touch operation and proximity touch pattern may be output on the touch screen.

The sound output module 152 outputs audio data received from the wireless communication unit 110 or stored in the memory 160 during call signal reception, a call mode, a recording mode, a voice recognition mode, a broadcast reception mode, and the like. In addition, the sound output module 152 outputs sound signals related to functions performed in the mobile terminal 100 (e.g., a call signal reception sound or a message reception sound). The sound output module 152 may include a receiver, a speaker, a buzzer, and the like.

The alarm unit 153 outputs a signal for notifying the occurrence of an event in the mobile terminal 100. Examples of events occurring in the mobile terminal 100 include call signal reception, message reception, key signal input, and touch input. The alarm unit 153 may output the notification signal in a form other than an audio or video signal, for example, as vibration. When a call signal or a message is received, the alarm unit 153 may vibrate the mobile terminal 100 through a vibration means; likewise, when a key signal is input, the alarm unit 153 may vibrate the mobile terminal 100 in response to the key signal input. Through such vibration, the user can recognize the occurrence of the event. Of course, the event notification signal may also be output through the display unit 151 or the sound output module 152, so that these units 151 and 152 may be classified as part of the alarm unit 153.

The haptic module 154 generates various tactile effects that a user can feel. Vibration is a representative example of the haptic effect generated by the haptic module 154. The intensity and pattern of vibration generated by the haptic module 154 can be controlled. For example, different vibrations may be synthesized and output or may be sequentially output.

In addition to vibration, the haptic module 154 can generate various tactile effects, such as a pin arrangement moving vertically against the contacted skin surface, the blowing or suction force of air through a nozzle or inlet, grazing against the skin surface, contact of an electrode, stimulation such as electrostatic force, and the reproduction of a sense of warmth or coldness using an element capable of emitting or absorbing heat.

The haptic module 154 may not only deliver a haptic effect through direct contact, but may also be implemented so that the user can feel the haptic effect through the muscle sense of a finger or an arm. Two or more haptic modules 154 may be provided according to the configuration of the mobile terminal 100.

The memory 160 may store programs for the processing and control of the controller 180, and may also temporarily store input/output data (for example, a phone book, messages, still images, and videos). The memory 160 may store data on the various patterns of vibration and sound that are output when a touch input is made on the touch screen.

The memory 160 may include at least one type of storage medium among a flash memory type, a hard disk type, a multimedia card micro type, a card type memory (e.g., SD or XD memory), Random Access Memory (RAM), Static Random Access Memory (SRAM), Read-Only Memory (ROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), Programmable Read-Only Memory (PROM), magnetic memory, a magnetic disk, and an optical disk. In addition, the mobile terminal 100 may operate a web storage that performs the storage function of the memory 160 over the Internet, or may operate in connection with such web storage.

The interface unit 170 serves as an interface with all external devices connected to the mobile terminal 100. For example, the interface unit 170 may include a wired/wireless headset port, an external charger port, a wired/wireless data port, a memory card port, a port for connecting a device equipped with an identification module, an audio input/output (I/O) port, a video input/output (I/O) port, and an earphone port. Here, the identification module is a chip that stores various kinds of information for authenticating the usage rights of the mobile terminal 100, and may include a User Identity Module (UIM), a Subscriber Identity Module (SIM), and a Universal Subscriber Identity Module (USIM). A device equipped with an identification module (hereinafter, an "identification device") may be manufactured in the form of a smart card; the identification module may therefore be connected to the mobile terminal 100 through a port. The interface unit 170 may receive data or power from an external device and transmit it to each component inside the mobile terminal 100, or may transmit data within the mobile terminal 100 to an external device.

The interface unit 170 may serve as a passage through which power from an external cradle is supplied to the mobile terminal 100 when the mobile terminal 100 is connected to the cradle, or as a passage through which various command signals input by the user at the cradle are transmitted to the mobile terminal 100. The various command signals or power input from the cradle may also operate as signals for recognizing that the mobile terminal 100 is correctly mounted on the cradle.

The controller 180 typically controls the overall operation of the mobile terminal 100. For example, the controller 180 performs related control and processing for voice call, data communication, video call, and the like. In addition, the controller 180 may include a multimedia module 181 for multimedia playback. The multimedia module 181 may be implemented in the controller 180 or may be implemented separately from the controller 180.

The controller 180 may perform pattern recognition processing to recognize handwriting input or drawing input performed on the touch screen as text or an image, respectively.

The power supply unit 190 receives external and internal power under the control of the controller 180 and supplies the power required for the operation of each component.

The functions of the components applied to the mobile terminal 100 may be implemented in a computer-readable recording medium using software, hardware, or a combination thereof. For a hardware implementation, they may be implemented using at least one of Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, and electrical units for performing functions; in some cases, such embodiments may be implemented by the controller 180. For a software implementation, embodiments such as procedures or functions may be implemented with separate software modules, each of which performs at least one function or operation. The software code may be implemented by a software application written in a suitable programming language, stored in the memory 160, and executed by the controller 180. In addition, a navigation session 300 applied to the mobile terminal 100 provides a general navigation function.

Meanwhile, the controller 180 applied to the mobile terminal 100 according to an embodiment of the present invention outputs to the mobile terminal 100 a pre-stored menu screen, or executes a pre-stored command, corresponding to the confirmed (or sensed) number of points, number of edges, rotation direction and number of polygons, number and direction of vibrations of the terminal, or number of edges according to the movement of the user.

FIG. 2 is a block diagram showing the configuration of a telematics terminal 200 for explaining a command processing apparatus according to an embodiment of the present invention. As shown in FIG. 2, the telematics terminal 200 includes a main board 220 with a built-in Central Processing Unit (CPU) 222 for controlling the telematics terminal 200 as a whole, a key control unit 221 for controlling various key signals, an LCD control unit 223 for controlling the LCD, and a memory 224 for storing various kinds of information.

The memory 224 stores map information (map data) for displaying vehicle guidance information (road guidance information provided to the user while driving or not driving) on the map of the display unit (or LCD) 211.

In addition, the memory 224 stores a traffic information collection control algorithm for inputting traffic information according to a road condition on which the vehicle is currently traveling and various information for controlling the algorithm.

The main board 220 includes: a communication module 201 that is assigned a unique device number and performs voice calls and data transmission/reception through a mobile communication terminal embedded in the vehicle; a GPS module 202 that receives GPS signals for guiding the location of the vehicle and tracking the travel path from a departure point to a destination, generates current location data of the vehicle based on the received GPS signals, or transmits traffic information collected by the user as a GPS signal; a gyro sensor 203 for detecting the driving direction of the vehicle; and a CD deck 204 for reproducing signals recorded on a compact disc (CD).

In addition, the communication module 201 and the GPS module 202 transmit / receive signals through the first antenna 205 and the second antenna 206, respectively.

In addition, the main board 220 is connected to the TV module 230 that receives the broadcast signal through the broadcast signal antenna (or TV antenna) 231.

In addition, the main board 220 is connected to a liquid crystal display (LCD) 211 under the control of the LCD controller 223 through an interface board 213.

The LCD 211 performs predetermined signal processing on the broadcast signal received through the TV module 230, and then displays the video signal of the processed broadcast signal on the LCD 211 through the interface board 213 under the control of the LCD control unit 223, while the audio signal is output through the amplifier 254 under the control of the audio board 240 described later. In addition, the LCD 211 displays various video and text signals based on the control signals of the LCD control unit 223.

In addition, the LCD 211 may be configured to receive an input from a user by using a touch screen method.

In addition, the main board 220 is connected to the front board 212 under the control of the key control unit 221 through the interface board 213. The front board 212 provides buttons and menus for inputting various key signals, and supplies the main board 220 with the key signal corresponding to the key (or button) selected by the user. In addition, the front board 212 may include a menu key for directly inputting traffic information, and the menu key may be configured to be controlled by the key control unit 221.

The audio board 240 is connected to the main board 220 and processes various audio signals. The audio board 240 includes a microcomputer 244 for controlling the audio board 240, a tuner 243 for receiving radio signals through an antenna (or radio antenna) 245, a power processor 242 for supplying power to the microcomputer 244, and a signal processor 241 for performing signal processing to output various voice signals.

In addition, the audio board 240 is connected to a radio antenna 245 for receiving a radio signal and a tape deck 246 for playing an audio tape.

Also, the audio board 240 is connected to an amplifier 254 for outputting a voice signal processed by the audio board 240.

In addition, the amplifier 254 is connected to the vehicle interface 250. That is, the main board 220 and the audio board 240 are each connected to the vehicle interface 250. A hands-free unit 251 for inputting voice signals without the driver using his or her hands, an airbag 252 for passenger safety, a speed sensor 253 for detecting the vehicle speed, and the like may also be connected to the vehicle interface 250.

In addition, the speed sensor 253 calculates a vehicle speed and provides the calculated vehicle speed information to the central processing unit 222.

In addition, the navigation session 300 applied to the telematics terminal 200 provides a general navigation function.

Meanwhile, the central processing unit 222 applied to the telematics terminal 200 according to an embodiment of the present invention outputs to the telematics terminal 200 a pre-stored menu screen, or executes a pre-stored command, corresponding to the confirmed (or sensed) number of points, number of edges, rotation direction and number of polygons, number and direction of vibrations of the terminal, or number of edges according to the movement of the user.

Hereinafter, the command processing apparatus 400 according to an embodiment of the present invention will be described in detail with reference to FIG. 3, on the assumption that it is applied to the mobile terminal 100. The command processing apparatus according to the embodiment of the present invention may be applied not only to the mobile terminal 100 but also to the telematics terminal 200.

FIG. 3 is a block diagram illustrating the configuration of a command processing apparatus according to an exemplary embodiment of the present invention. As shown in the figure, the command processing apparatus 400 includes an input unit 401, a sensor unit 402, a storage unit 403, a display unit 404, a voice output unit 405, a control unit 406, and a communication unit 407.

The input unit 401 receives a command or a control signal through an operation such as a button press by the user or a touch/scroll on the displayed screen.

In addition, the input unit 401 allows the user to select a desired function or to input information, and various devices such as a keypad, a touch screen, a jog shuttle, and a microphone may be used.

In addition, the input unit 401 may include a camera, and may detect (or sense) the movement of the user's face or the movement of the pupils through the camera.

The sensor unit 402 detects movement of the command processing apparatus 400 and includes a motion recognition sensor. The motion recognition sensor may include sensors for recognizing the motion or position of an object, such as a geomagnetism sensor, an acceleration sensor, a gyro sensor, an inertial sensor, an altimeter, and a vibration sensor, and may further include other sensors related to motion recognition.

In addition, the sensor unit 402 detects information on the movement of the command processing apparatus 400, for example, its tilt direction, tilt angle and/or tilt speed, and the direction and/or number of vibrations in the up/down, left/right, or diagonal directions. The detected information (tilt direction, tilt angle and/or tilt speed, vibration direction and/or number of vibrations) is digitized through digital signal processing, and the digitized information is transferred to the control unit 406.

The storage unit 403 stores various user interfaces (UIs) and/or graphical user interfaces (GUIs).

In addition, the storage unit 403 stores various menu screens or commands corresponding to the commands input to the input unit 401. Here, an input command may be expressed as a number of points (or touches), a number of edges of a line, a number of edges according to the movement of a hand (or finger), a number of edges according to the movement of the face or pupils, and the like.

In addition, the storage unit 403 stores the information sensed by the sensor unit 402 (tilt direction, tilt angle and/or tilt speed, vibration direction and/or number of vibrations).

The display unit 404 displays various information according to the control of the control unit 406. The display unit 404 may be a touch screen.

In addition, when displaying the various information, the display unit 404 may display various content, such as menu screens, using the user interfaces and/or graphical user interfaces stored in the storage unit 403. Here, the content displayed on the display unit 404 includes menu screens containing various text or image data (including map data and various information data) and data such as icons, list menus, and combo boxes.

The voice output unit 405 converts various information into voice and outputs it according to the control of the control unit 406. Here, the voice output unit 405 may be a speaker.

When the display unit 404 is a touch screen, the control unit 406 checks the number of points (or touches) and/or the number of edges of a line input by the user, and outputs the menu screen pre-stored in the storage unit 403 corresponding to the identified number of points and/or edges to the display unit 404 and/or the voice output unit 405.

In addition, when there is no function pre-stored in the storage unit 403 corresponding to the identified number of points and/or number of edges of the line, the control unit 406 may inform the user, through the display unit 404 and/or the voice output unit 405, that there is no pre-stored function for the identified number of points and/or edges.

In addition, when the display unit 404 is a touch screen, the control unit 406 checks the number of points (or touches) and/or the number of edges of a line input by the user, and executes the corresponding command pre-stored in the storage unit 403. The control unit 406 may also output the execution result of the command to the display unit 404 and/or the voice output unit 405.

In addition, the control unit 406 checks the number of edges or lines according to the movement of the hand or finger detected by the input unit 401 and, corresponding to the identified number, either outputs the menu screen pre-stored in the storage unit 403 to the display unit 404 and/or the voice output unit 405, or executes the corresponding command pre-stored in the storage unit 403 and outputs the execution result to the display unit 404 and/or the voice output unit 405.

In addition, the control unit 406 either outputs the menu screen pre-stored in the storage unit 403 to the display unit 404 and/or the voice output unit 405 in response to the number and/or direction of vibrations detected by the sensor unit 402, or executes the corresponding command stored in the storage unit 403 and outputs the execution result to the display unit 404 and/or the voice output unit 405.

In addition, the control unit 406 checks the number of edges or lines according to the movement of the user's face or pupils detected by the camera of the input unit 401 and, corresponding to the identified number, either outputs the menu screen pre-stored in the storage unit 403 to the display unit 404 and/or the voice output unit 405, or executes the command pre-stored in the storage unit 403 and outputs the execution result to the display unit 404 and/or the voice output unit 405.

In addition, the control unit 406 may transmit the menu screen and/or command pre-stored in the storage unit 403 corresponding to the detected number of points and/or number of edges of the line to an external terminal through the communication unit 407.

The communication unit 407 may include a wireless internet module or a short range communication module. The wireless Internet module may include WLAN, Wi-Fi, WiBro, WiMAX, HSDPA, and the like, and the short-range communication module may include Bluetooth, RFID, infrared communication, UWB, ZigBee, and the like.

In addition, some (or all) of the input unit 401, the sensor unit 402, the storage unit 403, the display unit 404, the voice output unit 405, the control unit 406, and the communication unit 407 included in the command processing apparatus 400 described above with reference to FIG. 3 may be implemented by components having similar functions in the mobile terminal 100.

That is, the input unit 401 may be the user input unit 130 of the mobile terminal 100; the sensor unit 402 may be the sensing unit 140; the storage unit 403 may be the memory 160; the display unit 404 may be the display unit 151; the voice output unit 405 may be the sound output module 152; the control unit 406 may be the controller 180; and the communication unit 407 may be the wireless communication unit 110.

Likewise, some (or all) of the input unit 401, the sensor unit 402, the storage unit 403, the display unit 404, the voice output unit 405, the control unit 406, and the communication unit 407 included in the command processing apparatus 400 described above with reference to FIG. 3 may be implemented by components having similar functions in the telematics terminal 200.

That is, the input unit 401 may be the LCD 211 of the telematics terminal 200; the sensor unit 402 may be the gyro sensor 203; the storage unit 403 may be the memory 224; the display unit 404 may be the LCD 211; the voice output unit 405 may be the amplifier 254; the control unit 406 may be the central processing unit 222; and the communication unit 407 may be the communication module 201.

Hereinafter, a command processing method according to the present invention will be described in detail with reference to FIGS. 1 to 10.

FIG. 4 is a flowchart illustrating a command processing method according to a first embodiment of the present invention.

First, the controller 406 checks (detects) the number of edges input by the user through the display unit 404. Here, the display unit 404 includes a touch screen that detects touch operations through a touch sensor such as a touch film, a touch sheet, or a touch pad. Alternatively, the number of edges input by the user may be checked through an input unit 401 that includes a touch screen, instead of the display unit 404.

For example, as illustrated in FIGS. 5A to 5C, when the user inputs a line on the display unit 404 using a finger, a stylus pen, a touch pen, or the like, the controller 406 checks the number of edges of the input line.

FIG. 5A corresponds to the case in which the number of edges is "0": a point is input once, or a line (straight or curved) is input.

FIG. 5B shows cases (a-1, a-2, a-3) in which the number of edges is "1", and FIG. 5C shows cases (b-11 and b-12, b-21 and b-22, b-31 and b-32) in which the number of edges is "2".

In addition, the controller 406 may detect (confirm) the edges of a line input back and forth up/down, left/right, or diagonally at the same place on the display unit 404 (S110).
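
The patent does not spell out how edges are detected within a stroke, so the following is only a plausible sketch: it assumes the stroke arrives as ordered (x, y) touch samples and counts a direction change sharper than a threshold angle as one edge; the angle threshold and the jitter filter are invented tuning parameters.

    import math

    def count_edges(points, angle_threshold_deg=60.0, min_segment=10.0):
        """Count sharp direction changes ("edges") in a touch stroke.

        points: ordered (x, y) samples from the touch screen.
        angle_threshold_deg: turn angle above which a corner counts as an
            edge (hypothetical tuning value).
        min_segment: segments shorter than this many pixels are skipped
            to suppress sensor jitter (also a hypothetical value).
        """
        # Build direction angles between successive, sufficiently long segments.
        directions = []
        prev = points[0]
        for x, y in points[1:]:
            dx, dy = x - prev[0], y - prev[1]
            if math.hypot(dx, dy) >= min_segment:
                directions.append(math.atan2(dy, dx))
                prev = (x, y)
        # An edge is a turn between consecutive directions above the threshold.
        edges = 0
        for a, b in zip(directions, directions[1:]):
            turn = abs(math.degrees(b - a))
            turn = min(turn, 360.0 - turn)  # wrap the angle into [0, 180]
            if turn >= angle_threshold_deg:
                edges += 1
        return edges

    # A "V"-shaped stroke (screen coordinates): one sharp turn, so one edge.
    v_stroke = [(0, 0), (20, 40), (40, 80), (60, 40), (80, 0)]
    print(count_edges(v_stroke))  # 1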

Thereafter, the controller 406 performs a function previously stored in the storage unit 403 in response to the identified number of edges.

That is, when the function pre-stored in the storage unit 403 corresponding to the identified number of edges is a specific menu screen, the controller 406 controls the display unit 404 to display that menu screen.

In addition, when the function pre-stored in the storage unit 403 corresponding to the identified number of edges is a specific command, the controller 406 executes that command, and may control the execution result of the command to be output through the display unit 404 and/or the voice output unit 405.

For example, with the menu screens or command functions according to the number of edges pre-stored in the storage unit 403 as shown in Table 1, when the identified number of edges is 3, the controller 406 controls the display unit 404 to output the "phonebook search menu screen" pre-stored in the storage unit 403 corresponding to the identified number of edges (in this example, "3").

Table 1

Number of edges    Function
0                  Output "Help" for the active window
1                  Execute the "Close current window" function
2                  Run the "electronic dictionary" program
3                  Output the "Phonebook search menu screen"
...                ...

Here, the function settings for the menu screens or commands according to the number of edges pre-stored in the storage unit 403 may be configured using a user interface or graphical user interface (not shown) that is provided when an "edge input setting" button (not shown) on a user menu setting screen (not shown) is clicked. In addition, the functions according to the number of edges may be set for menu screens or commands that the user prefers (or frequently uses) (S120).
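
As a sketch of how the Table 1 mapping could be stored and dispatched, with the handler bodies as placeholders standing in for the real terminal functions (the no-match branch mirrors the behavior described earlier, where the user is informed that no function is pre-stored):

    # Hypothetical handlers standing in for the terminal functions of Table 1.
    def show_help():             print("Help for the active window")
    def close_current_window():  print("Closing the current window")
    def run_dictionary():        print("Launching the electronic dictionary")
    def show_phonebook_search(): print("Phonebook search menu screen")

    # Number of edges -> pre-stored function, mirroring Table 1.
    EDGE_FUNCTIONS = {
        0: show_help,
        1: close_current_window,
        2: run_dictionary,
        3: show_phonebook_search,
    }

    def execute_for_edges(edge_count):
        func = EDGE_FUNCTIONS.get(edge_count)
        if func is None:
            print("No function stored for", edge_count, "edges")
        else:
            func()

    execute_for_edges(3)  # Phonebook search menu screen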

FIG. 6 is a flowchart illustrating a command processing method according to a second embodiment of the present invention.

First, the controller 406 checks (detects) the number of points input by the user through the display unit 404. Here, the display unit 404 includes a touch screen that detects touch operations through a touch sensor such as a touch film, a touch sheet, or a touch pad. Alternatively, the number of points input by the user may be checked through an input unit 401 that includes a touch screen, instead of the display unit 404.

In addition, the number of input points may be determined as the number of inputs made within a preset time (for example, 3 seconds) (S210).
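
A sketch of the preset-time rule: taps are grouped into a single input while they fall inside the window. The 3-second figure comes from the text above; representing taps as timestamps in seconds is an assumption.

    def count_points(tap_times, window=3.0):
        """Count taps within `window` seconds of the first tap.

        tap_times: ascending timestamps (seconds) of touch-down events.
        """
        if not tap_times:
            return 0
        start = tap_times[0]
        return sum(1 for t in tap_times if t - start <= window)

    print(count_points([0.0, 0.4]))       # 2 -> e.g. dial the preset number
    print(count_points([0.0, 0.4, 3.6]))  # 2 -> the third tap falls outside the window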

Thereafter, the controller 406 performs a function previously stored in the storage unit 403 in response to the identified number of points.

That is, when the function pre-stored in the storage unit 403 corresponding to the identified number of points is a specific menu screen, the controller 406 controls the display unit 404 to display that menu screen.

In addition, when the function pre-stored in the storage unit 403 corresponding to the identified number of points is a specific command, the controller 406 executes that command, and may control the execution result of the command to be output through the display unit 404 and/or the voice output unit 405.

For example, with the menu screens or command functions according to the number of points pre-stored in the storage unit 403, when the identified number of points is 2, the controller 406 executes the function of "calling a preset home telephone number (for example, 02-1234-5678)" pre-stored in the storage unit 403 corresponding to the identified number of points (S220).

FIG. 7 is a flowchart illustrating a command processing method according to a third embodiment of the present invention.

First, the controller 406 checks (detects) the number of times a circle, an ellipse, or a polygon is input by the user through the display unit 404. Here, the display unit 404 includes a touch screen that detects touch operations through a touch sensor such as a touch film, a touch sheet, or a touch pad. Alternatively, the number of inputs of a circle, ellipse, or polygon may be checked through an input unit 401 that includes a touch screen, instead of the display unit 404.

In addition, the number of inputs of the circle, ellipse, or polygon may be determined as the number of inputs made within a preset time (for example, 5 seconds).

In addition, the input direction of the circle, ellipse, or polygon, that is, whether it is input in a clockwise or counterclockwise direction, may be checked together with the input itself.

For example, when the user inputs two circles in the clockwise direction on the display unit 404 using a finger, a stylus pen, a touch pen, or the like, the controller 406 checks the input direction and the number of inputs of the input circles (in this example, two inputs in the clockwise direction) (S310).
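
The input direction of a closed stroke can be determined, for example, from the sign of its enclosed area (the shoelace formula); this is one common technique, not necessarily the one the patent intends. The sketch assumes screen coordinates in which y grows downward, where a positive signed area corresponds to a clockwise trace.

    def winding_direction(points):
        """Return "clockwise" or "counterclockwise" for a closed stroke."""
        area2 = 0.0  # twice the signed area (shoelace formula)
        n = len(points)
        for i in range(n):
            x1, y1 = points[i]
            x2, y2 = points[(i + 1) % n]
            area2 += x1 * y2 - x2 * y1
        return "clockwise" if area2 > 0 else "counterclockwise"

    # A square traced top-left -> top-right -> bottom-right -> bottom-left.
    square = [(0, 0), (10, 0), (10, 10), (0, 10)]
    print(winding_direction(square))  # clockwise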

Thereafter, the controller 406 performs the function pre-stored in the storage unit 403 corresponding to the identified number of inputs and/or input direction of the circle, ellipse, or polygon.

That is, when the function pre-stored in the storage unit 403 corresponding to the identified number of inputs and/or input direction of the circle, ellipse, or polygon is a specific menu screen, the controller 406 controls the display unit 404 to display that menu screen.

In addition, when the function pre-stored in the storage unit 403 corresponding to the identified number of inputs and/or input direction of the circle, ellipse, or polygon is a specific command, the controller 406 executes that command, and may control the execution result to be output through the display unit 404 and/or the voice output unit 405.

As an example, when a circle is input twice in the clockwise direction, the controller 406 executes the "screen protection function setting" command pre-stored in the storage unit 403 corresponding to that direction and number of inputs (S320).

FIG. 8 is a flowchart illustrating a command processing method according to a fourth embodiment of the present invention.

First, the controller 406 checks the number of vibrations of the terminal in the up/down, left/right, or diagonal directions, detected through the sensor unit 402. The sensor unit 402 may be a motion recognition sensor. In addition to the number of vibrations, the vibration direction (up, down, left, right, diagonal, etc.) can be checked (detected) together.

In addition, the number of vibrations may be determined as the number of vibrations made within a preset time (for example, 3 seconds) (S410).
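
A sketch of counting vibrations from one axis of the motion recognition sensor within the preset window; the sample format, the threshold, and the hysteresis factor are all assumptions.

    def count_shakes(samples, threshold=12.0, window=3.0):
        """Count vibrations within `window` seconds of the first sample.

        samples: (timestamp_s, acceleration) pairs for one axis.
        threshold: |acceleration| treated as a shake peak (hypothetical).
        Each crossing of the threshold after re-arming counts once, so one
        strong swing in either direction is one vibration.
        """
        if not samples:
            return 0
        start = samples[0][0]
        shakes, armed = 0, True
        for t, a in samples:
            if t - start > window:
                break
            if armed and abs(a) >= threshold:
                shakes += 1
                armed = False              # wait for the peak to pass
            elif abs(a) < threshold * 0.5:
                armed = True               # re-arm with hysteresis
        return shakes

    data = [(0.0, 1.0), (0.1, 15.0), (0.2, 2.0), (0.3, -14.0), (0.4, 1.0)]
    print(count_shakes(data))  # 2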

Thereafter, the controller 406 performs the function pre-stored in the storage unit 403 corresponding to the identified number of vibrations.

That is, when the function pre-stored in the storage unit 403 corresponding to the identified number of vibrations is a specific menu screen, the controller 406 controls the display unit 404 to display that menu screen.

In addition, when the function pre-stored in the storage unit 403 corresponding to the identified number of vibrations is a specific command, the controller 406 executes that command, and may control the execution result of the command to be output through the display unit 404 and/or the voice output unit 405 (S420).

FIG. 9 is a flowchart illustrating a command processing method according to a fifth embodiment of the present invention.

First, the controller 406 detects the user's movement in an image input through the input unit 401, which includes a camera. Here, the user's movement may be the movement of the focused user's face (e.g., movement in a direction such as up, down, left, right, or diagonal), the movement of the focused pupils, the movement of the focused hand, the movement of the focused finger, or the movement of any body part of the user focused by the camera, and the method of detecting the user's movement may be implemented through various application programs.

In addition, the detected movement of the user may be confirmed as the movement detected within a preset time (S510).

Thereafter, the controller 406 checks the number of edges in the detected user movement.

For example, as shown in FIG. 10, the controller 406 checks the number of edges in the movement of the user's finger focused by the camera. In the case of FIG. 10, the movement of the user's finger contains two edges (701, 702), so the controller 406 confirms the number of edges according to the movement of the user's finger as "2" (S520).
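
Once the camera pipeline reduces each frame to a fingertip position, the same corner counter sketched for the first embodiment applies unchanged; here the tracking step is replaced by a pre-computed trajectory (count_edges() is the function from that earlier sketch, and the coordinates are illustrative).

    # An "N"-shaped finger motion in screen coordinates (y grows downward):
    # up stroke, diagonal down stroke, up stroke -> two edges, as in FIG. 10.
    n_motion = [(0, 100), (0, 0), (60, 100), (60, 0)]
    print(count_edges(n_motion))  # 2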

Thereafter, the controller 406 performs a function previously stored in the storage unit 403 in response to the identified number of edges.

That is, when the function pre-stored in the storage unit 403 corresponding to the identified number of edges is a specific menu screen, the controller 406 controls the display unit 404 to display that menu screen.

In addition, when the function pre-stored in the storage unit 403 corresponding to the identified number of edges is a specific command, the controller 406 executes that command, and may control the execution result of the command to be output through the display unit 404 and/or the voice output unit 405 (S530).

In this way, a command is input by an easy, convenient, and simple operation, and the pre-stored function corresponding to the input command can be performed.

As described above, the first to fifth embodiments have separately described the number of edges, the number of points, the input direction and/or number of circles, ellipses, or polygons, the number and/or direction of vibrations of the terminal, and the number of edges according to the user's movement confirmed through the camera; however, these may also be implemented in combination with one another.

That is, for example, when the number of edges and the number of points are combined and the user inputs "V ... N" continuously (or within a predetermined time) on the display unit 404, the control unit 406 interprets the continuously input "V ... N" as "V" = one edge, "..." = two points, and "N" = two edges, and may then perform a function previously stored in the storage unit 403 in correspondence with "one edge, two points, two edges", such as outputting a menu or executing a corresponding command. One way such combined matching could work is sketched below.
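As a hypothetical sketch of the combined case, recognized inputs can be reduced to (type, count) tokens and the token sequence used as the key for a pre-stored function; the token format and table contents are invented for illustration.

```python
def tokenize(strokes):
    """`strokes` is a list like [("edge", 1), ("point", 2), ("edge", 2)],
    one entry per recognized input; collapse it into a lookup key."""
    return ",".join(f"{kind}:{count}" for kind, count in strokes)

# Hypothetical table keyed by the combined edge/point sequence.
COMBINED = {
    # "V" (one edge) + "..." (two points) + "N" (two edges)
    "edge:1,point:2,edge:2": lambda: print("pre-stored menu displayed"),
}

def handle(strokes):
    action = COMBINED.get(tokenize(strokes))
    if action is not None:
        action()

handle([("edge", 1), ("point", 2), ("edge", 2)])
```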

It will be understood by those skilled in the art that various changes in form and details may be made herein without departing from the spirit and scope of the invention as defined by the appended claims. The embodiments disclosed herein are therefore intended to describe, not to limit, the technical idea of the present invention, and the scope of that technical idea is not restricted by these embodiments. The protection scope of the present invention should be interpreted according to the following claims, and all technical ideas within their equivalent scope should be interpreted as falling within the scope of the present invention.

FIG. 1 is a block diagram showing the configuration of a mobile terminal for explaining a command processing apparatus according to an embodiment of the present invention.

FIG. 2 is a block diagram showing the configuration of a telematics terminal for explaining a command processing apparatus according to an embodiment of the present invention.

FIG. 3 is a block diagram showing the configuration of a command processing apparatus according to an embodiment of the present invention.

FIG. 4 is a flowchart illustrating a command processing method according to a first embodiment of the present invention.

FIGS. 5A to 5C illustrate examples of edges according to an embodiment of the present invention.

FIG. 6 is a flowchart illustrating a command processing method according to a second embodiment of the present invention.

FIG. 7 is a flowchart illustrating a command processing method according to a third embodiment of the present invention.

FIG. 8 is a flowchart illustrating a command processing method according to a fourth embodiment of the present invention.

FIG. 9 is a flowchart illustrating a command processing method according to a fifth embodiment of the present invention.

FIG. 10 is a view showing an example of the number of edges according to the movement of a user's finger, according to an embodiment of the present invention.

Claims (11)

1. A command processing method comprising: checking the number of edges or the number of points in input contents; and performing a pre-stored function corresponding to the identified number of edges or points.

2. The method of claim 1, wherein the performing of the pre-stored function comprises outputting a pre-stored menu screen corresponding to the identified number of edges or points.

3. The method of claim 1, wherein the performing of the pre-stored function comprises executing a pre-stored command corresponding to the identified number of edges or points.

4. A command processing method comprising: checking the number of vibrations and the vibration direction of a terminal; and performing a pre-stored function corresponding to the identified number and direction of vibrations.

5. The method of claim 4, wherein the performing of the pre-stored function comprises outputting a pre-stored menu screen or executing a pre-stored command corresponding to the identified number of vibrations and vibration direction.

6. A command processing method comprising: detecting movement of a user input through a camera; checking the number of edges according to the detected movement of the user; and performing a pre-stored function corresponding to the identified number of edges.

7. The method of claim 6, wherein the movement of the user is any one of movement of the user's face, movement of the eyes, movement of the hand, and movement of the finger.

8. The method of claim 6, wherein the performing of the pre-stored function comprises outputting a pre-stored menu screen or executing a pre-stored command corresponding to the identified number of edges.

9. A command processing apparatus comprising: a display unit which receives contents by a user's operation; and a controller configured to check the number of edges in the input contents and to perform a pre-stored function corresponding to the identified number of edges.

10. The apparatus of claim 9, wherein the controller outputs a menu screen previously stored in a storage unit in correspondence with the identified number of edges.

11. The apparatus of claim 9, wherein the controller executes a command previously stored in a storage unit in correspondence with the identified number of edges, and displays the execution result of the command on the display unit.

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020090026048A KR20100107787A (en) 2009-03-26 2009-03-26 Apparatus for processing command and method thereof

Publications (1)

Publication Number Publication Date
KR20100107787A true KR20100107787A (en) 2010-10-06

Family

Family ID: 43129442

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020090026048A KR20100107787A (en) 2009-03-26 2009-03-26 Apparatus for processing command and method thereof

Country Status (1)

Country Link
KR (1) KR20100107787A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101853486B1 (en) 2017-09-15 2018-06-14 (주)휴맥스옵틱 A hinge unit for leg of glasses

Legal Events

Date Code Title Description
A201 Request for examination
AMND Amendment
E601 Decision to refuse application
AMND Amendment
J201 Request for trial against refusal decision
E801 Decision on dismissal of amendment
B601 Maintenance of original decision after re-examination before a trial
J121 Written withdrawal of request for trial