KR20140038681A - Input device and controlling method thereof - Google Patents

Input device and controlling method thereof

Info

Publication number
KR20140038681A
Authority
KR
South Korea
Prior art keywords
input device
scan
unit
image
rendering
Prior art date
Application number
KR1020120105073A
Other languages
Korean (ko)
Inventor
김민우
이정용
Original Assignee
LG Electronics Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by LG Electronics Inc.
Priority to KR1020120105073A priority Critical patent/KR20140038681A/en
Priority to US13/829,267 priority patent/US9113053B2/en
Priority to JP2013054261A priority patent/JP5827259B2/en
Priority to EP13001342.8A priority patent/EP2696566A1/en
Publication of KR20140038681A publication Critical patent/KR20140038681A/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/02Input arrangements using manually operated switches, e.g. using keyboards or dials
    • G06F3/0202Constructional details or processes of manufacture of the input device
    • G06F3/021Arrangements integrating additional peripherals in a keyboard, e.g. card or barcode reader, optical scanner
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/003D [Three Dimensional] image rendering
    • G06T15/005General purpose rendering architectures
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/00795Reading arrangements
    • H04N1/00798Circuits or arrangements for the control thereof, e.g. using a programmed control device or according to a measured quantity
    • H04N1/00822Selecting or setting a particular reading mode, e.g. from amongst a plurality of modes, simplex or duplex, or high or low resolution
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/25Management operations performed by the server for facilitating the content distribution or administrating data related to end-users or client devices, e.g. end-user or client device authentication, learning user preferences for recommending movies
    • H04N21/258Client or end-user data management, e.g. managing client capabilities, user preferences or demographics, processing of multiple end-users preferences to derive collaborative data
    • H04N21/25808Management of client data
    • H04N21/25825Management of client data involving client display capabilities, e.g. screen resolution of a mobile phone
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/0304Detection arrangements using opto-electronic means

Abstract

An input device with a scan feature according to an embodiment of the present invention includes: a unit image acquisition unit which obtains unit images of a scan object; a specification verification unit which verifies the specification of the terminal performing communication with the input device; a rendering selection unit which selects either the hardware rendering or the software rendering according to the verified specification; and an object image acquisition unit which obtains an image of the scan object by merging the obtained unit images according to the selected rendering. [Reference numerals] (S4101) Mouse mode; (S4103) Receive a scan start request signal?; (S4105) Operate in a scan mode; (S4107) Detect the coordinates of a frame; (S4109) Store the detected coordinate; (S4111) Obtain a unit image of the frame based on the detected coordinate; (S4113) Check hardware specifications; (S4115) Over standard specifications?; (S4117) Select hardware rendering; (S4119) Obtain an object image through hardware rendering; (S4121) Select software rendering; (S4123) Obtain an object image through software rendering; (S4125) Check a hardware use status; (S4127) Over a standard use state?; (S4131) Check the use state of a central processing unit; (S4133) Use rate of the central processing unit is over a standard use rate?

Description

Input device and control method thereof {INPUT DEVICE AND CONTROLLING METHOD THEREOF}

The present invention relates to an input device and a method of controlling the same, and more particularly, to an input device capable of obtaining an image of a scan target object by selectively applying a rendering method according to the specification of a terminal connected to the input device, the use state of the input device, and the use state of the terminal, and to a control method thereof.

In general, a multifunction peripheral or the like having a scan function reads document data from a page of a document to be scanned, and prints the read document data or transmits it to the outside using a communication device such as a modem. A multifunction peripheral having a conventional scan function is therefore often unable to scan a document or the like larger than a specific size. In addition, with a conventional multifunction peripheral, the scan target must be brought to the fixedly installed device even to scan a simple image such as a business card or a photograph.

On the other hand, with the development of digital technology, input devices such as a mouse have various additional functions in addition to the original functions, and consumers can meet various needs with a single device by utilizing additional functions.

However, the conventional input device equipped with a scan function has a problem in that the image processing speed through the graphics card of the terminal connected to the input device limits the scan processing speed, the scannable range, and the like.

An object of the present invention is to provide a method of stably obtaining an image of a scan target object by selecting a rendering method according to a specification of a terminal connected to an input device.

Another object of the present invention is to provide a method of improving the image merging processing speed for a scan target object and expanding the scannable range by applying a rendering method in consideration of the use state of an input device and the use state of a terminal connected to the input device.

An input device having a scan function according to an embodiment of the present invention includes a unit image acquisition unit for acquiring unit images of a scan target object, a specification confirmation unit for confirming a specification of a terminal communicating with the input device, a rendering selection unit for selecting either hardware rendering or software rendering according to the confirmed specification, and an object image acquisition unit for obtaining an image of the scan target object by merging the acquired unit images according to the selected rendering.

According to another aspect of the present invention, there is provided a method of controlling an input device having a scan function, including acquiring unit images of a scan target object, confirming a specification of a terminal communicating with the input device, selecting either hardware rendering or software rendering according to the confirmed specification, and obtaining an image of the scan target object by merging the acquired unit images according to the selected rendering.
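
A minimal sketch of this rendering selection, reducing the checks named in the abstract (S4113-S4133) to boolean summaries; the patent does not fix concrete thresholds or branching details here, so all names and the fallback condition are illustrative assumptions.

```python
# Hypothetical sketch: choose the rendering path that merges the unit images.
def select_rendering(meets_reference_spec: bool,
                     gpu_overused: bool,
                     cpu_overused: bool) -> str:
    if not meets_reference_spec:
        return "software"          # S4121: terminal below the reference spec
    if gpu_overused and not cpu_overused:
        return "software"          # assumed fallback when the GPU is busy
    return "hardware"              # S4117: GPU capable and available

print(select_rendering(True, False, False))   # -> "hardware"
print(select_rendering(False, False, False))  # -> "software"
```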

The control method of the input device may be implemented as a program to be executed on a computer and recorded on a computer-readable recording medium.

According to an embodiment of the present disclosure, an image of a scan target object may be stably obtained by selecting a rendering method according to a specification of a terminal connected to an input device.

The present invention can improve the image merging processing speed for the scan target object and expand the scannable range by applying a rendering method in consideration of the use state of the input device and the use state of the terminal connected to the input device.

Meanwhile, various other effects will be disclosed directly or implicitly in the detailed description of the embodiments of the present invention below.

FIG. 1 is a view showing a scan image forming system according to an embodiment of the present invention.
FIG. 2 is a plan view and a rear view of an input device having a scan function according to an exemplary embodiment of the present invention.
FIG. 3 is a flowchart illustrating a method of displaying a scanned image according to a first embodiment of the present invention.
FIG. 4 is a flowchart illustrating a method of displaying a scanned image according to a second embodiment of the present invention.
FIG. 5 is a diagram illustrating a screen providing a scan UI window in which a scan image is displayed according to the first embodiment of the present invention.
FIG. 6 is a diagram illustrating a screen that provides a scan UI window for acquiring a scan image and displaying a scan image and a memory guide bar according to the second embodiment of the present invention.
FIG. 7 is a flowchart illustrating a method of displaying a scanned image according to a third embodiment of the present invention.
FIGS. 8 to 12 are diagrams illustrating screens providing a scan UI window displaying a preview scan area for acquiring a scanned image according to the third exemplary embodiment of the present invention.
FIG. 13 is a flowchart illustrating a method of displaying a scanned image or an edited image according to a fourth embodiment of the present invention.
FIG. 14 is a diagram illustrating the state of an input device when a lift-off signal is generated according to the fourth embodiment of the present invention.
FIG. 15 is a diagram illustrating a screen providing a scan UI window in which a scan image is displayed according to the fourth embodiment of the present invention.
FIG. 16 is a diagram illustrating a screen providing an edit UI window in which an edit image is displayed according to the fourth embodiment of the present invention.
FIG. 17 is a flowchart illustrating a method of displaying a scanned image according to a state change signal of an input device according to a fifth embodiment of the present invention.
FIG. 18 is a diagram illustrating a screen providing a scan UI window displaying a scan image according to a state change signal of an input device according to the fifth embodiment of the present invention.
FIG. 19 is a diagram illustrating a screen providing an edit UI window in which an edit image is displayed according to a state change signal of an input device according to the fifth embodiment of the present invention.
FIG. 20 is a flowchart illustrating a method of editing a scanned image according to a sixth embodiment of the present invention.
FIGS. 21 to 24 illustrate screens providing an editing UI window for editing a scan image according to the sixth exemplary embodiment of the present invention.
FIG. 25 is a block diagram of an input device according to another embodiment of the present invention.
FIG. 26 is a flowchart illustrating a method of transmitting information of an input device according to an embodiment of the present invention.
FIG. 27 is a diagram for describing a process of calculating coordinates of a frame.
FIG. 28 is a diagram illustrating changes in a scanned image before and after applying an image correction method according to an exemplary embodiment.
FIG. 29 is a diagram for describing a situation in which a lift-off signal is detected during scanning.
FIG. 30 is a block diagram of another input device of the present invention.
FIG. 31 is a flowchart of a method of obtaining a scanned image, according to an exemplary embodiment.
FIG. 32 is a diagram illustrating the principle of a gyro sensor.
FIG. 33 is a diagram illustrating a process of obtaining an image of a scan target object by using an input device according to another embodiment of the present invention.
FIG. 34 is a flowchart illustrating a method of adjusting a frame rate of an input device according to an embodiment of the present invention.
FIG. 35 is a block diagram of an input device according to another embodiment of the present invention.
FIG. 36 is a flowchart illustrating a control method of an input device according to an embodiment of the present invention.
FIG. 37 is a diagram for describing a process of measuring a moving speed of an input device.
FIG. 38 is a diagram for describing algorithm code for implementing a control method of an input device according to an embodiment of the present invention.
FIG. 39 is a flowchart illustrating a control method of an input device according to another embodiment of the present invention.
FIG. 40 is a block diagram of an input device according to another embodiment of the present invention.
FIG. 41 is a block diagram of a terminal that can be connected to an input device according to another embodiment of the present invention.
FIG. 42 is a flowchart illustrating a control method of an input device according to another embodiment of the present invention.
FIG. 43 is a diagram for explaining a reference specification of a graphic processing unit.
FIG. 44 is a diagram for describing selection of a rendering method according to a utilization rate of a graphic processing unit and a utilization rate of a central processing unit in a control method of an input device according to another embodiment of the present invention.

Hereinafter, preferred embodiments of the present invention will be described in detail with reference to the accompanying drawings so that they may be easily understood by those skilled in the art.

FIG. 1 is a view showing a scan image forming system according to an embodiment of the present invention.

Referring to FIG. 1, a scan image forming system according to an exemplary embodiment of the present invention includes a display device 100 and an input device 200.

The display device 100 may be a computer, a digital TV, a mobile phone, etc., and may be a device having a display unit.

The display device 100 includes a control unit 110, a display unit 120, a storage unit 130, a scan UI generation unit 140, and a communication interface unit 150.

The control unit 110 can control the overall operation of the display device 100. For example, it controls the communication interface unit 150 to receive various input signals and various data transmitted from the outside, processes the received input signals and data, and controls the processed signals or data to be displayed on the display unit 120 or stored in the storage unit 130.

The display 120 may generate driving signals by converting various image signals, data signals, OSD signals, etc. processed by the controller 110 into R, G, and B signals, respectively.

For this purpose, the display unit 120 may use a PDP, an LCD, an OLED, a flexible display, a three-dimensional display (3D display), or the like, and may also be configured as a touch screen so as to serve as an input device in addition to an output device.

Meanwhile, according to the embodiment of the present invention, the display unit 120 may display a scan UI window for displaying a scan image transmitted from the input device 200, which will be described later.

The storage unit 130 may store a program for each signal processing and control in the controller 110, and may store signal-processed video, audio, or data signals.

Also, the storage unit 130 may perform a function of temporarily storing the video, audio, or data signals input from the communication interface unit 150.

According to an embodiment of the present invention, the storage unit 130 may store a scan driver program for controlling the display device 100 to perform a scan operation.

The storage unit 130 may include at least one type of storage medium among a flash memory type, a hard disk type, a multimedia card micro type, a card type memory (for example, SD or XD memory), RAM, and ROM (for example, EEPROM).

The scan UI generation unit 140 generates a scan UI window and an edit UI window for displaying the execution state of the scan driver program on the screen. The generated scan UI window and edit UI window are displayed on the display unit 120, and the user controls the scan operation of the input device 200 through the scan UI window. Various scan control commands are generated by operating the function setting buttons provided in the scan UI window, and various edit control commands are generated by operating the function setting buttons provided in the edit UI window.

The communication interface unit 150 can perform wired / wireless communication with an external device, and can receive various input signals, video, audio, or data signals from an external device.

When the communication interface unit 150 performs wired communication with an external device, it may use serial, PS/2, USB, or the like; when performing wireless communication with an external device, it may use Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), ZigBee, Digital Living Network Alliance (DLNA), or the like.

The input device 200 may be a variety of input devices having a scanner function, such as a mouse, a keyboard, a remote control, and the like. In addition, if a scanner function is provided, it may be a mobile terminal such as a mobile phone, a PMP, or a PDA.

The input device 200 includes a control unit 210, a scanning unit 220, a position detection unit 230, a scan function input unit 240, a communication interface unit 250, and a storage unit 260.

The controller 210 may control the overall operation of the input device 200. For example, it controls the scan image of the scan target object obtained from the scanning unit 220 and the position data obtained from the position detection unit 230 to be transmitted to an external device through the communication interface 250, and controls the scan image and the position data to be stored in the storage unit 260.

Further, it may control signals related to the various functions input by the user through the function input unit 240 to be transmitted to an external device.

The scanning unit 220 irradiates light to a certain area of the object to be scanned at the same time and detects light reflected therefrom to acquire a scan image corresponding to a certain area of the object to be scanned.

The object to be scanned is a target object containing the information that the user wants to input to and store in the input device 200, and generally refers to a document on which characters, pictures, and the like are displayed. The predetermined area of the object to be scanned is a two-dimensional area having a certain extent, unlike a conventional line scan area.

That is, the scanning unit 220 irradiates light onto the object to be scanned over a two-dimensional area having a constant extent at once. Some of the irradiated light is reflected from the object to be scanned and input back to the scanning unit 220, which detects the reflected light and generates scan information. For example, digital scan information may be generated in which a portion where reflected light is detected is set to 1 and a portion where reflected light is not detected is set to 0.

Since the scan information includes information on the light-irradiated portion, that is, on a two-dimensional region having a constant extent, a scan image corresponding to a certain region of the scan target object can be acquired when part or all of the scan information is imaged.
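
A minimal sketch of the digital scan information described above: portions where reflected light is detected become 1, the rest 0. The raw intensity values and the detection threshold are illustrative assumptions, not values from the patent.

```python
REFLECTION_THRESHOLD = 128  # assumed detector threshold

def to_scan_info(intensities):
    """Map raw reflected-light intensities to 1/0 scan information."""
    return [[1 if v >= REFLECTION_THRESHOLD else 0 for v in row]
            for row in intensities]

area = [[200, 30], [140, 90]]   # reflected-light intensities over a 2x2 region
print(to_scan_info(area))       # [[1, 0], [1, 0]]
```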

The position detecting unit 230 detects X- and Y-axis position shifts as the input device 200 moves. To describe the detection of position movement information and position data in more detail: the X and Y coordinates of a specific point are obtained and stored in the storage unit 260 as reference position data. Thereafter, whenever the input device 200 moves, the X and Y coordinates of a new point are obtained and compared with the reference position data stored in the storage unit 260; by repeating this step, the positional movement of the input device 200 is detected.

The detected information about the movement of the input device 200 may be transmitted to the display device 100 by matching with the scan image obtained through the scanning unit 220 described above.
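
A minimal sketch of the position-detection loop described above: a reference coordinate is kept in storage, each newly sampled coordinate is compared against it, and the X/Y shift is reported. The class and method names are illustrative, not taken from the patent.

```python
class PositionDetector:
    def __init__(self, x: int, y: int):
        self.reference = (x, y)         # reference position data in storage

    def update(self, x: int, y: int):
        """Compare the new point with the reference and return the shift."""
        dx = x - self.reference[0]
        dy = y - self.reference[1]
        self.reference = (x, y)         # the new point becomes the reference
        return dx, dy

detector = PositionDetector(0, 0)
print(detector.update(3, 4))            # (3, 4): shift since the last sample
```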

The function input unit 240 includes a scanner function selection button, a wheel button, and left and right buttons disposed around the wheel button.

When the user presses the scanner function selection button, the input device 200 generates a scan start request signal for entering the scan mode, or an edit request signal for switching from the scan mode to the edit mode.

In the case of the wheel button and the left / right button, a signal corresponding to the function assigned to each of the scan mode and the edit mode is generated.

The communication interface unit 250 can perform wired/wireless communication with an external device, and can transmit or receive various input signals, video, audio, or data signals to or from an external device.

When the communication interface unit 250 performs wired communication with an external device, it may use serial, PS/2, USB, or the like; when performing wireless communication with an external device, it may use Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), ZigBee, Digital Living Network Alliance (DLNA), or the like.

The storage unit 260 may store the scan image obtained from the scanning unit 220 and the position data and position information obtained from the position detector 230.

The storage unit 260 may include at least one type of storage medium among a flash memory type, a hard disk type, a multimedia card micro type, a card type memory (for example, SD or XD memory), RAM, and ROM (for example, EEPROM).

Although not shown in FIG. 1, the input device 200 may further include a display unit (not shown). In this case, the display unit (not shown) may use a PDP, an LCD, an OLED, a flexible display, a 3D display, or the like, and may also be configured as a touch screen so as to serve as an input device in addition to an output device.

In addition, the display unit (not shown) may display a scan image obtained from the scanning unit 220.

FIG. 2 is a plan view and a rear view of an input device having a scan function according to an exemplary embodiment of the present invention.

FIG. 2(a) is a plan view of a mouse, which is one example of the input device 200. The function input unit 240, which receives inputs for a plurality of functions, is disposed on the front surface of the mouse 200 and includes a scanner function selection button 241, a wheel button 242, and left and right buttons 243 and 244 around the wheel button.

Referring to FIG. 2(b), the mouse 200 includes an actual scan area 221 for acquiring a scan image from the object to be scanned. The scanning unit 220 may irradiate light onto the scan target object through the actual scan area 221, and some of the irradiated light is reflected from the scan target object and input back to the scanning unit 220.

Hereinafter, a method of displaying scanned images according to the first and second embodiments of the present invention will be described with reference to FIGS. 3 to 6. FIG. 3 is a flowchart illustrating a method of displaying a scanned image according to a first embodiment of the present invention. FIG. 4 is a flowchart illustrating a method of displaying a scanned image according to a second embodiment of the present invention. FIG. 5 is a diagram illustrating a screen providing a scan UI window in which a scan image is displayed according to the first embodiment of the present invention. FIG. 6 is a diagram illustrating a screen that provides a scan UI window for acquiring a scan image and displaying a scan image and a memory guide bar according to the second embodiment of the present invention.

Referring to FIG. 3, first, the control unit 110 of the display apparatus 100 executes a scanner application (S310). The scanner application execution program may be stored in the storage unit 130, and a scan UI window displaying a scan image may be displayed according to the execution of the scanner application.

Next, through input of the scanner function selection button 241 of the function input unit 240, the input device 200 generates a scan start request signal and transmits it to the display device 100. The transmitted scan start request signal is received through the communication interface 150 of the display device 100 (S320).

In response to the scan start request signal, the input device 200 enters a scan mode and starts scanning to acquire a scan image from a scan target object (S330).

In the above, the scan UI window is displayed as the scanner application is executed in step S310. However, step S310 may be omitted, and the scan UI window may instead be displayed upon reception of the scan start request signal in step S320.

When scanning is started as the scan mode is entered, the scan image 510 acquired and transmitted from the input device 200 is displayed on the scan UI window 500, as illustrated in FIG. 5. In addition, a scan box 520 corresponding to the actual scan area 221 located on the back of the input device 200, shown in FIG. 2(b), is displayed on the scan UI window 500 (S340).

The scan image 510 is an image obtained by the scanning unit 220 while the input device 200 moves; it is matched with the position data acquired through the position detection unit 230 of the input device 200 and displayed on the scan UI window 500.

In the scan UI window 500, a memory guide bar 530 displaying the remaining amount of the storage unit 260 of the input device 200 is displayed. The memory guide bar 530 may display the remaining amount of memory, that is, the total capacity of the input device 200 minus the capacity of the scan images. As the number of scan images acquired through the scanning unit 220 of the input device 200 increases, the displayed remaining amount of the storage unit 260 decreases.

Meanwhile, the control unit 110 measures the moving speed of the input device 200 from the position movement information transmitted from the input device 200 (S350), and compares the measured moving speed with a threshold speed previously stored in the storage unit 130 (S360). Here, the moving speed means the moving speed of the input device 200, and may equally be regarded as the moving speed of the scan box 520 that moves in correspondence with the input device 200.

As a result of the comparison, when the moving speed is smaller than the threshold speed, the scan operation is continued without changing the scan box 520.

Meanwhile, as a result of the comparison, when the moving speed is greater than the threshold speed, the display of the scan box 520 is changed (S370). Changing the display of the scan box 520 may be performed by changing the color of the outline of the scan box 520 or by changing the shape of the outline of the scan box 520.

For example, if the outline of the scan box 520 displayed in the scan UI window 500 is green when the moving speed is less than or equal to the threshold speed, the color of the outline of the scan box 520 displayed on the scan UI window 500 may be changed from green to red when the moving speed becomes greater than or equal to the threshold speed.

This is to prevent the user from moving the input device 200 so fast that the scan image cannot be properly obtained from the scan target object. When the user notices that the color of the outline of the scan box 520 has changed, the user lowers the moving speed of the input device 200 to the threshold speed or less so that the scan image can be properly obtained from the object to be scanned.

Meanwhile, in addition to changing the color or shape of the outline of the scan box 520, a predetermined warning message (not shown) may be displayed on the scan UI window 500, or a predetermined sound may be output, so that the user is notified when the moving speed of the input device 200 is greater than or equal to the threshold speed.
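
A minimal sketch of steps S350-S370: the moving speed is derived from two successive positions, compared with a stored threshold speed, and the scan box outline turns from green to red when the threshold is exceeded. The threshold value is an illustrative assumption.

```python
import math

THRESHOLD_SPEED = 50.0   # assumed threshold, in position units per second

def scan_box_color(prev_pos, cur_pos, dt):
    dx = cur_pos[0] - prev_pos[0]
    dy = cur_pos[1] - prev_pos[1]
    speed = math.hypot(dx, dy) / dt                         # S350: measure speed
    return "red" if speed > THRESHOLD_SPEED else "green"    # S360/S370

print(scan_box_color((0, 0), (30, 40), 0.5))  # speed 100 > 50 -> "red"
```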

Meanwhile, the memory guide bar 530 displayed on the scan UI window 500 will be described in more detail with reference to FIGS. 4 and 6.

Referring to FIG. 4, after the scan mode is started (S410), as shown in FIG. 6(a), the input device 200 acquires a first scan image while moving from a first position (X1, Y1) on the scan target object 10 to a second position (X2, Y2) along a first path (Path A) (S420). The input device 200 then acquires a second scan image while moving from the second position (X2, Y2) on the scan target object 10 to a third position (X3, Y3) along a second path (Path B) (S430).

It is determined whether the first scan image acquired while moving along the first path A and the second scan image acquired while moving along the second path B have an overlapping region (S440). If an overlapping region exists, the capacity of the entire scan image is calculated as the sum of the capacity of the first scan image and the capacity of the second scan image minus the capacity of the overlapping region. As shown in FIG. 6(b), the remaining amount of the storage unit 260 of the input device 200 can then be displayed by showing the calculated capacity of the entire scan image relative to the total capacity of the input device 200 on the memory guide bar 530.
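
A minimal sketch of the capacity bookkeeping described above: the total scan-image capacity is the sum of the two images minus their overlapping region, and the memory guide bar shows what remains of the device capacity. The byte counts are illustrative.

```python
def remaining_capacity(total, first_image, second_image, overlap):
    used = first_image + second_image - overlap   # S440: subtract the overlap
    return total - used

# 1 MB of device storage, two 300 KB scans sharing a 100 KB overlap:
print(remaining_capacity(1_000_000, 300_000, 300_000, 100_000))  # 500000
```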

Hereinafter, a method of displaying a scanned image according to a third embodiment of the present invention will be described with reference to FIGS. 7 to 12. FIG. 7 is a flowchart illustrating the method of displaying a scanned image according to the third embodiment of the present invention. FIGS. 8 to 12 are diagrams illustrating screens providing a scan UI window displaying a preview scan area for acquiring a scanned image according to the third exemplary embodiment of the present invention.

First, referring to FIG. 7, the control unit 110 of the display device 100 executes a scanner application (S710). The scanner application execution program may be stored in the storage unit 130, and a scan UI window is displayed according to the execution of the scanner application (S720).

Next, the controller 110 receives a scan area designation signal for designating a scan area for the scan target object from the input device 200 (S730). The preview scan area corresponding to the received scan area designation signal is displayed in the scan UI window (S740).

The scan area designation signal may be generated through click signals for a plurality of points of the scan target object. For example, when the input device 200 generates a signal by clicking one of the left and right buttons of the function input unit 240 at each of four points 540a to 540d of the scan target object, the four points 540a to 540d are displayed on the scan UI window 500, as shown in FIG. 8(a). When the reception of the scan area designation signal is completed, a preview scan area 541 connecting the four points 540a to 540d corresponding to the scan area designation signal is displayed on the scan UI window 500, as shown in FIG. 8(b).

The scan area designation signal may also be generated through a click signal and a drag signal for a specific point of the scan target object. For example, when the input device 200 generates a signal by clicking one of the left and right buttons of the function input unit 240 at a specific point 550a of the object to be scanned and a drag operation from the specific point 550a in a specific direction is then received, the specific point 550a and a drag operation indicator 550b extending from it are displayed on the scan UI window 500, as shown in FIG. 9(a). When the reception of the scan area designation signal is completed, a preview scan area 551 corresponding to the drag operation indicator 550b is displayed on the scan UI window 500, as shown in FIG. 9(b).

As shown in FIG. 10, the preview scan area 551 and the scan box 520 may be displayed on one side of the scan UI window 500 in a scan guide map 560 reduced at a predetermined ratio. A reduced scan box 521 may be displayed in the scan guide map 560 to identify the relative position of the scan box 520 in the preview scan area 551. Accordingly, the user can easily determine where the scan box 520 is located in the preview scan area 551. In particular, when the preview scan area 551 is larger than the scan UI window 500, the scan guide map 560 makes it easier to identify the relative position of the scan box 520 in the preview scan area 551.

Meanwhile, a scan start request signal is received from the input device 200, and a scan operation for obtaining a scan image from the scan target object is performed according to the scan start request signal (S760).

The controller 110 of the display device 100 determines, based on the position data transmitted from the input device 200, whether the scan box 520 corresponding to the position of the input device 200 is located in the preview scan area 551 (S770).

As a result of the determination in step S770, when the scan box 520 is located in the preview scan area 551, a scan image of the scan target object corresponding to the preview scan area 551 is obtained, and the acquired scan image is displayed on the scan UI window 500 (S780).

On the other hand, if the scan box 520 is not located in the preview scan area 551 as a result of the determination in step S770, a notification message indicating that the scan box 520 is outside the preview scan area 551 is output to the user (S790). For example, as illustrated in FIG. 11, when the position 520b of the scan box moves to an area other than the preview scan area 551, a notification message is output to the user. The notification message may be output in various ways: the display of the scan box may be changed, a warning message may be displayed, a specific sound may be output, or a signal causing a specific vibration may be transmitted to the input device 200 so that the input device 200 vibrates.

As shown in FIG. 12, when a portion of the scan box 520b is located outside the preview scan area 551, the scan image of the part of the scan target object corresponding to that portion of the scan box 520b may not be acquired.

That is, a scan operation may not be performed on a portion of the scan box 520b that is outside the preview scan area 551.
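
A minimal sketch of the check in step S770 and FIGS. 11-12: the scan box is tested against the preview scan area; fully inside means scan, fully outside triggers the notification, and a partial overlap scans only the overlapping portion. Encoding rectangles as (x, y, width, height) tuples is an assumption for illustration.

```python
def box_status(box, area):
    bx, by, bw, bh = box
    ax, ay, aw, ah = area
    inside = ax <= bx and ay <= by and bx + bw <= ax + aw and by + bh <= ay + ah
    disjoint = bx + bw <= ax or ax + aw <= bx or by + bh <= ay or ay + ah <= by
    if inside:
        return "scan"            # S780: acquire and display the scan image
    if disjoint:
        return "notify"          # S790: scan box left the preview area
    return "scan-partial"        # FIG. 12: skip the part outside the area

print(box_status((10, 10, 5, 5), (0, 0, 100, 100)))   # "scan"
print(box_status((98, 98, 5, 5), (0, 0, 100, 100)))   # "scan-partial"
```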

Hereinafter, a method of displaying a scanned image or an edited image according to a fourth embodiment of the present invention will be described with reference to FIGS. 13 to 16. FIG. 13 is a flowchart illustrating the method of displaying a scanned image or an edited image according to the fourth embodiment of the present invention. FIG. 14 is a diagram illustrating the state of an input device when a lift-off signal is generated according to the fourth embodiment of the present invention. FIG. 15 is a diagram illustrating a screen providing a scan UI window in which a scan image is displayed according to the fourth embodiment of the present invention. FIG. 16 is a diagram illustrating a screen providing an edit UI window in which an edit image is displayed according to the fourth embodiment of the present invention.

First, referring to FIG. 13, the control unit 110 of the display device 100 executes a scanner application (S1310). The scanner application execution program may be stored in the storage unit 130, and a scan UI window is displayed according to the execution of the scanner application (S1320). The transmitted scan start request signal is received through the communication interface unit 150 of the display device 100, the scan mode is entered according to the scan start request signal, and scanning is started to obtain a scan image from the object to be scanned (S1330).

In the scan mode, when the input device 200 is spaced apart from the scan target object by a predetermined distance or more, the controller 210 of the input device 200 generates a predetermined lift-off signal, and the generated lift-off signal is transmitted to the display device 100, which receives it (S1340). As illustrated in FIG. 14(a), the input device 200 performs a scan operation on the scan target object 10; as illustrated in FIG. 14(b), when the input device 200 is spaced apart from the scan target object 10 by the predetermined distance, the lift-off signal is generated. For example, the lift-off signal may be generated when the distance is 1 mm or more.

The controller 110 of the display device 100 compares the duration of the received lift-off signal with a previously stored threshold time (S1350). If, as a result of the comparison, the duration of the lift-off signal is smaller than the threshold time, it is determined that the scan mode continues, and the scan UI window 500 displayed while the scan operation is performed is shown, as illustrated in FIG. 15 (S1380). Thereafter, the scan image 510 and the scan box 520 obtained as the scan operation is performed are displayed on the scan UI window 500 (S1390).

Meanwhile, if, as a result of the comparison, the duration of the lift-off signal is greater than the threshold time, an edit UI window 600 including an edit image 610 for editing the scan image and an edit setting window 620 including a plurality of functions for editing the edit image is displayed, as shown in FIG. 16.

Here, the edited image refers to a scan image obtained from the scan target object and finally determined for editing.

When the edit UI window 600 is displayed, the mode is switched from the scan mode to the edit mode, and the edit image 610 displayed on the edit UI window 600 is edited using any one of the plurality of functions displayed on the edit setting window 620 (S1370).
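
A minimal sketch of steps S1350-S1370: the duration of the lift-off signal is compared with a stored threshold time; a short lift keeps the scan mode, a long lift switches to the edit mode. The threshold value is an illustrative assumption.

```python
THRESHOLD_TIME = 1.0   # assumed threshold, in seconds

def mode_after_lift_off(lift_duration: float) -> str:
    # S1350: compare the lift-off duration with the threshold time
    return "scan" if lift_duration < THRESHOLD_TIME else "edit"

print(mode_after_lift_off(0.3))   # "scan": brief lift, scanning continues
print(mode_after_lift_off(2.0))   # "edit": long lift, edit UI window shown
```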

Hereinafter, a method of displaying a scanned image according to a state change signal of an input device according to a fifth embodiment of the present invention will be described with reference to FIGS. 17 to 19. FIG. 17 is a flowchart illustrating the method of displaying a scanned image according to a state change signal of an input device according to the fifth embodiment of the present invention. FIG. 18 is a diagram illustrating a screen providing a scan UI window displaying a scan image according to a state change signal of an input device according to the fifth embodiment of the present invention. FIG. 19 is a diagram illustrating a screen providing an edit UI window in which an edit image is displayed according to a state change signal of an input device according to the fifth embodiment of the present invention.

First, the input device 200 generates a state change signal through a button input of the function input unit 240 and transmits the state change signal to the display device 100, which receives it (S1710). Here, the state change signal may be a zoom-in/out signal through the wheel among the buttons of the function input unit 240, or a wheel input signal.

In operation S1720, the controller 110 of the display device 100 determines whether the time when the state change signal is input is a scan mode or an edit mode.

As a result of the determination, when the state change signal is input in the scan mode, that is, while the scan UI window 500 including the scan image 510 and the scan box 520 is displayed as illustrated in FIG. 18(a), the size or direction of the displayed scan image 510 and scan box 520 is adjusted and displayed according to the state change signal (S1730).

For example, when the state change signal is a zoom-in signal through the wheel of the input device, the sizes of the scan image 510 and the scan box 520 shown in FIG. 18(a) may be enlarged together at a predetermined ratio, as shown in FIG. 18(b).

In addition, when the received state change signal is a zoom-out signal through the wheel of the input device 200, the sizes of the scan image 510 and the scan box 520 may be reduced together at a predetermined ratio and displayed.

In addition, when the received state change signal is a wheel input signal through the wheel of the input device 200, the directions of the scan image 510 and the scan box 520 may be rotated together at a predetermined angle and displayed.

As a result of the determination, when the state change signal is input in the edit mode, that is, while the edit UI window 600 including the edit image 610 and the edit setting window 620 is displayed as shown in FIG. 19(a), the state of the displayed edit image 610 is adjusted and displayed according to the state change signal.

For example, when the state change signal is a zoom-in signal through the wheel of the input device, the size of the edit image 610 shown in FIG. 19(a) may be enlarged at a predetermined ratio, as shown in FIG. 19(b).

In addition, when the received state change signal is a zoom-out signal through the wheel of the input device 200, the size of the edit image 610 may be reduced at a predetermined ratio and displayed.

Meanwhile, the function of the zoom-in/out signal through the wheel of the input device 200 may be changed according to the user's settings. That is, the resolution of the edit image 610 may instead be changed and displayed in response to a zoom-in/out signal through the wheel of the input device 200.
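
A minimal sketch of the state-change handling above: a wheel zoom-in/out scales the displayed image (the scan image and scan box together in scan mode, or the edit image in edit mode), and a wheel press rotates it. The scale ratio and rotation angle are illustrative assumptions.

```python
def apply_state_change(signal: str, scale: float, angle: int):
    if signal == "zoom_in":
        scale *= 1.1                    # assumed predetermined ratio
    elif signal == "zoom_out":
        scale /= 1.1
    elif signal == "wheel_press":
        angle = (angle + 90) % 360      # assumed predetermined angle
    return scale, angle

print(apply_state_change("zoom_in", 1.0, 0))      # (1.1..., 0)
print(apply_state_change("wheel_press", 1.0, 0))  # (1.0, 90)
```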

Hereinafter, a method of editing a scan image according to a sixth embodiment of the present invention will be described with reference to FIGS. 20 to 24. FIG. 20 is a flowchart illustrating the method of editing a scanned image according to the sixth embodiment of the present invention. FIGS. 21 to 24 illustrate screens providing an editing UI window for editing a scan image according to the sixth exemplary embodiment of the present invention.

Referring to FIG. 20, first, the controller 110 of the display device 100 receives an edit request signal from the input device 200 through the communication interface 150 (S2010). The edit request signal may be generated by an input of the scanner function selection button 241 of the input device 200.

In response to the edit request signal, the edit UI window 600 is displayed (S2020). In the edit UI window 600, the edit image 610 and an edit setting window 620 for editing the edit image are displayed, and a plurality of function icons for performing a plurality of functions are displayed in a specific region 640.

Here, the plurality of function icons may be icons for executing an electronic document function, an email function, and a social network service function.

The user selects a drag and drop icon on one side of the edit UI window 600 (S2030). Selecting the drag and drop icon enables the edit image to be dragged from the edit UI window 600 and dropped onto any one of a plurality of areas. Meanwhile, the step of selecting the drag and drop icon may be omitted.

That is, the drag and drop operation may be performed on the edited image 610 without selecting a separate drag and drop icon in the edit UI window 600.

Meanwhile, the user may select the edit image 610 through a pointer 20, such as a mouse pointer, and then drag and drop the selected edit image 610 onto the specific region 640 (S2040).

When the edit image 610 is dragged through the pointer 20 and dropped onto an area 640 in which one of the plurality of function icons is displayed, the edit image 610 is converted into image data and text data (S2050), and at the same time the function icon corresponding to the area where the edit image 610 was dropped is executed (S2060).

In addition, while the function corresponding to the executed function icon is executed, the converted text data may be displayed in a format corresponding to the executed function (S2070).

Here, the specific area may be an area where the function icon is displayed.

For example, as shown in FIG. 21(a), when the edit image 610 is dragged and dropped onto the region where the function icon related to a word file function is displayed among the plurality of function icons, the word file function is executed to display a word file window 800, and at the same time text data 810 converted from the edit image may be displayed in the word file window 800, as shown in FIG. 21(b).

Meanwhile, the specific area may be an area in which the executed specific function is displayed.

For example, as illustrated in FIG. 22, when an icon related to an email function is selected from the plurality of function icons and the email function is executed, an email window 900 is displayed. Thereafter, when the edit image 610 displayed on the edit UI window 600 is dragged through the pointer 20 and dropped onto the area where the executed function is displayed, that is, the email window 900, text data 910 converted from the edit image may be displayed in the email window 900, as illustrated in FIG. 23.

Meanwhile, as shown in FIG. 24, when a specific character string 611 is selected through the pointer 20 from the text data converted from the edit image 610 displayed on the edit UI window 600, a search icon 650 may be displayed on one side of the selected character string 611 in response to the selection. Subsequently, when an input selecting the search icon 650 is received, a search site corresponding to the search icon 650 may be accessed to search for the selected character string and display a search result.

Although not shown, a translation icon may be displayed instead of the search icon 650. In this case, when an input for selecting a translation icon is received, a translation program corresponding to the translation icon may be executed to perform translation on the selected character string to display a translation result.

Next, an input device and an image correction method thereof according to another embodiment of the present invention will be described in detail with reference to FIGS. 25 to 29.

FIG. 25 is a block diagram of an input device according to another embodiment of the present invention, FIG. 26 is a flowchart of an image correction method according to an embodiment of the present invention, FIG. 27 is a diagram for describing a process of calculating frame coordinates, FIG. 28 is a diagram for describing changes in a scanned image before and after applying an image correction method according to an embodiment of the present invention, and FIG. 29 is a diagram for describing a situation in which a lift-off signal is detected during scanning.

Referring to FIG. 25, the input device 200 may include a function input unit 271, a scanning unit 272, a coordinate detector 273, a storage unit 274, a brightness calculator 275, a lift signal detector 276, a communication interface 277, and a controller 278.

The function input unit 271 may receive an input signal for performing an operation corresponding to a specific function of the input device 200. That is, when the function input unit 271 receives an input signal requesting operation in the mouse mode or the scan mode, the input device 200 may enter the corresponding operation mode. The function input unit 271 may include a scanner function selection button, a wheel button, and left and right buttons disposed on the left and right sides of the wheel button.

The scanning unit 272 may obtain a scan image corresponding to a predetermined region of the scan target object by irradiating light onto the predetermined region and detecting the light reflected therefrom. In this case, a unit screen of the acquired scan image is referred to as a frame.

The coordinate detector 273 may detect a plurality of coordinates corresponding to the plurality of pixels constituting the frame.

In particular, the coordinate detector 273 may detect the center coordinates of the frame and the corner coordinates of the frame. In one embodiment, the coordinate detector 273 may detect only the center coordinates of the frame, all of the corner coordinates of the frame, or only some of the corner coordinates.

The storage unit 274 may store the center coordinates and the corner coordinates of the frame detected by the coordinate detector 273.

The brightness calculator 275 may calculate the overall brightness of the frame and the average brightness of the frame by using the plurality of coordinates corresponding to the detected plurality of pixels. The brightness calculator 275 may include a first brightness calculator 275a and a second brightness calculator 275b.

The first brightness calculator 275a may calculate the overall brightness of the frame, and a method of calculating the overall brightness of the frame will be described in detail with reference to FIG. 26.

The second brightness calculator 275b may calculate an average brightness of the frame, and a method of calculating the average brightness of the frame will be described in detail with reference to FIG. 26.

The lift signal detector 276 may detect a lift off signal that is a signal generated when the input device 200 is spaced apart from the scan target object by a predetermined distance or more. The lift signal detector 276 may include a gyro sensor.

The communication interface 277 may transmit the center coordinates of the frame, the corner coordinates of the frame, the overall brightness of the frame, the average brightness of the frame, and the acquired scan image to the display device 100 through wired or wireless communication.

The controller 278 may control the overall operation of the input device 200. For example, it controls the scan image of the scan target object obtained from the scanning unit 272 and the center coordinates and corner coordinates of the frame obtained from the coordinate detector 273 to be transmitted to an external device through the communication interface 277, and controls the scan image and the position data to be stored in the storage unit 274.

Further, it may control signals related to the various functions input by the user through the function input unit 271 to be transmitted to an external device.

FIG. 26 is a flowchart of an image correction method according to an exemplary embodiment.

Referring to FIG. 26, first, the input device 200 operates in a mouse mode (S3101). The mouse mode is a mode for controlling the operation of the display device: the input device 200 is moved on a surface such as a desk to move a cursor, and commands are selected or programs executed using the cursor.

Thereafter, the input device 200 receives a scan start request signal (S3103).

In one embodiment, the scan start request signal may be received through the scanner function selection button 241 described with reference to FIG. 2. That is, when the user presses the scanner function selection button 241 once, the mode of the input device 200 may be changed from the mouse mode to the scan mode. The scan mode may be a mode for acquiring a scan image through the pixels of the scan target object received by irradiating light.

As described above, the scan start request signal may be generated through the scanner function selection button 241; however, it is not limited thereto and may be generated in various ways, for example in response to the selection of a specific application on the display device.

The input device 200 receives a scan start request signal and enters a scan mode (S3105).

The input device 200 scans the scan target object in operation S3107. The input device 200 may acquire a scan image of the scan target object through the scanning unit 272. The scanning unit 272 may obtain a scan image corresponding to a predetermined region of the scan target object by irradiating light onto a predetermined region of the scan target object and detecting light reflected therefrom.

According to an embodiment, the scan area scanned by the scanning unit 272 may have a rectangular shape as illustrated in FIG. 2(b), but it need not be limited thereto.

The object to be scanned is a target object containing information to be input / stored by the user in the input device 200, and generally refers to a document in which characters, pictures, and the like are displayed. In addition, the predetermined area of the scan object refers to a two-dimensional area having a predetermined area, unlike the existing line scan area.

The input device 200 may detect coordinates corresponding to the plurality of pixels constituting the frame (S3109).

The input device 200 may detect the center coordinates of the frame and the corner coordinates of the frame. The frame may be a unit screen of a scan image acquired through the input device 200. In one embodiment, when the scan area has a rectangular shape, one frame may also have a rectangular shape, and corner coordinates of the frame may be four vertex coordinates of the rectangle.

Referring to FIG. 27, one frame 30 corresponding to the scan area currently scanned on the scan target object 1000 by the input device 200 is illustrated. The corners of the frame are called A, B, C, and D, and the center of the frame is called E. The corner and center coordinates may then be represented by A(x1, y1), B(x2, y2), C(x3, y3), D(x4, y4), and E(x5, y5), respectively.

In an embodiment, the input device 200 may detect the center coordinates and the corner coordinates of the frame through the coordinate detector 273.

According to an embodiment, the coordinate detector 273 may detect coordinates corresponding to a plurality of pixels constituting the frame 30.

In one embodiment, the coordinate detector 273 may detect the center coordinates and corner coordinates of the frame through a laser sensor. The laser sensor may irradiate a laser beam onto the scan target object and detect the center coordinates and corner coordinates of the frame using the reflected light.

In an embodiment, the laser sensor may detect only the center coordinate of the frame. That is, since the frame size is constant, the coordinate detector 273 can calculate the corner coordinates from the center coordinates alone, based on the relative positions between the center and the corners.
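
A minimal sketch of the corner recovery described above: since the frame size is constant, the corner coordinates A-D can be derived from the center coordinate E alone. An axis-aligned frame and its dimensions are assumed for illustration; the patent does not fix them.

```python
FRAME_W, FRAME_H = 320, 240   # assumed fixed frame size in pixels

def corners_from_center(cx: float, cy: float):
    """Derive the four corner coordinates from the center coordinate E."""
    hw, hh = FRAME_W / 2, FRAME_H / 2
    return {"A": (cx - hw, cy - hh), "B": (cx + hw, cy - hh),
            "C": (cx + hw, cy + hh), "D": (cx - hw, cy + hh)}

print(corners_from_center(500, 400))  # four corners around E(500, 400)
```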

The input device 200 stores the coordinates of the detected frame (S3111). The input device 200 may store the coordinates of the frame detected through the storage unit 274. In an embodiment, the storage unit 274 may include at least one type of storage medium among a flash memory type, a hard disk type, a multimedia card micro type, a card type memory (for example, SD or XD memory), RAM, and ROM (for example, EEPROM).

The input device 200 calculates the overall brightness of the frame which is the unit screen of the currently acquired scan image (S3113). The input device 200 may calculate the overall brightness of the frame through the first brightness calculator 275a. The first brightness calculator 275a may calculate the overall brightness of the frame using a plurality of coordinate points included in one frame. The plurality of coordinate points may correspond to a plurality of pixels constituting one frame.

The first brightness calculator 275a may calculate the overall brightness of the frame through Equation 1 below.

[Equation 1]

F = \sum_{k=1}^{n} I_k(x, y)

Here, I_k(x, y) may be the brightness value at one coordinate point included in the frame, and n may mean the number of coordinate points, that is, the number of pixels, included in one frame. The number of pixels constituting one frame may vary depending on the resolution of the scan image to be acquired. That is, when a high resolution scan image is to be acquired, the number of pixels constituting the frame may be larger than when a low resolution scan image is to be acquired.

The first brightness calculator 275a may calculate the overall brightness F of the frame by adding up the brightness corresponding to the plurality of coordinate points constituting the frame.

The input device 200 calculates an average brightness of the frame by using the calculated overall brightness (S3115). The input device 200 may calculate the average brightness of the frame through the second brightness calculator 275b. The second brightness calculator 275b may calculate the average brightness S (B) of the frame through Equation 2 below.

[Equation 2]

S(B) = \frac{F}{n} = \frac{1}{n} \sum_{k=1}^{n} I_k(x, y)

That is, the second brightness calculator 275b may calculate the average brightness of the frame by dividing the calculated overall brightness F by the number n of coordinate points constituting the frame.
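
For illustration only, Equations 1 and 2 (steps S3113 and S3115) can be sketched in C as follows; the brightness array and the function names are assumptions, not part of the patent.

#include <stddef.h>

/* Equation 1: overall brightness F is the sum of I_k(x, y) over the
   n coordinate points (pixels) of one frame. */
static double frame_total_brightness(const double *brightness, size_t n) {
    double f = 0.0;
    for (size_t k = 0; k < n; k++)
        f += brightness[k];        /* accumulate I_k(x, y) */
    return f;
}

/* Equation 2: average brightness S(B) = F / n. */
static double frame_average_brightness(const double *brightness, size_t n) {
    return n ? frame_total_brightness(brightness, n) / (double)n : 0.0;
}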

The input device 200 transmits the calculated average brightness to the display device 100 (S3117). The input device 200 may transmit information about the average brightness calculated through the communication interface 277 to the display device 100. The communication interface 277 may perform wired or wireless communication with an external device, in particular, the display device 100 to transmit various input signals, video, audio, or data signals to the display device 100.

When the communication interface 277 performs wired communication with an external device, it may be configured with serial, PS/2, USB, or the like, and when performing wireless communication with the external device, it may be configured with Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), ZigBee, Digital Living Network Alliance (DLNA), or the like.

The display device 100 may receive the calculated average brightness information and apply the received average brightness to the scan image corresponding to the frame. That is, the display device 100 may receive information on the average brightness of the frame from the input device 200 and correct the overall brightness of the scanned image to the average brightness.

Referring to FIG. 28, changes in the scanned image before and after applying the image correction method according to an embodiment of the present invention are shown. Looking at the scanned image of one frame, the corner regions A, B, C, and D of the frame, which are darker than the adjacent region K before image correction, become brighter after image correction. Each of the adjacent areas K may vary according to the light irradiated from the input device 200 and the moving direction of the input device 200.

In operation S3119, the input device 200 may determine whether a lift-off signal is generated. According to an embodiment, the lift-off signal may be a signal generated when the input device 200 is spaced apart from the scan target object by a predetermined distance T or more. The predetermined distance T or more may be 1 mm or more, but this is only an example.

In an embodiment, the input device 200 may determine whether a lift-off signal is generated through the lift-off signal detector 276. In an embodiment, the lift-off signal detector 276 may include a gyro sensor. The gyro sensor detects the movement of the input device 200 in the direction perpendicular to the plane of the scan target object by detecting changes along the x, y, and z axes.
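
A minimal sketch of the lift-off decision follows, assuming the gyro-derived displacement of the input device along the z axis is available in millimeters; the 1 mm threshold follows the example in the text, and the names are hypothetical.

#include <stdbool.h>

#define LIFT_OFF_DISTANCE_MM 1.0   /* example threshold T from the text */

/* A lift-off signal is generated when the input device is spaced apart
   from the scan target object by the predetermined distance T or more. */
static bool lift_off_detected(double z_displacement_mm) {
    return z_displacement_mm >= LIFT_OFF_DISTANCE_MM;
}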

Referring to FIG. 29, when the input device 200 is spaced apart from the scan target object 10 by the predetermined distance T or more, a lift-off signal may be generated. In this case, the scan image of the predetermined region 40 to be scanned may become unclear because it is influenced more by light irradiated from the outside than by the light irradiated from the input device.

Referring again to FIG. 26, if it is determined that the lift-off signal is generated (S3119), the input device 200 transmits, to the display device 100, the information on the average brightness calculated before the lift-off signal was generated. Thus, even when the input device 200 is lifted off, the average brightness is applied to the scan image acquired before the lift-off occurred, so that the scan target object can be scanned stably.

On the other hand, when it is determined that the lift-off signal has not occurred (S3119), the input device 200 maintains the scan mode and continues scanning.

According to the image correction method according to an embodiment of the present invention, even though the limited illumination of the input device 200 would otherwise make the brightness of the scanned image uneven and darken specific portions, a scan image with even overall brightness can be obtained.

Next, an input device and a scan image acquisition method thereof according to another embodiment of the present invention will be described in detail with reference to FIGS. 30 to 32.

FIG. 30 is a block diagram of another input device of the present invention, FIG. 31 is a flowchart of a method of acquiring a scan image according to an embodiment of the present invention, FIG. 32 is a diagram illustrating the principle of a gyro sensor, and FIG. 33 is a diagram illustrating a process of obtaining an image of a scan target object by using an input device according to another embodiment of the present invention.

First, referring to FIG. 30, the input device 200 according to this embodiment of the present invention includes a function input unit 281, a scanning unit 282, a position detector 283, a position checking unit 284, a position determining unit 285, a communication interface 286, a storage unit 287, and a controller 288.

The function input unit 281 may receive an input signal for performing an operation corresponding to a specific function of the input device 200. That is, when the function input unit 281 receives an input signal requesting operation in the mouse mode or the scan mode, the input device 200 may enter the corresponding operation mode. The function input unit 281 may include a scanner function selection button, a wheel button, and left and right buttons disposed on the left and right sides of the wheel button.

The scanning unit 282 may obtain a scan image corresponding to a predetermined region of the scan target object by irradiating light to a predetermined region of the scan target object and detecting light reflected therefrom. In this case, the unit screen of the acquired scan image may mean a frame.

The position detector 283 may detect position information of a frame which is a unit screen of an image of a scan target object. In an embodiment, the position detector 283 may include a laser sensor 283a and a gyro sensor 283b.

The laser sensor 283a may irradiate the laser light onto the scan target object and detect position information of the frame using the reflected reflected light.

The gyro sensor 283b may detect the position information of the frame using the angle by which the input device 200 is rotated about one of its axes.

The gyro sensor 283b may measure the angular velocity according to the movement of the input device 200. The measured angular velocity may be used to increase or decrease the frame rate, which will be described later.

The position detector 283 may detect the position information of the frame using both the laser sensor 283a and the gyro sensor 283b.

The laser sensor 283a and the gyro sensor 283b may be spaced apart by a predetermined distance.

The position checking unit 284 may check whether the position information of the frame detected by the laser sensor 283a is within an allowed range. The allowed range may mean the position range required for the input device 200 to normally acquire an image of the scan target object. That is, the position checking unit 284 may check whether the position information of the most recently detected frame is within the allowed range by comparing it with the position information of the frame detected immediately before.

When it is confirmed that the position information of the frame detected by the laser sensor 283a is not within the allowed range, the position determining unit 285 may determine the position information detected by the gyro sensor 283b as the position information of the frame.

When it is confirmed that the position information of the frame detected by the laser sensor 283a is within the allowed range, the position determining unit 285 may determine the position information detected by the laser sensor 283a as the position information of the frame.

The communication interface 286 may transmit, to an external device (for example, the display device 100), the image of the scan target object acquired based on the determined position information of the frame.

When the communication interface 286 performs wired communication with an external device, it may be configured with serial, PS/2, USB, or the like, and when performing wireless communication with the external device, it may be configured with Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), ZigBee, Digital Living Network Alliance (DLNA), or the like.

The storage unit 287 stores the position information of the frame detected by the laser sensor 283a or the gyro sensor 283b. The storage unit 287 may also store the angular velocity of the input device 200 and the frame rate in correspondence with each other.

The controller 288 may control the overall operation of the input device 200. Detailed operations of the controller 288 will be described later.

Referring to FIG. 31, first, the input device 200 operates in a mouse mode (S3301). The mouse mode is a mode for controlling the operation of the display device. When the input device 200 is moved on a surface such as a desk, the mouse mode may be a mode for selecting a command or executing a program by using the cursor.

Thereafter, the input device 200 receives a scan start request signal (S3303).

In one embodiment, the scan start request signal may be received through the scanner function selection button 241 described with reference to FIG. 2A. That is, when the user presses the scanner function selection button 241 once, the mode of the input device 200 may be changed from the mouse mode to the scan mode. The scan mode may be a mode for acquiring a scan image from the pixels of the scan target object captured by irradiating light onto it.

As described above, the scan start request signal may be generated through the scanner function selection button 241. However, the scan start request signal is not limited thereto and may be generated in various ways, for example in response to the selection of a specific application in the display device.

The input device 200 receives a scan start request signal and enters a scan mode (S3305).

The input device 200 scans the scan target object by an external force capable of moving the input device 200 (S3307). The external force capable of moving the input device 200 may be a force applied to the input device 200 by the user, but is not limited thereto.

The input device 200 may acquire a scan image of the scan target object through the scanning unit 282. The scanning unit 282 may obtain a scan image corresponding to a predetermined region of the scan target object by irradiating light to a predetermined region of the scan target object and detecting light reflected therefrom.

According to an exemplary embodiment, the scan area scanned by the scanning unit 282 may have a rectangular shape as illustrated in FIG. 2B, but is not limited thereto and may have various shapes.

The object to be scanned is a target object containing information that the user wants to input into or store in the input device 200, and generally refers to a document on which characters, pictures, and the like are displayed. In addition, the predetermined area of the scan target object refers to a two-dimensional area having a predetermined area, unlike the existing line scan area.

The input device 200 may detect position information of the frame through the position detector 283 (S3309). Here, the frame may be a unit screen of the scan target object scanned by the input device 200. In an embodiment, the position detector 283 may detect the position information of the frame using one or more laser sensors 283a. In detail, the laser sensor 283a may irradiate laser light onto the scan target object and detect the position information of the frame using the reflected light. The laser sensor 283a may detect the position information of the frame by detecting coordinates corresponding to the plurality of pixels constituting the frame, or by detecting only the center coordinates and the corner coordinates of the frame.

That is, the laser sensor 283a may detect only the center coordinates and the corner coordinates of the frame. According to an embodiment, when the scan area of the input device 200 has a rectangular shape, one frame may also have a rectangular shape, and the corner coordinates of the frame may be the four vertex coordinates of the rectangle. This will be described with reference to FIG. 27.

Referring to FIG. 27, the scan target object 1000 and one frame 30 corresponding to the scan area currently being scanned by the input device 200 are illustrated. The corners of the frame are labeled A, B, C, and D, and the center of the frame is labeled E. In this case, the corner and center coordinates may be represented by A(x1, y1), B(x2, y2), C(x3, y3), D(x4, y4), and E(x5, y5), respectively.

The laser sensor 283a may detect the position coordinates of the frame by detecting the center coordinates E and the corner coordinates A, B, C, and D of the frame. Since the size of one frame is constant in one embodiment, the laser sensor 283a may detect only the center coordinate E to detect the position information of the frame.

In another embodiment, the position detector 283 may detect the position information of the frame through the gyro sensor 283b. The gyro sensor 283b is a sensor that reports, as a numerical value, the angle by which an object rotates about one axis, and can measure the angular velocity of the object while it is moving. Referring to FIG. 32, the principle of the gyro sensor 283b will be described in detail.

Referring to FIG. 32, when the input device 200 scans the scan target object 10, the gyro sensor may detect the movement of the input device 200 by detecting its angular velocity. As shown in FIG. 32, the gyro sensor can detect the position information of the frame by detecting movement of the input device 200 in the x-axis direction, in the y-axis direction perpendicular to the x-axis, and in the z-axis direction perpendicular to both the x-axis and the y-axis.

Referring again to FIG. 30, in an embodiment, the gyro sensor 283b may be spaced apart from the laser sensor 283a by a predetermined distance.

According to an embodiment, the input device 200 may detect the position information of the frame using the laser sensor 283a and the gyro sensor 283b simultaneously.

The position checking unit 284 checks whether the position information of the frame detected by the laser sensor 283a is within the allowed range (S3311). The allowed range may mean the position range required for the input device 200 to normally acquire an image of the scan target object. That is, the position checking unit 284 may check whether the position information of the most recently detected frame is within the allowed range by comparing it with the position information of the frame detected immediately before.

If it is confirmed that the position information of the frame detected by the laser sensor 283a is not within the allowed range, the position determining unit 285 determines the position information detected by the gyro sensor 283b as the position information of the frame (S3313). That is, when the position information of the frame detected by the laser sensor 283a is judged to be incorrect, the position determining unit 285 does not adopt it as the position information of the frame, and instead determines the position information detected by the gyro sensor 283b as the position information of the frame.

Referring to FIG. 33 in detail, when a user scans the book 13 through the input device 200, the laser sensor 283a cannot obtain correct position information of the frame at the portion 14 where the book 13 is folded. When the input device 200 is located at the folded portion 14, the laser sensor 283a is separated from the book 13, which is the object to be scanned, by a certain distance, so the input device 200 may not properly detect the position information of the frame through the laser light reflected by the scan target object, or may not detect the position information of the frame at all.

That is, according to an exemplary embodiment of the present invention, when the position information detected by the laser sensor 283a is determined to be inaccurate, or when the laser sensor 283a does not detect the position information of the frame, the position information detected by the gyro sensor 283b is determined as the position information of the frame, so that an image of the object to be scanned can be obtained stably.

In addition, according to an embodiment of the present invention, unlike a configuration using two laser sensors 283a arranged at a fixed interval, a single laser sensor 283a is used together with the gyro sensor 283b, so there is no need for an additional laser sensor.

This will be described with reference to FIG. 31 again.

On the other hand, when it is confirmed that the position information of the frame detected by the laser sensor 283a is within the allowed range, the position determining unit 285 determines the position information detected by the laser sensor 283a as the position information of the frame (S3315).
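
The laser/gyro selection of steps S3311 to S3315 can be sketched as follows; the closeness test standing in for the allowed range, the tolerance value, and all names are assumptions for illustration only.

#include <math.h>
#include <stdbool.h>

typedef struct { double x, y; } FramePos;

/* Allowed range: the newly detected position must be plausibly close to
   the position of the frame detected immediately before (tolerance assumed). */
static bool within_allowed_range(FramePos prev, FramePos laser) {
    const double max_jump = 5.0;   /* hypothetical tolerance */
    return hypot(laser.x - prev.x, laser.y - prev.y) <= max_jump;
}

/* S3311-S3315: adopt the laser-detected position when it is in range,
   otherwise fall back to the gyro-detected position. */
static FramePos determine_frame_position(FramePos prev, FramePos laser,
                                         FramePos gyro) {
    return within_allowed_range(prev, laser) ? laser : gyro;
}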

Thereafter, the input device 200 acquires a scan image corresponding to the frame based on the determined position information of the frame (S3317), and transmits the acquired scan image to the display device 100 (S3319).

On the other hand, the gyro sensor 283b may check whether a lift-off signal is generated (S3321). According to an embodiment, the lift-off signal may be a signal generated when the input device 200 is spaced apart from the scan target object by a predetermined distance T or more. The predetermined distance T or more may be 1 mm or more, but this is only an example.

If it is determined that the lift-off signal has occurred, the input device 200 returns to operating in the mouse mode (S3301).

On the other hand, when it is confirmed that the lift-off signal has not occurred, the input device 200 maintains operation in the scan mode and scanning proceeds.

Next, a method of adjusting the frame rate of the input device 200 according to another embodiment of the present invention will be described with reference to FIG. 34.

FIG. 34 is a flowchart illustrating a method of adjusting the frame rate of the input device 200 according to an embodiment of the present invention.

Referring to FIG. 34, the input device 200 operates in a mouse mode (S3501). The mouse mode is a mode for controlling the operation of the display device. When the input device 200 is moved on a surface such as a desk, the mouse mode may be a mode for selecting a command or executing a program by using the cursor.

Thereafter, the input device 200 receives a scan start request signal (S3503).

The scan start request signal may be generated through the scanner function selection button 241 described with reference to FIG. 2A, but is not limited thereto and may be generated in various ways, for example in response to the selection of a specific application on the display device.

The input device 200 receives a scan start request signal and enters a scan mode (S3505).

The input device 200 scans the scan target object by an external force capable of moving the input device 200 (S3507). The external force capable of moving the input device 200 may be a force applied to the input device 200 by the user, but is not limited thereto.

The gyro sensor 283b measures the angular velocity of the input device 200 (S3509). That is, when the input device 200 moves to scan the object to be scanned, the gyro sensor 283b may measure the angular velocity according to the movement of the input device 200.

The controller 288 checks whether the measured angular velocity exceeds the critical angular velocity (S3511). According to an embodiment, the critical angular velocity may be a reference value for increasing or decreasing the frame rate of the input device 200. Here, the frame rate may be an index indicating how many times per second the image acquired through the input device 200 is converted into a digital image signal. That is, a frame rate of 60 means that the image formed on the imaging device is converted into a digital image signal 60 times per second.

If the measured angular velocity exceeds the critical angular velocity, the controller 288 increases the frame rate of the input device 200 (S3513). Accordingly, even if the angular velocity of the input device 200 increases, the frame rate increases with it, so that an image of the scan target object can be obtained stably. Here, the frame rate can be increased according to the measured angular velocity. Specifically, in an embodiment, the storage unit 287 may store angular velocities and frame rates in correspondence with each other, and the controller 288 may search the storage unit 287 and control the input device 200 to use the frame rate corresponding to the measured angular velocity.

If the measured angular velocity does not exceed the critical angular velocity, the controller 288 decreases the frame rate of the input device 200 (S3515). Accordingly, when the angular velocity of the input device 200 slows down, the frame rate is reduced, so that the memory capacity of the input device 200 is used efficiently. Here, the frame rate can be reduced according to the measured angular velocity. As above, the storage unit 287 may store angular velocities and frame rates in correspondence with each other, and the controller 288 may search the storage unit 287 and control the input device 200 to use the frame rate corresponding to the measured angular velocity.
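
The angular-velocity-to-frame-rate correspondence that the storage unit 287 is described to hold can be pictured as a lookup table like the following C sketch; the table contents, units, and names are illustrative assumptions.

#include <stddef.h>

typedef struct { double angular_velocity; int frame_rate; } RateEntry;

/* Hypothetical table pairing angular velocities (deg/s) with frame rates,
   sorted by ascending angular velocity. */
static const RateEntry kRateTable[] = {
    { 0.0, 5 }, { 45.0, 15 }, { 90.0, 30 }, { 180.0, 60 },
};

/* Return the frame rate stored for the largest tabulated angular velocity
   not exceeding the measured value w, so a faster-moving device gets a
   higher frame rate and a slower one a lower frame rate. */
static int frame_rate_for_angular_velocity(double w) {
    int rate = kRateTable[0].frame_rate;
    for (size_t i = 0; i < sizeof kRateTable / sizeof kRateTable[0]; i++)
        if (kRateTable[i].angular_velocity <= w)
            rate = kRateTable[i].frame_rate;
    return rate;
}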

Next, an input device and a control method thereof according to another embodiment of the present invention will be described with reference to FIGS. 35 to 38.

In addition, when necessary, an input device and a control method thereof according to another embodiment of the present invention will be described with reference to FIGS. 1 to 34.

FIG. 35 is a block diagram of an input device according to another embodiment of the present invention, FIG. 36 is a flowchart illustrating a control method of the input device according to an embodiment of the present invention, FIG. 37 is a diagram for describing the moving speed of the input device, and FIG. 38 is a diagram illustrating algorithm code for implementing a control method of an input device according to an embodiment of the present disclosure.

First, referring to FIG. 35, the input device 200 according to another exemplary embodiment of the present invention may include a function input unit 291, a coordinate detector 292, a scanning unit 293, a speed measuring unit 294, a resolution measuring unit 295, a frame rate determining unit 296, a communication interface 297, a storage unit 298, and a controller 299.

The function input unit 291 may receive an input signal for performing an operation corresponding to a specific function of the input device 200. That is, when the function input unit 291 receives an input signal requesting operation in the mouse mode or the scan mode, the input device 200 may enter the corresponding operation mode. The function input unit 291 may include a scanner function selection button, a wheel button, and left and right buttons disposed on the left and right sides of the wheel button.

The coordinate detector 292 may detect the coordinates of the frame. In one embodiment, the coordinate detector may detect the coordinates of the frame through the laser sensor, but is not limited thereto. A process of detecting the coordinates of the frame will be described later.

The scanning unit 293 may acquire a scan image corresponding to a predetermined area of the scan target object based on the coordinates of the detected frame.

The scanning unit 293 may irradiate light onto a predetermined area of the scan target object and detect the light reflected therefrom to acquire a scan image corresponding to the predetermined area. In this case, the unit screen of the acquired scan image may mean a frame.

The speed measuring unit 294 may measure a moving speed of the input device 200. In an embodiment, the input device 200 may measure a moving speed of the input device 200 through a speed sensor. According to another embodiment, the input device 200 may measure a change in movement speed of the input device 200 through an acceleration sensor.

The resolution measuring unit 295 may measure the resolution of the scanned image acquired through the scanning unit 293.

The frame rate determining unit 296 may determine a frame rate according to the measured moving speed of the input device 200. The storage unit 298, which will be described later, may hold a lookup table in which frame rates are stored in correspondence with moving speeds of the input device 200, and the frame rate determining unit 296 may search the storage unit 298 to determine the frame rate corresponding to the measured moving speed of the input device 200.

The frame rate determiner 296 may determine a frame rate based on the measured resolution of the scanned image, which will be described later.

The communication interface 297 may transmit, to an external device (for example, the display device 100), the image of the scan target object acquired based on the detected frame coordinates.

When the communication interface 297 performs wired communication with an external device, it may be configured with serial, PS/2, USB, or the like, and when performing wireless communication with the external device, it may be configured with Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), ZigBee, Digital Living Network Alliance (DLNA), or the like.

The storage unit 298 may store frame coordinate data, a unit image of the scan target object corresponding to the coordinate data, a resolution of the unit image, and the like.

In particular, the storage unit 298 includes a frame rate according to the moving speed of the input device 200 as a lookup table, and stores the moving speed and the frame rate in correspondence with each other.

The controller 299 may control overall operations of the input device 200. The controller 299 may correspond to the contents of the controller 210 described with reference to FIG. 1.

Next, a method of controlling an input device according to an exemplary embodiment will be described with reference to FIGS. 35 and 36.

Referring to FIG. 36, first, the input device 200 operates in a mouse mode (S3701). The mouse mode is a mode for controlling the operation of the display device. When the input device 200 is moved on a surface such as a desk, the mouse mode may be a mode for selecting a command or executing a program by using the cursor.

Thereafter, the function input unit of the input device 200 receives a scan start request signal (S3703).

In one embodiment, the scan start request signal may be received through the scanner function selection button 241 described with reference to FIG. 2A. That is, when the user presses the scanner function selection button 241 once, the mode of the input device 200 may be changed from the mouse mode to the scan mode. The scan mode may be a mode for acquiring a scan image from the pixels of the scan target object captured by irradiating light onto it.

The scan start request signal may be generated through the scanner function selection button 241, but is not limited thereto and may be generated in various ways, for example in response to the selection of a specific application displayed on the display device 100.

The input device 200 receives the scan start request signal and enters a scan mode (S3705). Specifically, the scan mode may mean a mode in which an image of the scan target object is acquired while the input device 200 moves on the surface of the scan target object.

The input device 200 starts scanning the object to be scanned by an external force that can move the input device 200 (S3707). The external force capable of moving the input device 200 may be a force applied to the input device 200 by the user, but is not limited thereto.

According to an embodiment, the scan area 221 may have a rectangular shape as illustrated in FIG. 2B, but is not limited thereto.

The object to be scanned is a target object containing information that the user wants to input into or store in the input device 200, and generally refers to a document on which characters, pictures, and the like are displayed. The predetermined region of the scan target object may mean a two-dimensional region having a predetermined area, unlike the existing line scan region.

The coordinate detector 292 of the input device 200 detects coordinates of the frame (S3709). In an embodiment, the frame may mean a unit screen of the scan target object corresponding to the scan area 221 illustrated in FIG. 2B. According to an embodiment, the coordinate detector 292 may detect coordinates corresponding to the plurality of pixels constituting the frame. In an embodiment, the coordinate detector 292 may detect the center coordinates of the frame and the corner coordinates of the frame. Here, when the scan area 221 has a rectangular shape as shown in FIG. 2B, the frame also has a rectangular shape. In this case, the center coordinates of the frame may be the center coordinates of the rectangle, and the corner coordinates of the frame may be the coordinates of the four vertices of the rectangle.

In one embodiment, the coordinate detector 292 may detect the coordinates of the frame through a laser sensor, but is not limited thereto. The laser sensor may irradiate a laser beam onto the scan target object and detect the coordinates using the reflected light.

Since the description of the method of detecting the coordinates of the frame has been described with reference to FIG. 27, detailed description thereof will be omitted.

The storage unit 298 of the input device 200 stores the detected coordinates (S3711). In an embodiment, the storage unit 298 may include at least one type of storage medium from among a flash memory type, a hard disk type, a multimedia card micro type, a card type memory (for example, SD or XD memory), RAM, and ROM (for example, EEPROM).

The scanning unit 293 of the input device 200 obtains a unit image of the scan target object corresponding to the frame based on the detected coordinates in operation S3713.

The storage unit 298 of the input device 200 stores the obtained unit image (S3715).

The controller 299 of the input device 200 checks whether the coordinates of the frame are the same in the subsequent scanning process (S3717).

If it is determined that the coordinates of the frame are the same, the controller 299 of the input device 200 does not store the unit image of the scan target object corresponding to those coordinates (S3719). When the coordinates of the frames are the same, the unit image of the scan target object corresponding to the frame is not stored in order to avoid consuming storage capacity unnecessarily. Referring to the algorithm code of FIG. 38, while the input device 200 operates in the scan mode, it stores the coordinates of the detected frame in a buffer, and the following text shows that when the frame coordinates are the same, the unit image of the scan target object is not stored.

while (ScanMode()) {
    if (Buffer(Xi, Yj) == InputImage(x, y)) {   /* same coordinates as the buffered frame */
        NotSaveAsChangedImage(ROI);             /* skip storing the duplicate unit image  */
    }
    Buffer(Xi, Yj) = InputImage(x, y);          /* buffer the coordinates of the current frame */
}

If it is determined that the coordinates of the frame are not the same, the speed measuring unit 294 of the input device 200 measures the moving speed of the input device 200 (S3721). According to an embodiment, the speed measurer 294 may measure a moving speed of the input device 200 through a speed sensor. In another embodiment, the speed measuring unit 294 may measure a change in movement speed of the input device 200 through an acceleration sensor.

Here, when it is determined that the coordinates of the frame are not the same, after the steps S3709 to S3715 are performed, step S3721 may be performed.

As shown in FIG. 37, the moving speed of the input device 200 may mean the average of the speed v1 at which the input device 200 moves in the x-axis direction and the speed v2 at which it moves in the y-axis direction while scanning the unit image K of the scan target object.

Referring back to FIG. 36, the controller 299 of the input device 200 checks whether the measured moving speed of the input device 200 is greater than or equal to a predetermined maximum speed (S3723).

If the measured movement speed of the input device 200 is greater than or equal to a predetermined maximum speed, the frame rate determination unit 296 of the input device 200 determines a frame rate corresponding to the maximum speed (S3725). That is, the frame rate determiner 296 may search the storage 298 to search for a frame rate corresponding to the maximum speed, and determine the retrieved frame rate as the frame rate of the input device 200. The storage unit 298 may include a frame rate according to the maximum speed and the moving speed of the input device 200 as a look-up table and store the moving speed and the frame rate in correspondence with each other.

When the moving speed of the input device 200 is greater than or equal to the maximum speed, the frame rate of the input device 200 is fixed at the frame rate corresponding to the maximum speed, because continuously increasing the frame rate along with the moving speed of the input device 200 would consume a large amount of memory.

Here, the frame rate may mean the number of shots of the scan target object per second. The unit of frame rate is fps (frames per second). That is, at a frame rate of 30 fps, the input device 200 photographs the scan target object 30 times per second and converts the shots into digital signals. At a frame rate of 60 fps, the data rate is twice as large as at 30 fps.

Referring to the algorithm code of FIG. 38, the frame rate corresponding to the maximum speed appears in the phrase 'MaxFrameRate = 30;', that is, 30 fps. Here, 30 fps is only an example.

In addition, it can be seen that step S3725 is performed through the following phrase:

else if (FrameRate > MaxFrameRate)
    FrameRate = MaxFrameRate;

On the other hand, the control unit 299 of the input device 200 checks whether the measured moving speed of the input device 200 is less than or equal to a predetermined minimum speed (S3727).

If the measured moving speed of the input device 200 is less than or equal to a predetermined minimum speed, the frame rate determination unit 296 of the input device 200 determines the frame rate corresponding to the minimum speed (S3729). Referring to the algorithm code of FIG. 38, the frame rate corresponding to the minimum speed appears in the phrase 'FixedFrameRate = 5;', that is, 5 fps. Here, 5 fps is merely an example.

In addition, it can be confirmed that steps S3727 and S3729 are performed through the following phrase:

if (FrameRate < FixedFrameRate)
    FrameRate = FixedFrameRate;

That is, the frame rate determiner 296 may search for the frame rate corresponding to the minimum speed through the storage 298 and determine the retrieved frame rate as the frame rate of the input device 200.

Referring back to FIG. 36, when the measured moving speed of the input device 200 exceeds the predetermined minimum speed, the frame rate determiner 296 of the input device 200 determines the frame rate corresponding to the measured moving speed (S3731). That is, the input device 200 may search the storage unit 298 for the frame rate corresponding to its moving speed and determine the retrieved frame rate as the frame rate of the input device 200.

Referring to the algorithm code of FIG. 38, it can be seen that the frame rate is adjusted according to the moving speed of the input device 200. That is, when the moving speed of the input device 200 increases, the frame rate increases, and when the moving speed of the input device 200 decreases, the frame rate decreases. This can be confirmed by the following text:

for (;;) {
    case (S(InputImage(x, y))++):   /* moving speed increases */
        then Z++;                   /* increase the frame rate */
    case (S(InputImage(x, y))--):   /* moving speed decreases */
        then Z--;                   /* decrease the frame rate */
}

As described above, the input device 200 can use memory capacity efficiently and stably obtain an image of the object to be scanned by adjusting the frame rate according to its moving speed. That is, when the moving speed of the input device 200 is slow, the frame rate is lowered, and when the moving speed of the input device 200 is fast, the frame rate is raised, so that the capacity of the memory is managed efficiently.
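
Consolidating steps S3721 to S3731 and the FIG. 38 fragments above, the speed-dependent frame rate can be sketched as below; the linear speed-to-rate mapping is an assumption standing in for the stored lookup table, while the 5 fps and 30 fps bounds follow the example values in the text.

#define FIXED_FRAME_RATE 5    /* minimum rate, example value from FIG. 38 */
#define MAX_FRAME_RATE   30   /* maximum rate, example value from FIG. 38 */

/* The moving speed is the average of the x-axis speed v1 and the y-axis
   speed v2 (FIG. 37); the resulting frame rate is clamped to the range
   [FIXED_FRAME_RATE, MAX_FRAME_RATE]. */
static int frame_rate_for_speed(double v1, double v2) {
    double v = (v1 + v2) / 2.0;
    int rate = (int)v;               /* hypothetical speed-to-rate mapping */
    if (rate > MAX_FRAME_RATE)   rate = MAX_FRAME_RATE;
    if (rate < FIXED_FRAME_RATE) rate = FIXED_FRAME_RATE;
    return rate;
}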

Thereafter, the input device 200 scans the scan target object at the determined frame rate (S3733).

Next, a control method of an input device according to another embodiment of the present disclosure will be described with reference to FIG. 39, drawing on FIGS. 1 to 38 as needed.

Referring to FIG. 39, first, the input device 200 operates in a mouse mode (S3901).

Thereafter, the function input unit 291 of the input device 200 receives a scan start request signal (S3903). Details are the same as described with reference to FIG. 36.

The input device 200 receives the scan start request signal and enters a scan mode (S3905). Details are the same as described with reference to FIG. 36.

The input device 200 starts scanning the object to be scanned by an external force capable of moving the input device 200 (S3907). Details are the same as described with reference to FIG. 36.

The coordinate detector 292 of the input device 200 detects coordinates of the frame (S3909). Details are the same as described with reference to FIG. 36.

The storage unit 298 of the input device 200 stores the detected coordinates (S3911).

The scanning unit 293 of the input device 200 obtains a unit image of the scan target object corresponding to the frame based on the detected coordinates (S3913).

The storage unit 298 of the input device 200 stores the obtained unit image in operation S3915.

The controller 299 of the input device 200 checks whether the coordinates of the frame are the same in the subsequent scanning process (S3917).

If it is determined that the coordinates of the frames are the same, the controller 299 of the input device 200 does not store the unit image of the scan target object based on the coordinates of the frame (S3919).

If it is determined that the coordinates of the frame are not the same, the resolution measuring unit 295 of the input device 200 measures the resolution of the stored unit image (S3921).

Here, the resolution of the unit image may mean an index indicating the precision of the image, that is, how many pixels or dots are used to represent the image. DPI (Dots Per Inch) is used as the unit of resolution and may mean the number of pixels or dots per inch (2.54 cm). Referring to the algorithm code of FIG. 38, it can be seen that the resolution of the unit image is measured through the following phrase:

DPI = EstimateDPI(CurrentFrame, PreviousReconstructedDPI);

Here, the image quality referred to in FIG. 38 may mean the resolution.

When it is determined that the coordinates of the frame are not the same, after step S3909 to S3915 is performed, step S3921 may be performed.

Referring to FIG. 39 again, the controller 299 of the input device 200 checks whether the resolution of the measured unit image is the same as the reference resolution (S3923).

If the resolution of the measured unit image is the same as the reference resolution, the controller 299 of the input device 200 maintains the current frame rate (S3925). Here, the current frame rate may be a predetermined frame rate when the input device 200 operates in the scan mode.

Referring to the algorithm code of FIG. 38, it can be seen that adjustment of the frame rate is performed through the following phrase.

else if (DPI == CorrectDPI)

FrameRate = FixedFrameRate();

If the resolution of the measured unit image is not the same as the reference resolution, the controller 299 of the input device 200 checks whether the resolution of the measured unit image is greater than the reference resolution (S3927).

If the resolution of the measured unit image is larger than the reference resolution, the input device 200 reduces the frame rate to 1/2 of the current frame rate (S3929).

Referring to the algorithm code of FIG. 38, it can be seen that adjustment of the frame rate is performed through the following phrase.

if (DPI > CorrectDPI)

FrameRate = FrameRate / 2;

If the resolution of the measured unit image is smaller than the reference resolution, the input device 200 increases the frame rate (S3931).

Referring to the algorithm code of FIG. 38, it can be seen that adjustment of the frame rate is performed through the following phrase.

else if (DPI < CorrectDPI)

FrameRate = FrameRate + 1;

As described above, the input device 200 can use memory capacity efficiently and stably acquire an image of the object to be scanned by adjusting the frame rate according to the resolution of the acquired unit image. That is, when the resolution of the unit image acquired through the control method of the input device 200 according to an embodiment of the present invention is low, a scan image having the reference resolution can be obtained by increasing the frame rate, and when the resolution is unnecessarily high, the frame rate can be lowered to prevent wasting memory capacity.
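
Consolidating steps S3923 to S3931 with the FIG. 38 fragments above, the resolution-driven adjustment can be sketched as follows; the 300 DPI reference is an assumed value and the function name is hypothetical.

#define CORRECT_DPI 300   /* reference resolution (assumed example value) */

/* Keep the rate when the measured DPI matches the reference, halve it when
   the image is finer than required, and increment it when it is too coarse. */
static int adjust_frame_rate_for_dpi(int frame_rate, int dpi) {
    if (dpi == CORRECT_DPI)
        return frame_rate;           /* maintain the current frame rate */
    if (dpi > CORRECT_DPI)
        return frame_rate / 2;       /* FrameRate = FrameRate / 2       */
    return frame_rate + 1;           /* FrameRate = FrameRate + 1       */
}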

Thereafter, the input device 200 scans the scan target object at a frame rate maintained or changed (increased or decreased) (S3933).

Next, an input device and a control method thereof according to another embodiment of the present invention will be described with reference to FIGS. 40 to 44.

Hereinafter, an input device and a control method thereof according to another embodiment of the present invention will be described with reference to FIGS. 1 to 39.

FIG. 40 is a block diagram of an input device according to another embodiment of the present invention, FIG. 41 is a block diagram of a terminal that can be connected to an input device according to another embodiment of the present invention, FIG. 42 is a flowchart illustrating a control method of an input device according to another embodiment, FIG. 43 is a diagram for describing a reference specification of a graphic processor, and FIG. 44 is a diagram for describing the selection of a rendering method according to the utilization rate of the graphic processing unit and the utilization rate of the central processing unit in a control method of an input device according to another embodiment of the present invention.

First, referring to FIG. 40, the input device 300 may include a function input unit 310, a coordinate detector 320, a unit image acquirer 330, a storage unit 340, a specification checking unit 350, a rendering selector 360, an object image obtaining unit 370, a use state checking unit 380, a communication interface unit 390, and a controller 391.

The function input unit 310 includes a scanner function selection button, a wheel button, and a left / right button centered on the wheel button.

When the user presses the scanner function selection button, the input device 300 generates a scan start request signal for entering the scan mode, or an edit request signal for switching from the scan mode to the edit mode.

In the case of the wheel button and the left / right button, a signal corresponding to the function assigned to each of the scan mode and the edit mode is generated.

The function input unit 310 may correspond to the function input unit 240 of FIGS. 1 and 2.

The coordinate detector 320 may detect a coordinate of a frame that is a unit image of the scan target object as the input device 300 moves. In one embodiment, a laser sensor may be used as the coordinate detector 320, but is not limited thereto.

The unit image obtaining unit 330 may obtain a unit image of the scan target object corresponding to the frame based on the detected coordinates.

The storage unit 340 may store unit images of the scan target object corresponding to the frame coordinates and the frame coordinates detected by the coordinate detector 320.

The specification checking unit 350 may check the specifications of the terminal 400 connected to the input device 300. In detail, the specification checking unit 350 may check the specifications of the graphic processing unit 440 included in the terminal 400.

The specification checking unit 350 may check the version of the application program interface of the terminal 400 as well as the specification of the graphic processing unit.

The rendering selector 360 may select hardware rendering or software rendering according to the specification of the terminal 400 confirmed by the specification checking unit 350.

The rendering selector 360 may also select hardware rendering or software rendering according to the use state of the graphic processing unit 440 and the use state of the central processing unit 450 confirmed by the use state checking unit 380, which will be described later.

The object image acquisition unit 370 may merge unit images of the scan target object according to a rendering method selected by the rendering selection unit 360 to obtain an image of one completed scan target object.

The use state checking unit 380 may check the use state of the graphic processing unit 440 and the central processing unit 450 of the terminal 400. This will be described later.

The communication interface unit 390 may transmit various input signals, images, and audio signals collected by the input device 300 to the communication interface unit 430 of the terminal 400 through wired or wireless communication.

The controller 391 may control the overall operation of the input device 300. Specific operations of the controller 391 will be described later.

Referring to FIG. 41, the terminal 400 may include a display unit 410, a scan UI generating unit 420, a communication interface unit 430, a graphic processing unit 440, a central processing unit 450, and a controller 460.

In an embodiment, the terminal 400 may be a computer, a digital TV, a portable terminal, or the like, but is not limited thereto.

The display unit 410 may generate driving signals by converting various image signals, data signals, OSD signals, and the like into R, G, and B signals, respectively. To this end, the display unit 410 may use a PDP, an LCD, an OLED, a flexible display, a 3D display, or the like, or may be configured as a touch screen so that it is used as an input device in addition to an output device.

The scan UI generator 420 generates a scan UI window and an edit UI window for displaying the implementation state of the scan driver program on the screen. The generated scan UI window and the edit UI window are displayed on the screen through the display unit 410, and the user controls the scan operation of the input device 300 through the scan UI window. In addition, various scan control commands are generated by operating various function setting buttons provided in the scan UI window. In addition, various editing control commands are generated by operating various function setting buttons provided in the editing UI window.

The communication interface unit 430 may perform wired or wireless communication with an external device to receive various input signals, video, audio, or data signals from the external device.

When the communication interface unit 430 performs wired communication with an external device, the communication interface unit 430 may be configured with serial, PS / 2, USB, etc., and when performing wireless communication with an external device, The communication interface unit 430 may include radio frequency identification (RFID), infrared data association (IrDA), ultra wideband (UWB), ZigBee, digital living network alliance (DLNA), and the like.

The graphic processor 440 may process an image signal received from the input device 300 to display an image of the scan target object on the display 410.

The graphic processor 440 may include a graphic controller 441 for controlling the overall operation of the graphic processor 440 and a graphic memory 443 capable of storing image signals received from the input device 300.

The central processing unit 450 may process operations of application programs that may be performed in the terminal 400.

The controller 460 may control the overall operation of the terminal 400. For example, the controller 460 may control the communication interface unit 430 to receive various input signals and various data transmitted from the outside, process the received input signals and data, and control the display unit 410 to display the processed signals or data. Specific operations of the controller 460 will be described later.

Next, an image processing method of an input device according to an embodiment of the present invention will be described with reference to FIGS. 42 to 44.

The input device 300 may communicate with the terminal 400 through wired or wireless, and the terminal 400 may include all the configurations of the display device 100 described with reference to FIG. 1.

First, the input device 300 operates in a mouse mode (S4101).

Thereafter, the function input unit 310 of the input device 300 receives a scan start request signal (S4103).

Accordingly, the input device 300 operates in the scan mode (S4105). That is, the input device 300 may be placed in a state capable of scanning the scan target object.

The coordinate detector 320 of the input device 300 detects coordinates of a frame that is a unit image of the scan target object as the input device 300 moves (S4107).

The storage unit 340 of the input device 300 stores the detected coordinates (S4109).

The unit image obtaining unit 330 of the input device obtains a unit image of the scan target object corresponding to the frame based on the detected coordinates in operation S4111. As the input device 300 scans the object to be scanned, the unit image acquisition unit 330 of the input device 300 may acquire unit images corresponding to a plurality of frames.

Thereafter, the specification confirming unit 350 of the input device 300 checks the specifications of the terminal 400 (S4113). In detail, the specification checking unit 350 may check the specifications of the graphic processor 440 of the terminal 400 connected to the input device 300. The graphic processor 440 converts the image data received from the input device 300 into an image signal so as to output an image corresponding to the scan target object to the display unit 410 of the terminal 400. Here, the graphic processor 440 may mean a graphic card.

In an embodiment, the specification checker 350 may check the version of the application program interface of the terminal 400 as well as the specification of the graphic processor 440. Here, the application program interface may mean Open GL (Open Graphics Library), a standard application program interface for 2D and 3D graphics.

Thereafter, the controller 391 of the input device 300 checks whether the confirmed specification of the terminal 400 is greater than or equal to the reference specification (S4115). In an exemplary embodiment, as shown in FIG. 43, the reference specification may be a graphic processing unit 440 memory of 384M or more when the graphic processing unit 440 of the terminal 400 is built in, and a memory of 128M or more when the graphic processing unit 440 is external, but this is merely an example. In addition, the reference specification may mean that the Open GL version of the terminal 400 is 1.4 or higher, but this is also only an example.

If the confirmed specification of the terminal 400 is greater than or equal to the reference specification, the rendering selector 360 of the input device 300 selects hardware rendering (S4117). Here, rendering refers to a process of generating a final screen in two dimensions by processing a unit image corresponding to a plurality of frames in consideration of already set modeling, motion, camera, texture mapping, and lighting.

Rendering can be performed in various ways, such as radiosity, ray tracing, scanline, and Phong, depending on the mathematical algorithm used. The input device 300 according to another embodiment of the present invention may use any of these rendering schemes.

Radiosity is a method that regards the surface of the object to be scanned as a temporary light source and calculates the reflected and refracted light according to the geometrical characteristics of the surface, producing the most photorealistic results.

Ray tracing is a way of calculating the actual motion of a ray starting from a light source and reaching the user's eye.

The scanline method renders a scene as a series of scanlines generated from the top to the bottom, and has an advantage of obtaining a completed image at high speed.

Phong is a method of obtaining a completed image by separately calculating the color of each pixel of the object to be scanned.

In addition, the rendering method may be classified into a software rendering method and a hardware rendering method depending on whether software or hardware is used.

Hardware rendering is a method of obtaining one completed image by merging the unit images corresponding to the plurality of frames acquired by the input device 300 using the memory of the graphic processor 440. It is an efficient way to improve performance, since the terminal 400 can perform multitasking in the meantime.

Software rendering refers to a method of acquiring one completed image by merging the unit images corresponding to the plurality of frames through the central processing unit 450, without the help of the graphic processor 440.

As described above, when the confirmed specification of the terminal 400 is greater than or equal to the reference specification, the rendering selector 360 selects hardware rendering.

Thereafter, the object image obtaining unit 370 of the input device 300 merges the unit images corresponding to the obtained plurality of frames through hardware rendering to obtain an image of the scan target object (S4119).

If the confirmed specification of the terminal 400 is less than the reference specification, the rendering selector 360 of the input device 300 selects software rendering (S4121).

Thereafter, the object image obtaining unit 370 of the input device 300 merges the unit images corresponding to the obtained plurality of frames through software rendering to obtain an image of the scan target object (S4123).

As such, according to the control method of the input device 300 according to another embodiment of the present invention, either hardware rendering or software rendering can be selected depending on whether the graphic processor 440 included in the terminal 400 connected to the input device 300 meets the reference specification and on the OpenGL version of the terminal 400.

Therefore, when the graphic processor of the terminal 400 cannot perform image processing, or when its image-processing performance is insufficient, the image of the scan target object can be obtained stably by using software rendering instead of hardware rendering. That is, by selecting between hardware rendering and software rendering according to whether the graphic processor 440 included in the terminal 400 connected to the input device 300 meets the reference specification and according to the OpenGL version, the problem of degraded image quality of the scan target object can be solved, the speed of image merging can be increased, and the scan range can be extended beyond the limited capacity of the graphic memory 443 included in the graphic processor 440.

Referring to FIG. 42 again, the use state checking unit 380 of the input device 300 checks the use state of the graphic processor 440 of the terminal 400 (S4125). The graphic processor 440 may include a graphic controller 441 and a graphic memory 443, and the use state of the graphic processor 440 may mean the current usage rate of the graphic controller 441 and the current usage rate of the graphic memory 443. In an embodiment, the current usage rate of the graphic memory 443 may refer to the currently used capacity of the graphic memory 443 (or, equivalently, its current remaining capacity).

In one embodiment, the use state checking unit 380 may check the use state of the graphic processor 440 in real time, or may check it at predetermined time intervals.
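As a toy illustration of checking the use state at predetermined time intervals, a polling loop could look like the sketch below; `query_gpu_usage` is a hypothetical stand-in for whatever platform API reports the usage rates, and the random values are placeholders only.

```python
# Minimal sketch of sampling the GPU use state at a fixed interval (S4125).
# query_gpu_usage() is a hypothetical stand-in for a platform query API;
# random values are placeholders for real usage figures.
import random
import time

def query_gpu_usage():
    # Hypothetical: return (controller %, memory %) from the platform API.
    return random.uniform(0, 100), random.uniform(0, 100)

def poll_use_state(interval_s=1.0, samples=3):
    for _ in range(samples):
        controller, memory = query_gpu_usage()
        print(f"graphic controller {controller:.0f}%, graphic memory {memory:.0f}%")
        time.sleep(interval_s)

poll_use_state(interval_s=0.5)
```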

Thereafter, the controller 391 of the input device 300 checks whether the checked use state of the graphic processor 440 exceeds a reference use state (S4127). According to an embodiment, the reference use state of the graphic processor 440 may mean a state in which the current usage rate of the graphic controller 441 is 40% and the current usage rate of the graphic memory 443 is 50%, but this is merely an example. Depending on the setting, the reference use state may also be defined by only one of the two conditions: a current usage rate of 40% for the graphic controller 441, or a current usage rate of 50% for the graphic memory 443.
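A sketch of the reference-use-state comparison in step S4127 might look like this; the 40% and 50% thresholds follow the example above, the `require_both` switch reflects the setting in which only one of the two conditions is used, and all names are illustrative assumptions.

```python
# Minimal sketch of the reference-use-state check (S4127).
# Thresholds follow the example in the text; names are illustrative.
CONTROLLER_REF_RATE = 40.0   # graphic controller 441, percent
MEMORY_REF_RATE = 50.0       # graphic memory 443, percent

def exceeds_reference_use(controller_rate, memory_rate, require_both=True):
    controller_busy = controller_rate > CONTROLLER_REF_RATE
    memory_busy = memory_rate > MEMORY_REF_RATE
    # Depending on the setting, either both conditions or just one
    # may define the reference use state.
    return (controller_busy and memory_busy) if require_both \
           else (controller_busy or memory_busy)

print(exceeds_reference_use(20, 30))  # False: hardware rendering is kept
print(exceeds_reference_use(50, 60))  # True: the CPU is checked next
```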

If the checked use state of the graphic processor 440 does not exceed the reference use state, the input device 300 maintains step S4119 and obtains an image of the scan target object through hardware rendering. Specifically, referring to FIG. 44, when the current usage rate of the graphic controller 441 is 20% and the usage rate of the graphic memory 443 is 30%, the input device 300 determines that the use state of the graphic processor 440 does not exceed the reference use state, and may obtain an image of the scan target object through hardware rendering.

That is, the input device 300 may determine that the capacity of the graphic processor 440 is sufficient to process the image of the scan target object, and may acquire the image of the scan target object through hardware rendering.

Referring again to FIG. 42, if the confirmed use state of the graphic processor 440 exceeds the reference use state, the use state checking unit 380 checks the use state of the central processing unit 450 of the terminal 400 (S4131). Specifically, referring to FIG. 44, when the usage rate of the graphic controller 441 is 50% and the usage rate of the graphic memory 443 is 60%, the input device 300 determines that the use state of the graphic processor 440 exceeds the reference use state, and the use state checking unit 380 may then check the current usage rate of the central processing unit 450.

In an embodiment, the use state checking unit 380 may check the use state of the central processing unit 450 of the terminal 400 in real time, or may check it at predetermined time intervals.

Referring to FIG. 42 again, the controller 391 of the input device 300 checks whether the current usage rate of the central processing unit 450 is equal to or greater than a reference usage rate, based on the confirmed use state of the central processing unit 450 (S4133).

If the current usage rate of the central processing unit 450 is greater than or equal to the reference usage rate, the input device 300 maintains step S4119 and acquires an image of the scan target object through hardware rendering. In one embodiment, the reference usage rate of the central processing unit 450 may be 70%, but this is only an example.

Specifically, referring to FIG. 44, when the current usage rate of the graphic controller 441 is 50%, the usage rate of the graphic memory 443 is 60%, and the current usage rate of the central processing unit 450 is 80% or more, the input device 300 may acquire the image of the scan target object using hardware rendering. That is, even when the current use state of the graphic processor 440 exceeds the reference use state, the input device 300 uses hardware rendering when the usage rate of the central processing unit 450 is equal to or greater than the reference usage rate, thereby preventing the central processing unit 450 from being overloaded.

Referring to FIG. 42 again, if the current usage rate of the central processing unit 450 is less than the reference usage rate, the input device 300 selects software rendering (S4121) and then acquires an image of the scan target object through software rendering (S4123). Specifically, referring to FIG. 44, when the current usage rate of the graphic controller 441 is 50%, the current usage rate of the graphic memory 443 is 60%, and the current usage rate of the central processing unit 450 is 30%, the input device 300 obtains the image of the scan target object through software rendering. That is, even if the specification of the graphic processor 440 satisfies the reference specification, when the use state of the graphic processor 440 exceeds the reference use state, the input device 300 checks the usage rate of the central processing unit 450 and may obtain the image of the scan target object through software rendering, thereby improving the computation speed of the image processing and extending the scan range beyond the limited capacity of the graphic processor 440.
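Putting the steps of FIG. 42 together, the overall selection logic could be sketched as follows. This builds on the illustrative `TerminalSpec`, `meets_reference_spec`, and `exceeds_reference_use` definitions from the sketches above; the CPU threshold of 70% follows the example in the text, and none of this is the disclosed implementation.

```python
# Minimal sketch of the overall rendering selection of FIG. 42.
# Reuses the illustrative helpers defined in the earlier sketches;
# the CPU reference usage rate of 70% follows the example in the text.
CPU_REF_RATE = 70.0

def select_rendering(spec, controller_rate, memory_rate, cpu_rate):
    # S4115: below the reference specification -> software rendering.
    if not meets_reference_spec(spec):
        return "software"
    # S4127: GPU not busier than the reference use state -> keep hardware.
    if not exceeds_reference_use(controller_rate, memory_rate):
        return "hardware"
    # S4133: GPU is busy; fall back to software only if the CPU has room,
    # so an already-loaded CPU is not overloaded further.
    return "software" if cpu_rate < CPU_REF_RATE else "hardware"

# Examples mirroring the values discussed around FIG. 44:
spec_ok = TerminalSpec(512, False, 2.0)
print(select_rendering(spec_ok, 20, 30, 10))  # hardware
print(select_rendering(spec_ok, 50, 60, 80))  # hardware (CPU too busy)
print(select_rendering(spec_ok, 50, 60, 30))  # software
```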

The control method of the input device according to the embodiments of the present invention described above may be implemented as a program to be executed on a computer and stored in a computer-readable recording medium. Examples of the computer-readable recording medium include ROM, RAM, CD-ROMs, magnetic tapes, floppy disks, and optical data storage devices, and also include media implemented in the form of carrier waves (e.g., transmission over the Internet).

The computer-readable recording medium may also be distributed over networked computer systems so that the computer-readable code is stored and executed in a distributed manner. Functional programs, codes, and code segments for implementing the above method can be easily inferred by programmers skilled in the technical field to which the present invention belongs.

In addition, although preferred embodiments of the present invention have been shown and described above, the present invention is not limited to the specific embodiments described above, and various modifications can be made by those skilled in the technical field to which the present invention belongs without departing from the spirit of the invention claimed in the claims; such modifications should not be understood separately from the technical spirit or scope of the present invention.

Claims (15)

An input device having a scan function, the input device comprising:
A unit image obtaining unit configured to obtain unit images of a scan target object;
A specification checking unit configured to check a specification of a terminal communicating with the input device;
A rendering selector configured to select either hardware rendering or software rendering according to the checked specification; and
An object image obtaining unit configured to obtain an image of the scan target object by merging the obtained unit images according to the selected rendering,
Input device.
The input device according to claim 1,
wherein the specification checking unit
checks a specification of a graphic processor included in the terminal
Input device.
The input device according to claim 2,
further comprising a controller configured to check whether the specification of the graphic processor is greater than or equal to a reference specification
Input device.
The input device according to claim 3,
wherein the controller
checks whether the memory and the OpenGL version of the graphic processor are greater than or equal to reference values
Input device.
The input device according to claim 3,
wherein the rendering selector
selects the hardware rendering when the checked specification of the graphic processor is greater than or equal to the reference specification, and
selects the software rendering when the checked specification of the graphic processor is less than the reference specification
Input device.
The input device according to claim 1,
further comprising a use state checking unit configured to check a use state of the terminal
Input device.
The input device according to claim 6,
wherein the use state checking unit
checks a use state of a graphic processor and a central processing unit included in the terminal, and
the rendering selector
selects either the hardware rendering or the software rendering according to whether the use state of the graphic processor and the central processing unit is equal to or greater than a reference use state,
Input device.
In a control method of an input device having a scan function, the method comprising:
Obtaining unit images of a scan target object;
Checking a specification of a terminal communicating with the input device;
Selecting either hardware rendering or software rendering according to the checked specification; and
Obtaining an image of the scan target object by merging the obtained unit images according to the selected rendering,
Input device control method.
The method according to claim 8,
wherein the checking of the specification of the terminal comprises
checking whether a specification of a graphic processor included in the terminal is equal to or greater than a reference specification.
Input device control method.
The method according to claim 9,
wherein the selecting comprises:
selecting the hardware rendering when the specification of the graphic processor is greater than or equal to the reference specification; and
selecting the software rendering when the specification of the graphic processor is less than the reference specification.
Input device control method.
The method according to claim 8,
further comprising checking a use state of the terminal.
Input device control method.
The method according to claim 11,
wherein the checking of the use state of the terminal comprises
checking a use state of a graphic processor and a central processing unit of the terminal.
Input device control method.
The method according to claim 12,
wherein the selecting comprises
selecting either the hardware rendering or the software rendering according to whether the use state of the graphic processor and the central processing unit is equal to or greater than a reference use state.
Input device control method.
The method according to claim 13,
wherein the selecting comprises
selecting the software rendering when the use state of the graphic processor is greater than or equal to the reference use state and the usage rate of the central processing unit is less than a reference usage rate.
Input device control method.
The method according to claim 13,
wherein the selecting comprises
selecting the hardware rendering when the use state of the graphic processor is greater than or equal to the reference use state and the usage rate of the central processing unit is greater than or equal to the reference usage rate.
Input device control method.


KR1020120105073A 2012-08-10 2012-09-21 Input device and controlling method there of KR20140038681A (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
KR1020120105073A KR20140038681A (en) 2012-09-21 2012-09-21 Input device and controlling method there of
US13/829,267 US9113053B2 (en) 2012-08-10 2013-03-14 Input apparatus and method for acquiring a scan image
JP2013054261A JP5827259B2 (en) 2012-08-10 2013-03-15 Input device and control method thereof
EP13001342.8A EP2696566A1 (en) 2012-08-10 2013-03-15 Handheld scanning apparatus and control method thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020120105073A KR20140038681A (en) 2012-09-21 2012-09-21 Input device and controlling method there of

Publications (1)

Publication Number Publication Date
KR20140038681A true KR20140038681A (en) 2014-03-31

Family

ID=50646861

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020120105073A KR20140038681A (en) 2012-08-10 2012-09-21 Input device and controlling method there of

Country Status (1)

Country Link
KR (1) KR20140038681A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017122920A1 (en) * 2016-01-13 2017-07-20 삼성전자 주식회사 Content display method and electronic device for performing same
US10960295B2 (en) 2016-01-13 2021-03-30 Samsung Electronics Co., Ltd. Content display method and electronic device for performing same

Similar Documents

Publication Publication Date Title
JP6946188B2 (en) Methods and equipment for multi-technology depth map acquisition and fusion
KR102423175B1 (en) An apparatus for editing images using depth map and a method thereof
JP5109803B2 (en) Image processing apparatus, image processing method, and image processing program
EP3032815B1 (en) Scanning technology
KR102423295B1 (en) An apparatus for composing objects using depth map and a method thereof
JP2014071850A (en) Image processing apparatus, terminal device, image processing method, and program
JP5827259B2 (en) Input device and control method thereof
CN103034042A (en) Panoramic shooting method and device
JP4957327B2 (en) Display control device
KR101809750B1 (en) A Method for editting a scan image, display apparatus thereof
KR20140038681A (en) Input device and controlling method there of
CN114079726A (en) Shooting method and equipment
CN102843480B (en) Scanning technology
KR20140027675A (en) Input device and controlling method there of
US20160125629A1 (en) Divided Electronic Image Transmission System and Method
KR20140021337A (en) Input device and scan image acquiring method there of
KR20220005283A (en) Electronic device for image improvement and camera operation method of the electronic device
KR20170065160A (en) Projector and method for operating thereof
KR20140014881A (en) Input device and information transmitting method there of
EP2816794B1 (en) Image processing device and image processing method
US20160125850A1 (en) Networked Divided Electronic Image Messaging System and Method
JP2019041188A (en) Image processing apparatus, imaging apparatus, control method of image processing apparatus, and program
CN108293107A (en) Display processing unit, display processing method and the computer-readable medium for executing display processing method
JP2018006803A (en) Imaging apparatus, control method for imaging apparatus, and program
KR101827763B1 (en) A Method for displaying a scan image, display apparatus thereof and a method for pointing a scan area, input apparatus thereof

Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
E601 Decision to refuse application