KR20100011336A - Information processing apparatus and method for moving image thereof - Google Patents

Information processing apparatus and method for moving image thereof

Info

Publication number
KR20100011336A
KR20100011336A
Authority
KR
South Korea
Prior art keywords
touch
input
drag
image
drag operation
Prior art date
Application number
KR1020080072496A
Other languages
Korean (ko)
Inventor
곽중영
Original Assignee
한국단자공업 주식회사
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 한국단자공업 주식회사
Priority to KR1020080072496A
Publication of KR20100011336A

Links

Images

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00 Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16 Constructional details or arrangements
    • G06F1/1613 Constructional details or arrangements for portable computers
    • G06F1/1626 Constructional details or arrangements for portable computers with a single-body enclosure integrating a flat display, e.g. Personal Digital Assistants [PDAs]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00 Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16 Constructional details or arrangements
    • G06F1/1613 Constructional details or arrangements for portable computers
    • G06F1/1633 Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F1/1637 Details related to the display arrangement, including those related to the mounting of the display in the housing
    • G06F1/1643 Details related to the display arrangement, including those related to the mounting of the display in the housing, the display being associated to a digitizer, e.g. laptops that can be used as penpads
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0486 Drag-and-drop
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser, using a touch-screen or digitiser, e.g. input of commands through traced gestures

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • Position Input By Displaying (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present invention relates to an information processing apparatus having a touch screen that can easily move an image to a desired position while minimizing touch-operation input, and to a screen moving method thereof. The apparatus includes a display unit 40 for outputting an image, a touch sensing unit 30 for sensing input by a user's touch operation, and a controller 10 that, when a drag operation is input to the touch sensing unit 30, continuously moves the image output to the display unit 40 in the direction of the drag operation, even after the drag operation is completed, until a separate touch operation is input to the touch sensing unit 30. According to the present invention, the convenience of the information processing device is increased because the image can be moved easily to a desired position while minimizing the number or duration of the user's touch operations.

Description

Information processing device and its screen moving method {INFORMATION PROCESSING APPARATUS AND METHOD FOR MOVING IMAGE THEREOF}

The present invention relates to an information processing apparatus and a screen moving method thereof, and more particularly, to an information processing apparatus having a touch screen that, when outputting an image larger than the screen size, allows the image to be moved easily to a desired position while minimizing touch-operation input, and to a screen moving method thereof.

With the recent increase in the popularity of portable terminals, the provision of content through portable terminals is increasing. However, since portable terminals are kept small for portability, the display devices they provide are also limited in size.

Because various types of content must be presented on a small display device, a convenient screen scrolling function is increasingly important. For convenient scrolling, the image should move to the correct position while minimizing the number or duration of user inputs.

In particular, since the map image displayed on a navigation device is very large compared to other content, an even more convenient screen scrolling function is required.

However, in a conventional navigation device that supports moving a map image by touch input, when the user inputs a drag operation on the touch screen, the image moves only by a distance corresponding to the distance over which the drag was performed. To move the image a long way, the drag operation must therefore be repeated many times.

Alternatively, when a screen scroll button is displayed on the touch screen, the user has had to touch the button repeatedly or keep it held down, again requiring many touch operations or a long touch time.

Accordingly, the present invention has been made to solve the above-mentioned conventional problems, and an object of the present invention is to provide an information processing apparatus having a touch screen that can easily move an image to a desired position while minimizing the number or duration of the user's touch operations, and a screen moving method using the same.

According to a feature of the present invention for achieving the above object, the present invention includes: a display unit for outputting an image; a touch input unit provided on an upper surface of the display unit to sense input by a user's touch operation; and a control unit that, when a drag operation is input to the touch input unit, continuously moves the image output to the display unit in the direction of the drag operation, even after the input of the drag operation is completed, until a separate touch operation is input to the touch input unit.

Here, the control unit may calculate the direction of the drag operation using the coordinates of its start and end points, and may move the image at a speed proportional to the speed of the drag operation, calculated from the distance between the start and end points and the input time of the drag operation.

The controller may recognize a touch operation as a drag operation when the distance between its start point and end point is greater than or equal to a set distance.
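The distance-threshold test described above can be sketched as follows. This is an illustrative example only, not code from the disclosure; the 10-pixel default threshold and the (x, y) coordinate tuples are assumptions.

```python
import math

def is_drag(start, end, min_distance=10.0):
    """Classify a touch as a drag when the straight-line distance
    between its start and end points meets a set threshold.
    The 10-pixel default is an assumed, illustrative value."""
    dx = end[0] - start[0]
    dy = end[1] - start[1]
    return math.hypot(dx, dy) >= min_distance

# A short movement counts as a simple touch, a longer one as a drag.
print(is_drag((0, 0), (3, 4)))    # distance 5  -> False
print(is_drag((0, 0), (30, 40)))  # distance 50 -> True
```

Any movement under the threshold would be treated as the "separate touch operation" that stops a scroll already in progress.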

In this case, the separate touch operation may be any type of touch operation first detected by the touch input unit after the input of the drag operation is completed, or it may be restricted to the first touch operation detected after the drag operation is completed that is not itself a drag operation.

The controller may move the image according to the last drag operation when a plurality of drag operations are continuously input to the touch input unit.

Meanwhile, the present invention includes the steps of: (A) detecting a first touch operation; (B) determining whether the first touch operation is a drag operation; (C) moving the image displayed on the screen according to the input direction of the first touch operation when it is a drag operation; (D) detecting a second touch operation; and (E) stopping the movement of the image.

The step (B) may include: (B1) recognizing the coordinates of the start point and the end point of the first touch operation; (B2) calculating the distance between the start point and the end point; and (B3) determining the first touch operation to be a drag operation when the distance is greater than or equal to the set distance.

Also, the step (C) may include: (C1) calculating a vector between the coordinates of the start point and the end point of the first touch operation; and (C2) moving the image according to the direction of the vector. Alternatively, it may include: (C3) determining an input direction by calculating a vector between the coordinates of the start point and the end point of the first touch operation; (C4) calculating an input speed using the input time and the distance between the start point and the end point; and (C5) moving the image in the input direction of the first touch operation at a speed proportional to the input speed.
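Steps (C3) and (C4) amount to normalizing the start-to-end vector and dividing the drag distance by its input time. A minimal sketch, with pixel coordinates and a duration in seconds as assumed units:

```python
import math

def drag_vector(start, end, duration_s):
    """Return (unit_direction, speed) for a drag: the direction is the
    normalized start-to-end vector (step C3), the speed is the
    start-to-end distance divided by the input time (step C4)."""
    dx = end[0] - start[0]
    dy = end[1] - start[1]
    distance = math.hypot(dx, dy)
    if distance == 0 or duration_s <= 0:
        return (0.0, 0.0), 0.0          # degenerate input: no movement
    direction = (dx / distance, dy / distance)
    speed = distance / duration_s        # pixels per second
    return direction, speed

# 60 px right and 80 px down over 0.5 s:
# unit vector (0.6, 0.8), speed 200 px/s.
direction, speed = drag_vector((100, 100), (160, 180), 0.5)
```

Step (C5) then moves the image along `direction` at a rate proportional to `speed`; the proportionality constant is left to the implementation.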

The second touch operation of step (D) may be the touch operation first detected after the first touch operation is completed, or the first touch operation detected after the first touch operation is completed that is not a drag operation.

In this case, the drag operation may mean a touch operation in which the distance between its start point and end point is greater than or equal to a set distance.

As described in detail above, according to the information processing apparatus and the screen moving method according to the present invention, the following effects can be expected.

That is, there is an advantage in that the convenience of use of the information processing device is increased by easily moving an image to a desired position while minimizing the number or time of the user's touch operation.

Hereinafter, an information processing apparatus and a screen moving method thereof according to a specific embodiment of the present invention will be described in detail.

FIG. 1 is an exemplary view showing a method of inputting a screen movement signal according to a specific embodiment of the present invention, FIG. 2 is a block diagram showing the schematic configuration of an information processing apparatus according to a specific embodiment of the present invention, and FIG. 3 is a flowchart illustrating, step by step, a screen moving method according to a specific embodiment of the present invention.

As shown in FIG. 1, in an information processing device such as a navigation device that receives user commands through a touch screen, when the size of the output image is larger than the available screen size, the user may input a drag on the touch screen to move the image.

In this case, unlike the conventional method in which the image moves only by a distance corresponding to the input drag operation, in the present invention, when the user inputs a drag operation toward the right side of the information processing apparatus as shown in FIG. 1, the image continues to move in the same direction even after the input of the drag operation is completed, that is, even after the drag is released. When a new touch input from the user is detected, the movement of the image is terminated.

An information processing apparatus supporting such a screen moving method includes a controller 10, as shown in FIG. 2. The controller 10 controls the overall operation of the information processing device.

The information processing apparatus also includes a storage unit 20, which stores the various contents to be output, such as text data, image data, and program data; the controller 10 reads this data from the storage unit 20 in order to output it.

Meanwhile, the information processing device includes a touch input unit 30. The touch input unit 30 is a human interface that detects a user's touch operation and receives it as an input command.

When the user's touch operation is detected by the touch input unit 30, it is converted into a signal and transmitted to the controller 10, which receives the signal and executes the corresponding command.

In addition, the information processing apparatus is provided with a display unit 40, an image output unit that processes and outputs data according to the commands and processing results of the controller 10. The touch input unit 30 may be provided parallel to and on the upper surface of the display unit 40, so that the content output to the display unit 40 changes in response to input received through the touch input unit 30. The display unit 40 may also display a distinct area for receiving commands through the touch input unit 30, making the user's touch input easier.

The controller 10 reads data from the storage unit 20 according to a command input to the touch input unit 30 and outputs the result to the display unit 40. When the image read from the storage unit 20 and output to the display unit 40 is larger than the display unit 40 itself, as when a navigation device outputs a map image, the user inputs a directional drag operation to the touch input unit 30 to move the image.
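Displaying an image larger than the screen means the display unit shows a movable viewport into the full image. As an illustrative sketch (not from the disclosure; the sizes and the top-left offset convention are assumptions), the scroll offset would be clamped so the viewport never leaves the image:

```python
def clamp_offset(offset, image_size, screen_size):
    """Clamp a viewport offset (top-left corner, in pixels) so the
    visible window stays within an image larger than the display."""
    x = min(max(offset[0], 0), image_size[0] - screen_size[0])
    y = min(max(offset[1], 0), image_size[1] - screen_size[1])
    return (x, y)

# A 2000x2000 map on an 800x480 display: the offset may range
# from (0, 0) to (1200, 1520); out-of-range requests are clamped.
print(clamp_offset((5000, -30), (2000, 2000), (800, 480)))  # (1200, 0)
```

The continuous movement described below then advances this offset each frame until a stopping touch arrives.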

At this time, when the user inputs a touch operation in the area of the touch input unit 30 where the image is output, the controller 10 determines whether the input touch operation is a drag operation. As one example of a criterion, the controller 10 may identify the touch operation as a drag when the distance between its start point and end point is greater than or equal to a preset distance. Alternatively, the touch operation may be identified as a drag when its input time exceeds a preset time.

Meanwhile, when the controller 10 receives the touch operation as described above and determines that it is a drag operation, it moves the image displayed on the display unit 40 according to the input direction of the drag operation.

In particular, the controller 10 continues to move the image in the direction in which the drag operation is input even after the user's drag operation is completed.

In this case, the controller 10 may compute a vector between the coordinates of the start point and the end point of the drag operation to determine its input direction, and may use this vector to determine the direction in which to move the image.

The controller 10 may also move the image at a speed proportional to the input speed of the drag operation. The input speed is calculated from the distance between the start point and the end point of the drag operation and the time taken to input it, and can be used to determine the speed of image movement.

Meanwhile, while the controller 10 continues moving the image after detecting the drag operation, the touch input unit 30 monitors whether a new touch operation is input after the completion of the drag.

When a new touch operation is input by the touch input unit 30, the controller 10 stops moving the image on the display unit 40.

In this case, the new touch operation may be any type of touch operation first detected after the input of the drag operation is completed. For example, even if a separate drag operation is input after the first drag is completed, the image movement may be stopped.

Alternatively, the new touch operation may be restricted to the first non-drag touch operation detected after the drag is completed. That is, while the image is moving in response to a completed drag operation, a simple touch other than a drag stops the movement; but if a new drag operation is input on the output region while the image is moving, the movement direction or speed is changed to the direction or speed corresponding to the newly input drag.

That is, when a plurality of drag operations are continuously input to the touch input unit 30, the image may be moved according to the direction of the last input drag, and its movement may be stopped only when a touch operation other than a drag is input.
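The behaviour of this second variant ("last drag wins, only a plain touch stops") can be sketched as a small state holder. This is an illustrative sketch only; the class, its attribute names, and the event interface are assumptions, not part of the disclosure:

```python
class ScrollController:
    """Sketch of the described behaviour: a drag starts continuous
    movement, a later drag replaces the direction and speed, and any
    non-drag touch stops the movement."""

    def __init__(self):
        self.moving = False
        self.direction = (0.0, 0.0)   # unit vector
        self.speed = 0.0              # pixels per second

    def on_touch(self, is_drag, direction=(0.0, 0.0), speed=0.0):
        if is_drag:
            # A new drag, even while scrolling, retargets the movement.
            self.moving = True
            self.direction = direction
            self.speed = speed
        else:
            # Any simple (non-drag) touch halts the continuous movement.
            self.moving = False
            self.speed = 0.0

ctrl = ScrollController()
ctrl.on_touch(True, (1.0, 0.0), 200.0)   # drag right: scrolling starts
ctrl.on_touch(True, (0.0, 1.0), 100.0)   # second drag: new direction/speed
ctrl.on_touch(False)                     # plain tap: scrolling stops
```

In the first variant of the claims, the `is_drag` branch would also stop the movement, since any touch ends the scroll.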

Meanwhile, describing the screen moving method of the information processing apparatus according to a specific embodiment of the present invention step by step, as shown in FIG. 3, the method starts from the step in which a drag input is detected in the area of the touch input unit 30 corresponding to the area where the image is output on the display unit 40 (S100).

In step S100, to determine whether the touch operation input to the touch input unit 30 is a drag operation, the controller 10 detects the coordinates of the start point and the end point of the operation; if the distance between the two coordinates is greater than or equal to a preset distance, the input touch operation is recognized as a drag operation.

The controller 10 then reads the coordinates of the start point and the end point of the drag operation input in step S100 in order to move the image output to the display unit 40 (S200), and determines the direction in which to move the image by calculating a vector between the two coordinates (S300). Here, the start point is the point on the touch input unit 30 at which the drag operation began, and the end point is the point at which the touch was released.

In addition, the speed at which to move the image is determined by calculating the speed at which the drag operation was input, using the distance between the start and end coordinates of the drag and the time taken to input it (S400).

The controller 10 moves the image output to the display unit 40 using the direction and speed determined in steps S300 and S400 (S500). Since the controller 10 does not calculate in advance a distance over which to move the image, the image does not move by only a fixed distance but moves continuously at the speed determined in step S400.

While the image is moving continuously in step S500, when a new touch operation is input to the touch input unit 30 (S600), the controller 10 stops the movement of the image in the state displayed on the display unit 40 at the moment the new touch operation is input (S700).

If the touch operation input in step S600 is a drag operation and image movement is not configured to stop on a drag, the movement direction of the image may be changed according to the input direction of the new drag; the speed of the new drag may also be calculated to change the speed of image movement.

As described above, when image movement is not stopped by a drag operation, the movement of the map may be stopped only when a simple touch operation is input.
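The whole S100-S700 flow can be sketched end to end as a single function. This is an illustrative sketch under assumed conventions (pixel coordinates, a 10-pixel drag threshold, 60 frames per second, and a stopping touch modeled as a frame index), not the patented implementation:

```python
import math

def run_scroll(drag, frame_count, stop_after=None, min_distance=10.0):
    """Sketch of FIG. 3: classify the input (S100), derive the movement
    direction (S300) and speed (S400), then advance the image a fixed
    step per frame (S500) until a new touch arrives (S600-S700).

    drag: ((sx, sy), (ex, ey), duration_s) for the input drag operation.
    stop_after: frame index at which a new touch is detected, or None.
    Returns the accumulated image offset in pixels at 60 fps."""
    (sx, sy), (ex, ey), duration = drag
    dx, dy = ex - sx, ey - sy
    distance = math.hypot(dx, dy)
    if distance < min_distance:                   # not a drag: no movement
        return (0.0, 0.0)
    direction = (dx / distance, dy / distance)    # S300: unit vector
    speed = distance / duration                   # S400: pixels per second
    dt = 1.0 / 60.0
    off_x = off_y = 0.0
    for frame in range(frame_count):              # S500: constant movement
        if stop_after is not None and frame >= stop_after:
            break                                 # S600-S700: new touch stops it
        off_x += direction[0] * speed * dt
        off_y += direction[1] * speed * dt
    return (off_x, off_y)

# A 120 px rightward drag over 0.5 s scrolls at 240 px/s; a touch at
# frame 30 (0.5 s in) leaves the image offset by about 120 px.
offset = run_scroll(((0, 0), (120, 0), 0.5), frame_count=60, stop_after=30)
```

Note the contrast with conventional scrolling: the accumulated offset depends on when the stopping touch arrives, not on the drag distance itself.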

The scope of the present invention is not limited to the embodiments described above but is defined by the claims, and it is self-evident that those skilled in the art can make various modifications and adaptations within the scope of the claims.

FIG. 1 is a diagram illustrating a method of inputting a screen movement signal according to a specific embodiment of the present invention.

FIG. 2 is a block diagram showing the schematic configuration of an information processing apparatus according to a specific embodiment of the present invention.

FIG. 3 is a flowchart illustrating a screen moving method according to a specific embodiment of the present invention.

** Description of the symbols for the main parts of the drawings **

10: control unit 20: storage unit

30: touch input unit 40: display unit

Claims (15)

1. An information processing apparatus comprising: a display unit for outputting an image; a touch input unit configured to sense input by a user's touch operation; and a control unit that, when a drag operation is input to the touch input unit, continuously moves the image output to the display unit according to the direction of the drag operation, even after the drag operation is completed, until a separate touch operation is input to the touch input unit.

2. The apparatus of claim 1, wherein the control unit calculates the direction of the drag operation using the coordinates of the start point and the end point of the drag operation.

3. The apparatus of claim 1, wherein the control unit moves the image at a speed proportional to the speed of the drag operation.

4. The apparatus of claim 3, wherein the control unit calculates the speed of the drag operation using the distance between the start and end points of the drag operation and the input time of the drag operation.

5. The apparatus of claim 1, wherein the control unit recognizes a touch operation as a drag operation when the distance between the start point and the end point of the touch operation is greater than or equal to a set distance.

6. The apparatus of claim 5, wherein the separate touch operation is any touch operation first detected by the touch input unit after the drag operation is completed.

7. The apparatus of claim 5, wherein the separate touch operation is the first touch operation, other than a drag operation, detected by the touch input unit after the input of the drag operation is completed.

8. The apparatus of claim 7, wherein the control unit moves the image according to the last drag operation when a plurality of drag operations are continuously input to the touch input unit.
9. A method of moving a screen of an information processing apparatus, comprising the steps of: (A) detecting a first touch operation; (B) determining whether the first touch operation is a drag operation; (C) moving the image displayed on the screen according to the input direction of the first touch operation when the first touch operation is a drag operation; (D) detecting a second touch operation; and (E) stopping the movement of the image.

10. The method of claim 9, wherein step (B) comprises: (B1) recognizing the coordinates of the start point and the end point of the first touch operation; (B2) calculating the distance between the start point and the end point; and (B3) determining the first touch operation to be a drag operation if the distance is greater than or equal to the set distance.

11. The method of claim 9, wherein step (C) comprises: (C1) calculating a vector between the coordinates of the start point and the end point of the first touch operation; and (C2) moving the image according to the direction of the vector.

12. The method of claim 9, wherein step (C) comprises: (C3) determining an input direction by calculating a vector between the coordinates of the start point and the end point of the first touch operation; (C4) calculating an input speed using the input time and the distance between the start point and the end point of the first touch operation; and (C5) moving the image in the input direction of the first touch operation at a speed proportional to the input speed.

13. The method of claim 9, wherein the second touch operation of step (D) is the touch operation first detected after the first touch operation is completed.

14. The method of claim 9, wherein the second touch operation of step (D) is a touch operation, other than a drag operation, first detected after the first touch operation is completed.

15. The method of claim 14, wherein the drag operation is a touch operation in which the distance between the start point and the end point of the touch operation is greater than or equal to a set distance.
KR1020080072496A 2008-07-24 2008-07-24 Information processing apparatus and method for moving image thereof KR20100011336A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020080072496A KR20100011336A (en) 2008-07-24 2008-07-24 Information processing apparatus and method for moving image thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020080072496A KR20100011336A (en) 2008-07-24 2008-07-24 Information processing apparatus and method for moving image thereof

Publications (1)

Publication Number Publication Date
KR20100011336A true KR20100011336A (en) 2010-02-03

Family

ID=42085813

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020080072496A KR20100011336A (en) 2008-07-24 2008-07-24 Information processing apparatus and method for moving image thereof

Country Status (1)

Country Link
KR (1) KR20100011336A (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9310991B2 2011-08-19 2016-04-12 Samsung Electronics Co., Ltd. Method and apparatus for navigating content on screen using pointing device
KR101468970B1 (en) * 2012-11-30 2014-12-04 주식회사 인프라웨어 Method and apparatus for sliding objects across a touch-screen display
WO2016122096A1 (en) * 2015-01-28 2016-08-04 네이버 주식회사 Device and method for displaying comic book data
KR20160092757A (en) * 2015-01-28 2016-08-05 네이버 주식회사 Apparatus and method for display cartoon data
US10635285B2 2015-01-28 2020-04-28 Naver Corporation Device and method for moving the display of cartoon data

Similar Documents

Publication Publication Date Title
US8847978B2 (en) Information processing apparatus, information processing method, and information processing program
EP2631767B1 (en) Method, computer readable medium and portable apparatus for scrolling a screen in a touch screen display apparatus
US9552071B2 (en) Information processing apparatus, information processing method and computer program
JP4734435B2 (en) Portable game device with touch panel display
US10318146B2 (en) Control area for a touch screen
US9575578B2 (en) Methods, devices, and computer readable storage device for touchscreen navigation
TW201531925A (en) Multi-touch virtual mouse
US20190220185A1 (en) Image measurement apparatus and computer readable medium
KR20120023867A (en) Mobile terminal having touch screen and method for displaying contents thereof
KR20100095951A (en) Portable electronic equipment and control method thereof
JP4879933B2 (en) Screen display device, screen display method and program
KR20100011336A (en) Information processing apparatus and method for moving image thereof
KR101294201B1 (en) Portable device and operating method thereof
KR101436585B1 (en) Method for providing user interface using one point touch, and apparatus therefor
JP6197559B2 (en) Object operation system, object operation control program, and object operation control method
JP2015153197A (en) Pointing position deciding system
KR20090020157A (en) Method for zooming in touchscreen and terminal using the same
JP5769841B2 (en) Portable game device with touch panel display
JP4925989B2 (en) Input device and computer program
KR101496017B1 (en) Touch screen controlling method in mobile device, and mobile device threof
JP6126639B2 (en) A portable game device having a touch panel display and a game program.
KR101474873B1 (en) Control device based on non-motion signal and motion signal, and device control method thereof
KR101136327B1 (en) A touch and cursor control method for portable terminal and portable terminal using the same
KR102049259B1 (en) Apparatus and method for controlling user interface based motion
JP5769765B2 (en) Portable game device with touch panel display

Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
E601 Decision to refuse application