US20070019734A1 - Mobile communication terminal for estimating motion direction and method thereof - Google Patents
- Publication number
- US20070019734A1 US20070019734A1 US11/491,147 US49114706A US2007019734A1 US 20070019734 A1 US20070019734 A1 US 20070019734A1 US 49114706 A US49114706 A US 49114706A US 2007019734 A1 US2007019734 A1 US 2007019734A1
- Authority
- US
- United States
- Prior art keywords
- blocks
- block
- motion direction
- terminal
- processor
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/0304—Detection arrangements using opto-electronic means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/223—Analysis of motion using block-matching
- G06T7/231—Analysis of motion using block-matching using full search
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04B—TRANSMISSION
- H04B1/00—Details of transmission systems, not covered by a single one of groups H04B3/00 - H04B13/00; Details of transmission systems not characterised by the medium used for transmission
- H04B1/38—Transceivers, i.e. devices in which transmitter and receiver form a structural unit and in which at least one part is used for functions of transmitting and receiving
- H04B1/40—Circuits
Definitions
- the present invention relates to a mobile communication terminal and method for estimating a motion direction of the terminal.
- a mobile communication terminal includes a keypad allowing a user to input data such as text or numerals.
- because the keys on the keypad need to be manipulated one by one to execute a specific function or to input characters or numerals, the key inputting process is time consuming.
- one object of the present invention is to address the above-noted and other objects.
- Another object of the present invention is to provide a mobile communication terminal and method that can estimate a motion direction of the terminal using a plurality of input pictures taken by a camera on the terminal.
- Yet another object of the present invention is to display various types of information corresponding to the estimated motion direction.
- the present invention provides in one aspect a mobile communication terminal including a camera configured to take a plurality of pictures, a processor configured to estimate a motion direction of the terminal using at least some of the plurality of pictures, and a display configured to display information corresponding to the estimated motion direction.
- the present invention provides a method of estimating a motion direction in a mobile communication terminal, which includes taking a plurality of pictures with a camera on the terminal, estimating a motion direction of the terminal using at least some of the plurality of pictures and displaying information corresponding to the estimated motion direction.
- FIG. 1 is a block diagram of a mobile communication terminal for estimating a motion direction according to an embodiment of the present invention.
- FIG. 2 is a flowchart illustrating a general method of estimating the motion direction of the terminal according to one embodiment of the present invention.
- FIG. 3A is a flowchart illustrating a detailed method of estimating the motion direction of the terminal according to one embodiment of the present invention.
- FIG. 3B is an overview illustrating how first and second pictures are arranged to estimate the motion direction of the terminal.
- FIG. 3C is a flowchart illustrating a detailed method of estimating the motion direction of the terminal according to another embodiment of the present invention.
- FIG. 3D is a flowchart illustrating a detailed method of estimating the motion direction of the terminal according to yet another embodiment of the present invention.
- FIG. 4 is a block diagram illustrating a mobile communication terminal for estimating a motion direction according to another embodiment of the present invention.
- FIG. 5 is a flowchart illustrating a method of displaying character information corresponding to the estimated motion direction of the terminal according to an embodiment of the present invention.
- FIG. 6A is a flowchart illustrating a method of using extracted character data to execute a function on the terminal according to an embodiment of the present invention.
- FIG. 6B is a flowchart illustrating a method of using extracted character data to generate a call connection according to an embodiment of the present invention.
- FIG. 7 is a flowchart of a method of displaying pages of information based on the estimated motion direction according to an embodiment of the present invention.
- FIG. 1 is a block diagram of a mobile communication terminal for estimating a motion direction according to an embodiment of the present invention.
- the mobile communication terminal includes an input unit 110 , a camera 120 , a memory 130 , a processor 140 , and a display 150 .
- the processor 140 includes a picture segmenting unit 142 .
- FIG. 2 is a flowchart illustrating a method of estimating the motion direction of the terminal according to one embodiment of the present invention.
- FIG. 1 will also be referred to in this description.
- the input unit 110 generates a signal (hereinafter called ‘selection signal’) for selecting a function indicating specific information according to a motion direction of the terminal (S 210 ). That is, the user selects a key or keys on the input unit 110 to generate the selection signal for selecting the function indicating specific information according to the motion direction of the terminal.
- the camera 120 takes a plurality of pictures (S 220 ). That is, when the selection signal is generated, the camera 120 is set into an operation mode to automatically take a plurality of pictures.
- the pictures are then stored in the memory 130 (S 230 ).
- the processor 140 and picture segmenting unit 142 then process the plurality of pictures to determine a motion direction of the terminal (S 240 ).
- the information according to an estimated motion direction using the plurality of pictures is then displayed to the user on the display 150 (S 250 ).
- the function indicating specific information according to a motion direction may be a character display function, a page display function, a display function in which a partial picture indicating a motion direction of the terminal is displayed, etc.
- a specific key on the input unit 110 may be used to allow the user to select the function indicating specific information according to the motion direction of the terminal.
- the specific key may be a key that is assigned a plurality of specific functions including the motion direction function.
- the specific key may be a key separately provided for the function.
- the specific key may be pressed once or a set number of times. Alternatively, the specific key may be pressed for a particular length of time to generate the selection signal.
- when the selection signal is generated, the camera 120 is set into an operation mode to automatically take a plurality of pictures. Thus, the camera 120 sequentially takes a plurality of different pictures which correspond to movement of the terminal.
- the camera 120 takes a plurality of pictures gradually moving in the left direction.
- information can be displayed to the user indicating the terminal is moving in the left direction.
- the camera 120 preferably takes the plurality of the pictures at a constant rate.
- the constant rate may be 16 frames per second, 32 frames per second, etc.
- the frame is a unit picture among consecutive pictures.
- the plurality of available picture input rates may be displayed to the user to allow the user to select which frame rate he or she prefers. Then, the camera 120 takes the plurality of pictures consecutively at the selected frame rate.
- a default frame rate may also be set.
- the processor 140 periodically checks the storage capacity of the memory 130 . If the processor 140 determines the storage capacity of the memory 130 is equal to or smaller than a predetermined storage capacity, the processor 140 displays a warning to the user about the low storage capacity. For example, the processor 140 may display a warning message and a plurality of previously taken pictures that the user may select to be deleted. Then, the previously taken pictures selected by the user can be deleted to increase the storage capacity of the memory 130 .
- the picture segmenting unit 142 segments the taken pictures into a plurality of blocks.
- the processor 140 determines if any blocks match.
- the camera 120 may take one picture (a formerly taken picture) and then, as the terminal moves, another picture (a latterly taken picture).
- the picture segmenting unit 142 divides the formerly taken picture (first picture) and the latterly taken picture (second picture) into a predetermined number of blocks (e.g., 9 blocks for each picture).
- the processor 140 determines if any of the blocks from the formerly taken picture match any of the blocks in the latterly taken picture.
- the processor 140 matches together the two pictures at the matching blocks to determine whether the terminal has moved in a left, right, up or down direction. East, West, North and South directions may also be used. In more detail, the processor 140 arranges the formerly and latterly taken pictures over each other using the matching block or blocks as a reference. The processor 140 is then able to estimate the motion direction of the terminal as the ‘left direction’ if the latterly taken picture is arranged in a left direction against the formerly taken picture. In addition, rather than using first and second pictures, the plurality of blocks may be divided at a particular position and then the divided blocks may be compared.
- the processor 140 may also obtain the motion direction of the terminal by selecting a position of a reference block in the latterly taken picture and then marking a corresponding position in the formerly taken picture. In this instance, the processor 140 can obtain the moving direction of the terminal by tracing the reference block from the formerly taken picture to the latterly taken picture and choosing the motion direction as a direction opposite to the obtained moving direction.
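The segment-match-overlay procedure described above can be sketched as follows. This is an illustrative reading, not the patent's implementation: the 3*3 segmentation, (row, column) block indexing, and direction names are assumptions.

```python
def segment(frame, rows=3, cols=3):
    """Split a 2-D pixel array (list of rows) into rows*cols equal blocks."""
    bh, bw = len(frame) // rows, len(frame[0]) // cols
    return {(r, c): [row[c * bw:(c + 1) * bw] for row in frame[r * bh:(r + 1) * bh]]
            for r in range(rows) for c in range(cols)}

def find_match(block, blocks):
    """Return the position of an identical block, or None if none matches."""
    for pos, cand in blocks.items():
        if cand == block:
            return pos
    return None

def estimate_direction(first, second):
    """Match a block of the first picture in the second, then overlay the two
    pictures on the matched block: the second picture's offset relative to the
    first names the motion direction."""
    a, b = segment(first), segment(second)
    for pos, blk in a.items():
        match = find_match(blk, b)
        if match is not None:
            dr, dc = pos[0] - match[0], pos[1] - match[1]
            vert = 'up' if dr < 0 else 'down' if dr > 0 else ''
            horiz = 'left' if dc < 0 else 'right' if dc > 0 else ''
            return (vert + ' ' + horiz).strip() or 'none'
    return None  # no block of the first picture recurs in the second
```

When the terminal moves left, the scene content shifts right between frames, so the matched block is found one column further right in the second picture and the overlay places the second picture to the left of the first.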
- FIG. 3A is a flowchart illustrating a detailed method of determining the motion direction of the terminal according to one embodiment of the present invention.
- FIG. 1 will also be referred to in this description.
- the picture segmenting unit 142 divides the taken pictures into a plurality of blocks of equal size (S 312 ).
- each block includes a plurality of pixels that can be used to see if any blocks match.
- the processor 140 uses a correlation process to see what blocks have a highest correlation with each other (S 316 ). That is, the processor 140 searches the plurality of the blocks configuring the formerly taken picture for a block having a highest correlation with a specific block of the latterly taken picture (or vice versa).
- the processor 140 calculates a pixel value for the plurality of pixels of each block in the latterly taken picture, and then searches for a block having a highest correlation among the blocks in the formerly taken picture.
- the pixel value for each block is automatically calculated by the processor 140 .
- the blocks having the most similar pixel value have the highest correlation.
- if the processor 140 determines a block or blocks of the two pictures match (Yes in S 314 ), the processor 140 estimates the motion direction of the terminal using the matched block(s) as a reference. Similarly, after the processor 140 determines the blocks that have the highest correlation in step S 316 , the processor 140 estimates the motion direction of the terminal using the highest correlation blocks as a reference.
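The correlation search of step S 316 can be sketched as below. The patent only says blocks with the most similar pixel values have the highest correlation; using the sum of a block's pixels as its "pixel value" is an assumed simplification.

```python
def block_value(block):
    """Overall pixel value of a block (here assumed to be the pixel sum)."""
    return sum(sum(row) for row in block)

def best_correlated(target, blocks):
    """Position of the block whose pixel value is closest to the target's,
    i.e. the 'highest correlation' block under this simple measure."""
    tv = block_value(target)
    return min(blocks, key=lambda pos: abs(block_value(blocks[pos]) - tv))
```

A pixel-sum comparison is cheap but coarse; the later embodiments refine it with a proper block matching algorithm using per-pixel error measures.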
- FIG. 3B is an overview of formerly and latterly taken pictures, which are used to determine the motion direction of the terminal.
- each picture is divided into 9 blocks (i.e., 3*3 blocks).
- the motion direction may be represented as an ‘upward direction’, ‘downward direction’, ‘left direction’, ‘right direction’ or any combination thereof.
- the motion direction may be represented as an ‘east direction’, ‘west direction’, ‘north direction’, ‘south direction’ or any combination thereof.
- the processor 140 arranges or overlays the formerly and latterly taken pictures with reference to the respectively matched blocks (see the bottom portion of FIG. 3B ).
- the processor 140 estimates the motion direction of the terminal as being in the ‘right direction’ or ‘east direction’, because the latterly taken picture (second picture) is located in the ‘right direction’ or ‘east direction’ against the formerly taken picture (first picture) with reference to the matched blocks.
- the processor 140 arranges the formerly and latterly taken pictures with reference to the respectively matched blocks and then estimates the motion direction as the ‘downward direction’ or ‘south direction’, because the latterly taken picture is downward or south of the formerly taken picture.
- the processor 140 arranges the formerly and latterly taken pictures with reference to the matched block and then estimates the motion direction as the ‘southeast direction’, because the latterly taken picture is southeast of the formerly taken picture.
- the processor 140 can also estimate a motion direction of the terminal by calculating respectively a motion vector value of blocks included in each of the two consecutively taken pictures. For instance, if two consecutively taken pictures are segmented into 8*8 blocks, and the block ( 1 , 1 ) of the formerly taken picture matches the block ( 4 , 5 ) of the latterly taken picture, the processor 140 calculates a motion vector value by subtracting a vector value of the block ( 1 , 1 ) from a vector value of the block ( 4 , 5 ) using the latterly taken picture as a reference picture. Then, the processor 140 estimates the motion direction of the terminal using the calculated motion vector value.
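The motion vector calculation just described is a simple positional subtraction. Treating block positions as (row, column) coordinate pairs is an assumption; the patent only fixes the latterly taken picture as the reference.

```python
def motion_vector(former_pos, latter_pos):
    """Motion vector of a matched block with the latterly taken picture as the
    reference: the former block's position subtracted from the latter's."""
    return (latter_pos[0] - former_pos[0], latter_pos[1] - former_pos[1])
```

For the example in the text, the block at ( 1 , 1 ) matching the block at ( 4 , 5 ) yields the vector (3, 4).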
- the processor 140 may also use blocks that have a highest correlation among each other. For example, when a block ( 2 , 2 ) in the latterly taken picture has a highest correlation with a block ( 3 , 2 ) in the formerly taken picture, the processor 140 arranges the formerly and latterly taken pictures with reference to the blocks having the highest correlation and then estimates the motion direction as the ‘right direction’ or ‘east direction’ (i.e., the latterly taken picture is right or east of the formerly taken picture).
- the processor 140 arranges the formerly and latterly taken pictures with reference to the block having the highest correlation and then estimates the motion direction as ‘southeast direction’.
- FIG. 3C is a flowchart of a detailed method of estimating a motion direction of the terminal according to another embodiment of the present invention.
- the picture segmenting unit 142 segments the taken picture into a plurality of blocks (S 321 ) (similar to the step S 312 in FIG. 3A ).
- the processor 140 selects two consecutively taken pictures from a plurality of taken pictures and sets a reference block in a formerly taken picture of the two pictures. That is, the formerly taken picture corresponds to a picture taken before the latterly taken picture.
- the processor 140 selects a random block among the divided blocks of the formerly taken picture as a reference block (S 323 ), and searches the latterly taken picture to determine if the same reference block exists therein (S 325 ). In addition, the processor 140 preferably selects the block located at a center among the plurality of blocks as the reference block.
- the processor 140 determines whether the reference block is present in the latterly taken picture using a block matching algorithm.
- the block matching algorithm is a method of segmenting a picture into blocks equal in size and representing all pixels within each of the blocks as a motion vector. The algorithm finds the block most similar to a block of the latterly taken picture by moving blocks of the formerly taken picture one pixel at a time to find a motion vector.
- a mean absolute error, a mean squared error, etc. may be used as a matching reference.
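A full-search block matching step with a mean-absolute-error criterion, as described above, can be sketched as follows. The search window size and the plain-list pixel representation are illustrative assumptions.

```python
def mae(a, b):
    """Mean absolute error between two equally sized pixel blocks."""
    n = len(a) * len(a[0])
    return sum(abs(x - y) for ra, rb in zip(a, b) for x, y in zip(ra, rb)) / n

def best_match(block, frame, top, left, search=1):
    """Full search: slide `block` within +/-`search` pixels of (top, left)
    over `frame`, one pixel at a time, and return the (dy, dx) offset that
    minimises the mean absolute error."""
    bh, bw = len(block), len(block[0])
    best_cost, best_off = None, None
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            y, x = top + dy, left + dx
            if 0 <= y <= len(frame) - bh and 0 <= x <= len(frame[0]) - bw:
                cand = [row[x:x + bw] for row in frame[y:y + bh]]
                cost = mae(block, cand)
                if best_cost is None or cost < best_cost:
                    best_cost, best_off = cost, (dy, dx)
    return best_off
```

The returned offset is exactly the per-block motion vector the text refers to; substituting a mean squared error for `mae` changes only the matching reference, not the search.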
- the processor 140 , which previously calculated and stored an overall pixel value for the reference block, calculates an overall pixel value for each block of the latterly taken picture and compares these pixel values to determine if the reference block in the formerly taken picture also exists in the latterly taken picture.
- if the processor 140 determines the reference block does not exist (No in S 325 ), the processor 140 searches the blocks in the latterly taken picture for a block having a highest correlation with the reference block (S 327 ).
- the processor 140 searches the block having the highest correlation with the reference block using the block matching algorithm.
- the processor 140 then estimates the motion direction of the terminal with reference to a position of the reference block having the highest correlation (S 329 ).
- if the processor 140 determines the reference block does exist (Yes in S 325 ), the processor 140 estimates the motion direction of the terminal with reference to a position of the reference block (S 329 ).
- the motion direction is estimated in a similar manner as that discussed above with respect to FIGS. 3A and 3B .
- the processor 140 decides that a position of the reference block has moved to ( 1 , 3 ) from ( 2 , 2 ).
- the processor 140 estimates the motion direction to be in a direction opposite to the moving direction of the reference block.
- the processor 140 decides that a position of the reference block has moved to ( 3 , 2 ) from ( 2 , 2 ). The processor 140 then estimates the motion direction of the terminal as being opposite to the moving direction of the reference block.
- a similar process is used for blocks having a highest correlation among each other. For instance, when the processor 140 sets the block ( 2 , 2 ) in the formerly taken picture as the reference block and determines the block ( 1 , 3 ) in the latterly taken picture has the highest correlation with the reference block, the processor 140 decides that a position of the reference block has moved to ( 1 , 3 ) from ( 2 , 2 ). The processor 140 then estimates a direction opposite to the moving direction of the reference block as the motion direction.
- when the processor 140 sets the block ( 2 , 2 ) in the formerly taken picture as the reference block and determines the block ( 3 , 2 ) in the latterly taken picture has the highest correlation with the reference block, the processor 140 decides that a position of the reference block has moved to ( 3 , 2 ) from ( 2 , 2 ). The processor 140 then estimates a direction opposite to the moving direction of the reference block as the motion direction.
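The opposite-direction rule used in this embodiment can be sketched as below. Positions are taken as (x, y) with x rightward and y downward; the patent does not fix a coordinate convention, so this is an assumption.

```python
def opposite_direction(old_pos, new_pos):
    """The terminal's motion direction is opposite to the reference block's
    apparent movement between the formerly and latterly taken pictures."""
    dx, dy = new_pos[0] - old_pos[0], new_pos[1] - old_pos[1]
    horiz = 'left' if dx > 0 else 'right' if dx < 0 else ''
    vert = 'up' if dy > 0 else 'down' if dy < 0 else ''
    return '-'.join(filter(None, (vert, horiz))) or 'none'
```

Under this convention, a reference block moving from ( 2 , 2 ) to ( 3 , 2 ) has moved right, so the terminal is estimated to have moved left.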
- FIG. 3D is a flowchart of a method of estimating a motion of the terminal according to yet another embodiment of the present invention.
- FIG. 1 will also be referred to in this description.
- the picture segmenting unit 142 segments the taken picture into a plurality of blocks (S 332 ).
- the processor 140 determines whether or not there exists a block common to two consecutively taken pictures (hereinafter called a ‘common block’) (S 334 ). For instance, the processor 140 calculates and stores a pixel value for each block in the formerly taken picture and in the latterly taken picture of the two consecutively taken pictures. Then, the processor 140 compares the calculated pixel values of the formerly taken picture with the calculated pixel values of the latterly taken picture to determine if the common block exists in the formerly and latterly taken pictures.
- the processor 140 determines the common block exists, the processor 140 searches for a position of the common block in each of the formerly and latterly taken pictures (S 336 ). The processor 140 then calculates a motion vector value of the searched common block (S 338 ).
- the processor 140 can recognize a position value of each of the 9 blocks as an (x, y) coordinate value on an X-Y coordinate plane. In particular, the processor 140 determines a position value ( 1 , 1 ) of a random block as a coordinate value ( 1 , 1 ) on the X-Y coordinate plane.
- the processor 140 calculates a motion vector value with reference to the common blocks by subtracting the vector value ( 1 , 3 ) of the block ( 1 , 3 ) of the formerly taken picture from the vector value ( 3 , 1 ) of the block ( 3 , 1 ) of the latterly taken picture.
- if the processor 140 determines a plurality of common blocks exist, the processor 140 calculates all motion vector values for the respective common blocks.
- if the processor 140 determines no common blocks exist (No in S 334 ), the processor 140 searches the blocks of the formerly and latterly taken pictures for blocks having the highest correlation among each other (S 340 ). That is, similar to the embodiment discussed above, the processor 140 recognizes blocks having the most similar pixel value as the blocks having the highest correlation.
- the processor 140 calculates a motion vector value of the searched block having the highest correlation (S 342 ). Note that step S 342 is similar to the process discussed in step S 338 . Next, the processor 140 estimates the motion direction of the terminal with reference to the calculated motion vector value (S 344 ).
- the processor 140 estimates the motion direction by considering a size and direction of the calculated motion vector value. Further, when there are a plurality of common blocks and thus a motion vector value for each common block, the processor 140 calculates an average of the plurality of the motion vector values and estimates the motion direction by considering a size and direction of the calculated average motion vector value.
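The averaging step for multiple common blocks can be sketched as follows. Naming the direction from the average vector's sign, and the 0.5 dead-zone threshold for ignoring negligible motion, are assumptions (the patent says only that size and direction are considered); (x, y) is taken as x eastward and y southward.

```python
def average_vector(vectors):
    """Average the motion vectors calculated for several common blocks."""
    n = len(vectors)
    return (sum(v[0] for v in vectors) / n, sum(v[1] for v in vectors) / n)

def vector_direction(v, dead_zone=0.5):
    """Name the direction of a motion vector, ignoring components whose
    size is below `dead_zone` (an assumed noise threshold)."""
    dx, dy = v
    horiz = 'east' if dx > dead_zone else 'west' if dx < -dead_zone else ''
    vert = 'south' if dy > dead_zone else 'north' if dy < -dead_zone else ''
    return (vert + horiz) or 'none'
```

Averaging before naming the direction smooths out disagreement between individual block matches, at the cost of blurring genuinely mixed motion.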
- FIG. 4 is a block diagram of a mobile communication terminal for estimating a motion direction of the terminal according to another embodiment of the present invention.
- the terminal includes the input unit 110 , the camera 120 , the memory 130 , the processor 140 , the picture segmenting unit 410 and the display 150 similar to that as shown in FIG. 1 .
- the terminal in FIG. 4 also includes a call connecting unit 440
- the processor 140 further includes a character extracting unit 420 and a drive unit 430 .
- FIG. 5 is a flowchart illustrating a method of extracting and displaying information corresponding to the estimated motion direction of the terminal.
- FIG. 4 will also be referred to in this description.
- character data corresponding to a motion direction of the terminal is stored in the memory 130 (S 510 ). That is, the memory 130 stores all character information, languages, symbols, numerals, etc., supported and provided by the terminal.
- the memory 130 stores the character information, languages, symbols, numerals, etc. in a format that corresponds to motion directions.
- the memory 130 may store character data such as ‘ ’ indicating a ‘right and downward’ direction, ‘ ’ indicating a ‘downward and right’ direction, ‘ ’ indicating a ‘right, downward and right’ direction, etc.
- the memory 130 may store character data such as ‘ ’ indicating a ‘downward, upward and right’ direction, ‘ ⁇ ’ indicating a ‘downward, left and right’ direction, etc.
- the user can select a function indicating specific information according to the motion direction of the terminal by selecting a key or keys on the input unit 110 (S 520 ).
- the camera 120 then takes a plurality of pictures (S 530 ), the pictures are stored in the memory 130 , and the processor 140 estimates the motion direction of the terminal using the plurality of the taken pictures (S 550 ) in a similar manner as discussed above.
- the character extracting unit 420 extracts character data corresponding to the estimated motion direction from the character data stored in the memory 130 (S 560 ).
- the character data accurately indicates the estimated motion direction. For instance, when the processor estimates the motion direction of the terminal as being a ‘downward, downward, downward, right and right’ direction, the character extracting unit 420 recognizes the motion direction as ‘downward and right’ and then extracts ‘ ’ from the memory 130 .
- the character extracting unit 420 recognizes the motion direction as ‘right and downward’ and then extracts ‘ ’ from the memory 130 . Also, when the processor 140 estimates the motion direction as being a ‘downward, downward, downward, upward, upward, right and right’, the character extracting unit 420 recognizes the motion direction as ‘downward, upward and right’ and then extracts ‘ ’ from the memory 130 .
- the character extracting unit 420 recognizes the motion direction as ‘downward, left and right’ and then extracts ‘ ⁇ ’ from the memory 130 .
- the display 150 displays the extracted character on the display 150 (S 570 ). For instance, if the extracted character data are ‘ ’ and ‘ ’, the display 150 displays ‘ ’. If the extracted character data are ‘ ’ and ‘ ⁇ ’, the display 150 displays ‘ ’. Other symbols such as arrows, etc. may be used to display the estimated direction of the terminal.
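The character extraction above collapses repeated directions before the table lookup. The sketch below assumes a hypothetical stroke table with placeholder character names, since the stored characters in the original are Hangul and do not survive in this text.

```python
def collapse(directions):
    """Collapse consecutive repeats, e.g. the estimated sequence
    down, down, down, right, right becomes (down, right)."""
    out = []
    for d in directions:
        if not out or out[-1] != d:
            out.append(d)
    return tuple(out)

# Hypothetical stroke table: the memory stores character data keyed by a
# direction pattern; 'char-A' etc. stand in for the stored characters.
STROKE_TABLE = {
    ('right', 'down'): 'char-A',
    ('down', 'right'): 'char-B',
    ('down', 'up', 'right'): 'char-C',
}

def extract_character(directions):
    """Return the character whose stored pattern matches the collapsed
    direction sequence, or None if no stored character matches."""
    return STROKE_TABLE.get(collapse(directions))
```

For example, an estimated 'downward, downward, downward, right and right' sequence collapses to 'downward and right' and selects the corresponding stored character.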
- FIG. 6A is a flowchart illustrating a method of using extracted character data according to an embodiment of the present invention.
- FIG. 4 will also be referred to in this description.
- the processor 140 determines whether or not the extracted character data indicates a prescribed function set in the terminal (S 612 ).
- the processor 140 determines that the prescribed function is indicated (Yes in S 612 ). For instance, if the extracted character data is ‘message’, the processor 140 determines a message function set in the terminal is to be executed.
- the processor 140 determines the message function is to be executed.
- the input unit 110 generates an execution signal for executing the prescribed function (S 614 ).
- the input unit 110 generates an execution signal according to a user's selection. However, if a generation of the execution signal is set to ‘default’, the input unit 110 automatically generates the execution signal.
- the drive unit 430 then executes the prescribed function according to the execution signal (S 616 ). That is, the drive unit 430 includes drive programs for all functions set in the terminal so as to execute the prescribed function.
- the processor 140 determines whether or not the extracted character data indicates a contact party stored in the terminal (S 622 ).
- if the extracted character data indicates the name of a contact party or a number corresponding to the prescribed name (Yes in S 622 ), the input unit 110 generates a call connection signal to the contact party corresponding to the prescribed name (S 624 ).
- the input unit 110 generates the call connection signal according to a user's selection. However, if the generation of the call connection signal is set to ‘default’, the input unit 110 automatically generates the call connection signal. Then, the call connecting unit 440 connects a call to the contact party corresponding to the name (S 626 ).
- FIG. 7 is a flowchart of a method of displaying pages of information corresponding to the estimated motion direction of the terminal according to still another embodiment of the present invention.
- FIG. 1 will also be referred to in this description.
- Steps S 710 , S 720 , S 730 , and S 740 are the same as the corresponding steps S 210 , S 220 , S 230 and S 240 in FIG. 2 . Accordingly, a detailed explanation of these steps in FIG. 7 will not be repeated.
- step S 750 the processor 140 determines whether or not the display 150 is currently displaying only partial information out of a total amount of information (S 750 ).
- the total amount of information may be a plurality of pages and the partial information may be a specific one of the plurality of the pages.
- the processor 140 determines that the partial information is currently being displayed.
- the total amount of information may be a whole picture and the partial information may be a portion of the whole picture. For instance, if the display 150 displays a specific area on a whole country map or a specific part of a whole photograph picture, the processor 140 determines that partial information is currently being displayed.
- the total amount of information may be information for a plurality of functions set in the terminal and the partial information may be information for a specific one of the plurality of the functions. For instance, if the display 150 currently displays information for a ‘message function’ from among a plurality of functions set in the terminal, the processor 140 determines that partial information is currently being displayed. Further, the information for the message function includes a received message list, a sent message list, specific message content, etc.
- the display 150 displays other partial information corresponding to the estimated motion direction. For example, if the estimated motion direction is an ‘upward direction’, the display 150 displays the page in front of the currently displayed page.
- the display 150 displays a last page of a plurality of pages by moving away from a currently displayed page.
- the page moving direction corresponding to the estimated motion direction can be previously set in the mobile terminal or can be set by a user.
- the processor 140 controls the display 150 to display another page moved from a currently displayed page. For example, if the estimated motion direction is an ‘upward direction’, the display 150 displays an area away from a specific area currently displayed on a whole country map in the ‘upward direction’.
- the display 150 displays an area moved from a specific area currently displayed on the whole country map.
- the processor 140 controls another picture moved from a currently displayed partial picture for the whole picture to be displayed.
- the display 150 stops displaying information for a currently displayed function and then displays information for a function corresponding to the ‘upward direction’. That is, if a motion direction is estimated, the processor 140 , in which a moving direction of a function for the estimated motion direction is previously set, controls the display 150 to display information for the moved function according to the estimated motion direction.
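The page-display behaviour described in this embodiment can be sketched as a small navigator. The default mapping of 'up' to the previous page and 'down' to the next, and the clamping at the first and last pages, are illustrative assumptions; the text says only that the mapping can be preset or user-configured.

```python
class Pager:
    """Direction-driven page navigation over a plurality of pages."""

    def __init__(self, num_pages, mapping=None):
        self.num_pages = num_pages
        self.page = 0  # index of the currently displayed page
        # assumed default: 'up' pages backward, 'down' pages forward
        self.mapping = mapping or {'up': -1, 'down': +1}

    def move(self, direction):
        """Apply an estimated motion direction and return the new page,
        clamped to the valid page range; unmapped directions do nothing."""
        step = self.mapping.get(direction, 0)
        self.page = max(0, min(self.num_pages - 1, self.page + step))
        return self.page
```

The same shape works for the map and whole-picture cases by replacing the page index with a two-dimensional viewport offset.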
- the present invention provides the following effects or advantages.
- the motion direction of the terminal can be estimated using a plurality of pictures taken via a photographing device provided with the terminal. Secondly, because the motion direction is estimated, a specific character corresponding to the estimated motion direction can also be displayed.
- the present invention advantageously estimates the motion direction of the terminal and displays information according to the estimated motion direction (and thus the user can see the information according to the estimated motion direction), estimates the motion direction and activates a specific function on the terminal based on the estimated motion direction of the terminal (and thus the user need only move the terminal in a certain direction to activate a specific function), and estimates the motion direction and displays different pages of information based on the motion direction of the terminal (and thus the user can page through a plurality of pages by moving the phone in a certain direction).
Abstract
A mobile communication terminal including a camera configured to take a plurality of pictures, a processor configured to estimate a motion direction of the terminal using at least some of the plurality of pictures, and a display configured to display information corresponding to the estimated motion direction.
Description
- This application claims the benefit of Korean Patent Application No. 10-2005-0067204, filed on Jul. 25, 2005, which is hereby incorporated by reference in its entirety.
- 1. Field of the Invention
- The present invention relates to a mobile communication terminal and method for estimating a motion direction of the terminal.
- 2. Discussion of the Related Art
- A mobile communication terminal according to a related art includes a keypad allowing a user to input data such as text or numerals. However, because the keys on the keypad need to be manipulated one by one to execute a specific function or to input characters or numerals, the key inputting process is time consuming.
- Moreover, when a relatively large amount of information is input or displayed, the user must manually page through parts of the information to view the entire contents. Thus, the inputting process on the terminal is time consuming and cumbersome.
- Accordingly, one object of the present invention is to address the above-noted and other objects.
- Another object of the present invention is to provide a mobile communication terminal and method that can estimate a motion direction of the terminal using a plurality of input pictures taken by a camera on the terminal.
- Yet another object of the present invention is to display various types of information corresponding to the estimated motion direction.
- To achieve these objects and other advantages and in accordance with the purpose of the invention, as embodied and broadly described herein, the present invention provides in one aspect a mobile communication terminal including a camera configured to take a plurality of pictures, a processor configured to estimate a motion direction of the terminal using at least some of the plurality of pictures, and a display configured to display information corresponding to the estimated motion direction.
- In another aspect, the present invention provides a method of estimating a motion direction in a mobile communication terminal, which includes taking a plurality of pictures with a camera on the terminal, estimating a motion direction of the terminal using at least some of the plurality of pictures and displaying information corresponding to the estimated motion direction.
- Further scope of applicability of the present invention will become apparent from the detailed description given hereinafter. However, it should be understood that the detailed description and specific examples, while indicating preferred embodiments of the invention, are given by illustration only, since various changes and modifications within the spirit and scope of the invention will become apparent to those skilled in the art from this detailed description.
- The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this application, illustrate embodiments of the invention and together with the description serve to explain the principle of the invention. In the drawings:
-
FIG. 1 is a block diagram of a mobile communication terminal for estimating a motion direction according to an embodiment of the present invention; -
FIG. 2 is a flowchart illustrating a general method of estimating the motion direction of the terminal according to one embodiment of the present invention; -
FIG. 3A is a flowchart illustrating a detailed method of estimating the motion direction of the terminal according to one embodiment of the present invention; -
FIG. 3B is an overview illustrating how first and second pictures are arranged to estimate the motion direction of the terminal; -
FIG. 3C is a flowchart illustrating a detailed method of estimating the motion direction of the terminal according to another embodiment of the present invention; -
FIG. 3D is a flowchart illustrating a detailed method of estimating the motion direction of the terminal according to yet another embodiment of the present invention; -
FIG. 4 is a block diagram illustrating a mobile communication terminal for estimating a motion direction according to another embodiment of the present invention; -
FIG. 5 is a flowchart illustrating a method of displaying character information corresponding to the estimated motion direction of the terminal according to an embodiment of the present invention; -
FIG. 6A is a flowchart illustrating a method of using extracted character data to execute a function on the terminal according to an embodiment of the present invention; -
FIG. 6B is a flowchart illustrating a method of using extracted character data to generate a call connection according to an embodiment of the present invention; and -
FIG. 7 is a flowchart of a method of displaying pages of information based on the estimated motion direction according to an embodiment of the present invention.
- Reference will now be made in detail to the preferred embodiments of the present invention, examples of which are illustrated in the accompanying drawings. Wherever possible, the same reference numbers will be used throughout the drawings to refer to the same or like parts.
- Turning first to
FIG. 1 , which is a block diagram of a mobile communication terminal for estimating a motion direction according to an embodiment of the present invention. As shown, the mobile communication terminal includes an input unit 110, a camera 120, a memory 130, a processor 140, and a display 150. Further, the processor 140 includes a picture segmenting unit 142. - Next,
FIG. 2 is a flowchart illustrating a method of estimating the motion direction of the terminal according to one embodiment of the present invention. FIG. 1 will also be referred to in this description. As shown in FIG. 2, the input unit 110 generates a signal (hereinafter called ‘selection signal’) for selecting a function indicating specific information according to a motion direction of the terminal (S210). That is, the user selects a key or keys on the input unit 110 to generate the selection signal for selecting the function indicating specific information according to the motion direction of the terminal. - After the user selects the motion direction function, the
camera 120 takes a plurality of pictures (S220). That is, when the selection signal is generated, the camera 120 is set into an operation mode to automatically take a plurality of pictures. - The pictures are then stored in the memory 130 (S230). The
processor 140 and picture segmenting unit 142 then process the plurality of pictures to determine a motion direction of the terminal (S240). The information according to an estimated motion direction using the plurality of pictures is then displayed to the user on the display 150 (S250). - In addition, the function indicating specific information according to a motion direction may be a character display function, a page display function, a display function in which a partial picture indicating a motion direction of the terminal is displayed, etc.
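The estimation performed in step S240 is detailed below with reference to FIGS. 3A-3D; as a rough sketch, assuming grayscale pictures held as 2-D arrays, it can be illustrated as follows. The helper names, the use of block mean pixel values as the correlation measure, and the matching of the center block are simplifying assumptions made here, not the patent's own implementation.

```python
import numpy as np

def split_into_blocks(picture, rows=3, cols=3):
    # Segment a picture into a rows*cols grid of equally sized blocks.
    h, w = picture.shape
    bh, bw = h // rows, w // cols
    return {(r, c): picture[r * bh:(r + 1) * bh, c * bw:(c + 1) * bw]
            for r in range(rows) for c in range(cols)}

def match_block(block, candidate_blocks):
    # 'Highest correlation' is approximated here as the most similar
    # mean pixel value among the candidate blocks.
    value = block.mean()
    return min(candidate_blocks,
               key=lambda pos: abs(candidate_blocks[pos].mean() - value))

def estimate_direction(former, latter):
    former_blocks = split_into_blocks(former)
    latter_blocks = split_into_blocks(latter)
    center = (1, 1)
    matched = match_block(latter_blocks[center], former_blocks)
    d_row, d_col = center[0] - matched[0], center[1] - matched[1]
    # Scene content shifting left between the two pictures is read as
    # motion of the terminal in the right direction, and so on.
    vertical = 'downward' if d_row > 0 else 'upward' if d_row < 0 else ''
    horizontal = 'right' if d_col < 0 else 'left' if d_col > 0 else ''
    return (vertical + ' ' + horizontal).strip() or 'still'
```

For example, when the latterly taken picture is the formerly taken one shifted left by one block width, this sketch reports motion in the right direction, in line with the overlay reasoning illustrated for FIG. 3B.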
- Further, a specific key on the
input unit 110 may be used to allow the user to select the function indicating specific information according to the motion direction of the terminal. For example, the specific key may be a key that is assigned a plurality of specific functions including the motion direction function. - Alternatively, the specific key may be a key separately provided for the function. In addition, to generate the selection signal, the specific key may be pressed once or a set number of times. Alternatively, the specific key may be pressed for a particular length of time to generate the selection signal.
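One way the specific key could generate the selection signal, covering both a set number of presses and a press held for a particular length of time, is sketched below; the thresholds and parameter names are illustrative assumptions only.

```python
# Hypothetical decision rule: the selection signal is generated once the
# specific key has been pressed the set number of times, or held down for
# at least the set duration (values invented for illustration).
def selection_signal_generated(press_count=0, hold_seconds=0.0,
                               required_presses=2, required_hold=1.5):
    return press_count >= required_presses or hold_seconds >= required_hold
```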
- Further, as discussed above, when the selection signal is generated, the
camera 120 is set into an operation mode to automatically take a plurality of pictures. Thus, the camera 120 sequentially takes a plurality of different pictures which correspond to movement of the terminal. - For instance, when the terminal moves in a left direction, the
camera 120 takes a plurality of pictures gradually moving in the left direction. Thus, when the pictures are combined together in a time sequence, information can be displayed to the user indicating the terminal is moving in the left direction. - In addition, the
camera 120 preferably takes the plurality of the pictures at a constant rate. For example, the constant rate may be 16 frames per second, 32 frames per second, etc. Further, a frame is a unit picture among consecutive pictures. Also, the plurality of available picture input rates may be displayed to the user to allow the user to select which frame rate he or she prefers. Then, the camera 120 takes the plurality of pictures consecutively at the selected frame rate. A default frame rate may also be set. - In addition, because the space available in the
memory 130 may be limited, the processor 140 periodically checks the storage capacity of the memory 130. If the processor 140 determines the storage capacity of the memory 130 is equal to or smaller than a predetermined storage capacity, the processor 140 displays a warning to the user about the low storage capacity. For example, the processor 140 may display a warning message and a plurality of previously taken pictures that the user may select to be deleted. Then, the previously taken pictures selected by the user can be deleted to increase the storage capacity of the memory 130. - Further, when estimating the motion direction of the terminal, the
picture segmenting unit 142 segments the taken pictures into a plurality of blocks. The processor 140 then determines if any blocks match. In more detail, the camera 120 may take a formerly taken picture and then a latterly taken picture. Then, the picture segmenting unit 142 divides the formerly taken picture (first picture) and the latterly taken picture (second picture) into a predetermined number of blocks (e.g., 9 blocks for each picture). The processor 140 then determines if any of the blocks from the formerly taken picture match any of the blocks in the latterly taken picture. - If one or more blocks match, the
processor 140 matches together the two pictures at the matching blocks to determine whether the terminal has moved in a left, right, up or down direction. East, West, North and South directions may also be used. In more detail, the processor 140 arranges the formerly and latterly taken pictures over each other using the matching block or blocks as a reference. The processor 140 is then able to estimate the motion direction of the terminal as the ‘left direction’ if the latterly taken picture is arranged in a left direction against the formerly taken picture. In addition, rather than using first and second pictures, the plurality of blocks may be divided at a particular position and then the divided blocks may be compared. - The
processor 140 may also obtain the motion direction of the terminal by selecting a position of a reference block in the latterly taken picture and then marking a corresponding position in the formerly taken picture. In this instance, the processor 140 can obtain the moving direction of the terminal by tracing the reference block from the formerly taken picture to the latterly taken picture and choosing the motion direction as a direction opposite to the obtained moving direction. - Turning next to
FIG. 3A , which is a flowchart illustrating a detailed method of determining the motion direction of the terminal according to one embodiment of the present invention. FIG. 1 will also be referred to in this description. As shown, the picture segmenting unit 142 divides the taken pictures into a plurality of blocks of equal size (S312). - Next, the
processor 140 determines if any blocks in the first picture (formerly taken picture) match any blocks in the second picture (latterly taken picture) (S314). In more detail, each block includes a plurality of pixels that can be used to see if any blocks match. - If the
processor 140 determines no blocks match (No in S314), the processor 140 uses a correlation process to see what blocks have a highest correlation with each other (S316). That is, the processor 140 searches the plurality of the blocks configuring the formerly taken picture for a block having a highest correlation with a specific block of the latterly taken picture (or vice versa). - In more detail, the
processor 140 calculates a pixel value for the plurality of pixels of each block in the latterly taken picture, and then searches for a block having a highest correlation among the blocks in the formerly taken picture. In this example, the pixel value for each block is automatically calculated by the processor 140. Thus, the blocks having the most similar pixel value have the highest correlation. - If the
processor 140 determines a block or blocks of the two pictures match (Yes in S314), the processor 140 estimates the motion direction of the terminal using the matched block(s) as a reference. Similarly, after the processor 140 determines the blocks that have the highest correlation in step S316, the processor 140 estimates the motion direction of the terminal using the highest correlation blocks as a reference. - Turning now to
FIG. 3B , which is an overview of formerly and latterly taken pictures, which are used to determine the motion direction of the terminal. In this example, each picture is divided into 9 blocks (i.e., 3*3 blocks). As discussed above, the motion direction may be represented as an ‘upward direction’, ‘downward direction’, ‘left direction’, ‘right direction’ or any combination thereof. Alternatively, the motion direction may be represented as an ‘east direction’, ‘west direction’, ‘north direction’, ‘south direction’ or any combination thereof. - Thus, with reference to
FIG. 3B , if blocks (3, 1), (3, 2) and (3, 3) in the formerly taken picture match blocks (2, 1), (2, 2) and (2, 3) in the latterly taken picture, the processor 140 arranges or overlays the formerly and latterly taken pictures with reference to the respectively matched blocks (see the bottom portion of FIG. 3B). The processor 140 then estimates the motion direction of the terminal as being in the ‘right direction’ or ‘east direction’, because the latterly taken picture (second picture) is located in the ‘right direction’ or ‘east direction’ against the formerly taken picture (first picture) with reference to the matched blocks. - Similarly, although not explicitly shown in
FIG. 3B , if blocks (1, 2), (2, 2) and (3, 2) in the formerly taken picture match blocks (1, 3), (2, 3) and (3, 3) in the latterly taken picture, the processor 140 arranges the formerly and latterly taken pictures with reference to the respectively matched blocks and then estimates the motion direction as the ‘downward direction’ or ‘south direction’, because the latterly taken picture is downward or south of the formerly taken picture. - In yet another example, if a single block (3, 1) in the formerly taken picture matches a block (1, 3) in the latterly taken picture, the
processor 140 arranges the formerly and latterly taken pictures with reference to the matched block and then estimates the motion direction as the ‘southeast direction’, because the latterly taken picture is southeast of the formerly taken picture. - In addition, the
processor 140 can also estimate a motion direction of the terminal by calculating respectively a motion vector value of blocks included in each of the two consecutively taken pictures. For instance, if two consecutively taken pictures are segmented into 8*8 blocks, and the block (1, 1) of the formerly taken picture matches the block (4, 5) of the latterly taken picture, the processor 140 calculates a motion vector value by eliminating a vector value of the block (1, 1) from a vector value of the block (4, 5) using the latterly taken picture as a reference picture. Then, the processor 140 estimates the motion direction of the terminal using the calculated motion vector value. - In addition, similar to the above description with respect to matching blocks, the
processor 140 may also use blocks that have a highest correlation among each other. For example, when a block (2, 2) in the latterly taken picture has a highest correlation with a block (3, 2) in the formerly taken picture, the processor 140 arranges the formerly and latterly taken pictures with reference to the blocks having the highest correlation and then estimates the motion direction as the ‘right direction’ or ‘east direction’ (i.e., the latterly taken picture is right or east of the formerly taken picture). - In still another example, when a block (1, 3) in the latterly taken picture has a highest correlation with a block (3, 1) in the formerly taken picture, the
processor 140 arranges the formerly and latterly taken pictures with reference to the block having the highest correlation and then estimates the motion direction as ‘southeast direction’. - Turning next to
FIG. 3C , which is a flowchart of a detailed method of estimating a motion direction of the terminal according to another embodiment of the present invention. As shown, the picture segmenting unit 142 segments the taken picture into a plurality of blocks (S321) (similar to the step S312 in FIG. 3A). Then, the processor 140 selects two consecutively taken pictures from a plurality of taken pictures and sets a reference block in a formerly taken picture of the two pictures. That is, the formerly taken picture corresponds to a picture taken before the latterly taken picture. - The
processor 140 then selects a random block among the divided blocks of the formerly taken picture as a reference block (S323), and searches the latterly taken picture to determine if the same reference block exists therein (S325). In addition, the processor 140 preferably selects the block located at a center among the plurality of blocks as the reference block. - Further, the
processor 140 decides a presence or non-presence of the reference block in the latterly taken picture using a block matching algorithm. In more detail, the block matching algorithm is a method of segmenting a picture into blocks equal in size and representing all pixels within each of the blocks as a motion vector. Then, the block matching algorithm finds a block most similar to a block of the latterly taken picture by moving blocks of the formerly taken picture one pixel at a time to find a motion vector. As a matching reference, a mean absolute error, a mean squared error, etc. may be used. - Thus, the
processor 140, which previously calculated and stored an overall pixel value for the reference block, calculates an overall pixel value for each block of the latterly taken picture and compares these pixels values to determine if the reference block in the formerly taken picture also exists in the latterly taken picture. - When the
processor 140 determines the reference block does not exist (No in S325), the processor 140 searches the blocks in the latterly taken picture for a block having a highest correlation with the reference block (S327). The processor 140 searches for the block having the highest correlation with the reference block using the block matching algorithm. The processor 140 then estimates the motion direction of the terminal with reference to a position of the block having the highest correlation (S329). - When the
processor 140 determines the reference block does exist (Yes in S325), the processor 140 estimates the motion direction of the terminal with reference to a position of the reference block (S329). The motion direction is estimated in a similar manner as that discussed above with respect to FIGS. 3A and 3B. - In more detail, when the formerly and latterly taken pictures are divided into 9 blocks (3*3 blocks), the block (2, 2) in the formerly taken picture is set as the reference block, and the block (1, 3) in the latterly taken picture matches or has the highest correlation with the reference block, the
processor 140 decides that a position of the reference block has moved to (1, 3) from (2, 2). Thus, the processor 140 estimates the motion direction to be in a direction opposite to the moving direction of the reference block. - In another example, when the block (2, 2) in the formerly taken picture is set as the reference block and the
processor 140 determines the block (3, 2) in the latterly taken picture matches the reference block, the processor 140 decides that a position of the reference block has moved to (3, 2) from (2, 2). The processor 140 then estimates the motion direction of the terminal as being opposite to the moving direction of the reference block. - A similar process is used for blocks having a highest correlation among each other. For instance, when the
processor 140 sets the block (2, 2) in the formerly taken picture as the reference block and determines the block (1, 3) in the latterly taken picture has the highest correlation with the reference block, the processor 140 decides that a position of the reference block has moved to (1, 3) from (2, 2). The processor 140 then estimates a direction opposite to the moving direction of the reference block as the motion direction. - In another example, when the
processor 140 sets the block (2, 2) in the formerly taken picture as the reference block and determines the block (3, 2) in the latterly taken picture has the highest correlation with the reference block, the processor 140 decides that a position of the reference block has moved to (3, 2) from (2, 2). The processor 140 then estimates a direction opposite to the moving direction of the reference block as the motion direction. - Turning next to
FIG. 3D , which is a flowchart of a method of estimating a motion direction of the terminal according to yet another embodiment of the present invention. FIG. 1 will also be referred to in this description. As shown, similar to the previous embodiments, the picture segmenting unit 142 segments the taken picture into a plurality of blocks (S332). - The
processor 140 then determines whether or not there exists a block (hereinafter called a ‘common block’) that is common to two consecutively taken pictures (S334). For instance, the processor 140 calculates and stores a pixel value for each block in the formerly taken picture and in the latterly taken picture of the two consecutively taken pictures. Then, the processor 140 compares the calculated pixel values of the formerly taken picture with the calculated pixel values of the latterly taken picture to determine if the common block exists in the formerly and latterly taken pictures. - If the
processor 140 determines the common block exists, the processor 140 searches for a position of the common block in each of the formerly and latterly taken pictures (S336). The processor 140 then calculates a motion vector value of the searched common block (S338). - For example, assume the formerly and latterly taken pictures include 9 blocks (3*3 blocks). In addition, the
processor 140 can recognize a position value of each of the 9 blocks as an (x, y) coordinate value on an X-Y coordinate plane. In particular, the processor 140 determines a position value (1, 1) of a random block as a coordinate value (1, 1) on the X-Y coordinate plane. - Thus, if the block (1, 3) in the formerly taken picture and the block (3, 1) in the latterly taken picture are the common blocks, the
processor 140 calculates a motion vector value with reference to the common blocks by eliminating a vector value (1, 3) of the block (1, 3) of the formerly taken picture from a vector value (3, 1) of the block (3, 1) of the latterly taken picture. In addition, if the processor 140 determines a plurality of common blocks exist, the processor 140 calculates all motion vector values for the respective common blocks. - Meanwhile, when the
processor 140 determines no common blocks exist (No in S334), the processor 140 searches the blocks of the formerly and latterly taken pictures for blocks having the highest correlation among each other (S340). That is, similar to the embodiment discussed above, the processor 140 recognizes blocks having the most similar pixel value as the blocks having the highest correlation. - Subsequently, the
processor 140 calculates a motion vector value of the searched block having the highest correlation (S342). Note that step S342 is similar to the process discussed in step S338. Next, the processor 140 estimates the motion direction of the terminal with reference to the calculated motion vector value (S344). - In more detail, the
processor 140 estimates the motion direction by considering a size and direction of the calculated motion vector value. Further, when there are a plurality of common blocks and thus a motion vector value for each common block, the processor 140 calculates an average of the plurality of the motion vector values and estimates the motion direction by considering a size and direction of the calculated average motion vector value. - Turning next to
FIG. 4 , which is a block diagram of a mobile communication terminal for estimating a motion direction of the terminal according to another embodiment of the present invention. As shown, the terminal includes the input unit 110, the camera 120, the memory 130, the processor 140, the picture segmenting unit 410 and the display 150 similar to that as shown in FIG. 1. In addition, the terminal in FIG. 4 also includes a call connecting unit 440, and the processor 140 further includes a character extracting unit 420 and a drive unit 430. - Next,
FIG. 5 is a flowchart illustrating a method of extracting and displaying information corresponding to the estimated motion direction of the terminal. FIG. 4 will also be referred to in this description. As shown, character data corresponding to a motion direction of the terminal is stored in the memory 130 (S510). That is, the memory 130 stores all character information, languages, symbols, numerals, etc., supported and provided by the terminal. - Moreover, the
memory 130 stores the character information, languages, symbols, numerals, etc. in a format that corresponds to motion directions. For instance, the memory 130 may store character data such as ‘’ indicating a ‘right and downward’ direction, ‘’ indicating a ‘downward and right’ direction, ‘’ indicating a ‘right, downward and right’ direction, etc. Further, the memory 130 may store character data such as ‘’ indicating a ‘downward, upward and right’ direction, ‘⊥’ indicating a ‘downward, left and right’ direction, etc. - Further, similar to the embodiment discussed above, the user can select a function indicating specific information according to the motion direction of the terminal by selecting a key or keys on the input unit 110 (S520). The
camera 120 then takes a plurality of pictures (S530), the pictures are stored in the memory 130, and the processor 140 estimates the motion direction of the terminal using the plurality of the taken pictures (S550) in a similar manner as discussed above. - Meanwhile, the
character extracting unit 420 extracts character data corresponding to the estimated motion direction from the character data stored in the memory 130 (S560). As discussed above, the character data accurately indicates the estimated motion direction. For instance, when the processor 140 estimates the motion direction of the terminal as being a ‘downward, downward, downward, right and right’ direction, the character extracting unit 420 recognizes the motion direction as ‘downward and right’ and then extracts ‘’ from the memory 130. - Similarly, when the
processor 140 estimates the motion direction as being a ‘right, right, right, downward, downward and downward’, the character extracting unit 420 recognizes the motion direction as ‘right and downward’ and then extracts ‘’ from the memory 130. Also, when the processor 140 estimates the motion direction as being a ‘downward, downward, downward, upward, upward, right and right’, the character extracting unit 420 recognizes the motion direction as ‘downward, upward and right’ and then extracts ‘’ from the memory 130. - Further, when the
processor 140 estimates the motion direction as being a ‘downward, downward, left, left, right, right, right and right’, the character extracting unit 420 recognizes the motion direction as ‘downward, left and right’ and then extracts ‘⊥’ from the memory 130. The display 150 then displays the extracted character (S570). For instance, if the extracted character data are ‘’ and ‘’, the display 150 displays ‘’. If the extracted character data are ‘’ and ‘⊥’, the display 150 displays ‘’. Other symbols such as arrows, etc. may be used to display the estimated direction of the terminal. - Turning next to
FIG. 6A , which is a flowchart illustrating a method of using extracted character data according to an embodiment of the present invention. FIG. 4 will also be referred to in this description. As shown in FIG. 6A, the processor 140 determines whether or not the extracted character data indicates a prescribed function set in the terminal (S612). - For example, when the extracted character data indicates a name of a prescribed function set in the terminal or an abbreviated number corresponding to the prescribed function, the
processor 140 determines that the prescribed function is indicated (Yes in S612). For instance, if the extracted character data is ‘message’, the processor 140 determines a message function set in the terminal is to be executed. - Similarly, if the character data is a number (e.g., the number “7”) corresponding to the message function, the
processor 140 determines the message function is to be executed. Thus, when the processor 140 determines the character data indicates a prescribed function, the input unit 110 generates an execution signal for executing the prescribed function (S614). - The
input unit 110 generates an execution signal according to a user's selection. However, if a generation of the execution signal is set to ‘default’, the input unit 110 automatically generates the execution signal. The drive unit 430 then executes the prescribed function according to the execution signal (S616). That is, the drive unit 430 includes drive programs for all functions set in the terminal so as to execute the prescribed function. - Next, another method of using extracted character data according to an embodiment of the present invention will be described with reference to
FIG. 6B . FIG. 4 will also be referred to in this description. As shown in FIG. 6B, the processor 140 determines whether or not the extracted character data indicates a contact party stored in the terminal (S622). - If the extracted character data indicates the name of a contact party stored in the terminal or a number corresponding to the prescribed name (Yes in S622), the
input unit 110 generates a call connection signal to the contact party corresponding to the prescribed name (S624). - That is, the
input unit 110 generates the call connection signal according to a user's selection. However, if the generation of the call connection signal is set to ‘default’, theinput unit 110 automatically generates the call connection signal. Then, thecall connecting unit 440 connects a call to the contact party corresponding to the name (S626). - Turning next to
FIG. 7, which is a flowchart of a method of displaying pages of information corresponding to the estimated motion direction of the terminal according to still another embodiment of the present invention. FIG. 1 will also be referred to in this description.
- Note that steps S710, S720, S730, and S740 are the same as the corresponding steps S210, S220, S230, and S240 in FIG. 2. Accordingly, a detailed explanation of these steps in FIG. 7 will not be repeated.
- In addition, in step S750, the
processor 140 determines whether or not the display 150 is currently displaying only partial information out of a total amount of information (S750). In more detail, the total amount of information may be a plurality of pages, and the partial information may be a specific one of the plurality of pages.
- For instance, when the display 150 displays a phone number presented on a specific page among a plurality of pages of phone numbers stored in the terminal, the processor 140 determines that partial information is currently being displayed.
- Further, the total amount of information may be a whole picture, and the partial information may be a portion of the whole picture. For instance, if the display 150 displays a specific area of a whole country map or a specific part of a whole photograph, the processor 140 determines that partial information is currently being displayed.
- In addition, the total amount of information may be information for a plurality of functions set in the terminal, and the partial information may be information for a specific one of the plurality of functions. For instance, if the display 150 currently displays information for a ‘message function’ from among a plurality of functions set in the terminal, the processor 140 determines that partial information is currently being displayed. Further, the information for the message function includes a received message list, a sent message list, specific message content, etc.
- Also, if the
processor 140 determines the display 150 is currently displaying partial information (Yes in S750), the display 150 displays other partial information corresponding to the estimated motion direction. For example, if the estimated motion direction is an ‘upward direction’, the display 150 displays a page preceding the currently displayed page.
- Further, if the estimated motion direction is a ‘right direction’, the display 150 displays the last page of the plurality of pages by moving away from the currently displayed page. In addition, the page moving direction corresponding to the estimated motion direction can be previously set in the mobile terminal or can be set by a user.
- Thus, according to the page moving direction corresponding to the estimated motion direction, the
processor 140 controls the display 150 to display another page moved from the currently displayed page. For example, if the estimated motion direction is an ‘upward direction’, the display 150 displays an area of the whole country map located in the ‘upward direction’ from the currently displayed area.
- Alternatively, according to the estimated motion direction and distance, the display 150 displays an area moved from the area currently displayed on the whole country map. Likewise, according to the estimated motion direction and distance, the processor 140 controls the display 150 to display another portion of the whole picture, moved from the currently displayed partial picture.
- In another example, if the estimated motion direction is an ‘upward direction’, the
display 150 stops displaying information for the currently displayed function and instead displays information for the function corresponding to the ‘upward direction’. That is, when a motion direction is estimated, the processor 140, in which a function moving direction for each motion direction is previously set, controls the display 150 to display information for the function selected according to the estimated motion direction.
- Accordingly, the present invention provides the following effects or advantages.
- First, the motion direction of the terminal can be estimated using a plurality of pictures taken via a photographing device provided with the terminal. Second, because the motion direction is estimated, a specific character corresponding to the estimated motion direction can be displayed.
- Third, because the motion direction of the terminal is estimated, partial information can be displayed to correspond to the estimated motion direction. Fourth, because character data corresponding to an estimated motion direction is extracted, a specific function indicated by the extracted character data can be executed.
- Therefore, the present invention advantageously estimates the motion direction of the terminal and displays information according to the estimated motion direction (and thus the user can see the information according to the estimated motion direction), estimates the motion direction and activates a specific function on the terminal based on the estimated motion direction of the terminal (and thus the user need only move the terminal in a certain direction to activate a specific function), and estimates the motion direction and displays different pages of information based on the motion direction of the terminal (and thus the user can page through a plurality of pages by moving the phone in a certain direction).
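The page-moving behavior described above, in which an ‘upward direction’ moves to the preceding page and a ‘right direction’ jumps to the last page, can be sketched as follows. This is an illustrative sketch only; the function name, direction strings, and clamping behavior are assumptions, not identifiers from the disclosed terminal.

```python
# Hypothetical sketch of mapping an estimated motion direction to a page move.
# Direction-to-page mappings follow the examples in the description; in the
# disclosed terminal they may be preset or user-configured.

def next_page(current_page: int, total_pages: int, direction: str) -> int:
    """Return the index of the page to display for an estimated motion direction."""
    if direction == "up":        # 'upward direction' -> preceding (front) page
        return max(current_page - 1, 0)
    if direction == "down":      # opposite direction -> following page
        return min(current_page + 1, total_pages - 1)
    if direction == "right":     # 'right direction' -> jump to the last page
        return total_pages - 1
    if direction == "left":      # assumed symmetric mapping -> first page
        return 0
    return current_page          # unrecognized direction: stay on this page

print(next_page(3, 10, "up"))     # 2
print(next_page(3, 10, "right"))  # 9
```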
- It will be apparent to those skilled in the art that various modifications and variations can be made in the present invention without departing from the spirit or scope of the invention. Thus, it is intended that the present invention covers the modifications and variations of this invention provided they come within the scope of the appended claims and their equivalents.
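The map- and picture-panning behavior described in the foregoing embodiments, in which the displayed area moves according to the estimated motion direction and distance, can be sketched as follows. All names, the pixel units, and the clamping policy are assumptions for illustration; the disclosure does not specify them.

```python
# Illustrative sketch of panning a partial view (e.g. a displayed map area)
# by an estimated motion direction and distance. The viewport is kept inside
# the bounds of the whole picture by clamping.

def pan_viewport(x, y, direction, distance, view_w, view_h, map_w, map_h):
    """Return the new top-left corner (x, y) of the displayed area.

    (x, y)      -- current top-left corner of the viewport
    direction   -- estimated motion direction: 'up', 'down', 'left', 'right'
    distance    -- estimated motion distance, in the same units as the map
    """
    dx = {"left": -distance, "right": distance}.get(direction, 0)
    dy = {"up": -distance, "down": distance}.get(direction, 0)
    new_x = min(max(x + dx, 0), map_w - view_w)   # clamp horizontally
    new_y = min(max(y + dy, 0), map_h - view_h)   # clamp vertically
    return new_x, new_y

print(pan_viewport(100, 100, "up", 30, 320, 240, 1000, 1000))  # (100, 70)
```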
Claims (24)
1. A mobile communication terminal, comprising:
a camera configured to take a plurality of pictures;
a processor configured to estimate a motion direction of the terminal using at least some of the plurality of pictures; and
a display configured to display information corresponding to the estimated motion direction.
2. The mobile terminal of claim 1 , wherein the plurality of pictures comprise at least a formerly taken picture and a latterly taken picture that are consecutively taken, and
wherein the processor divides the formerly taken picture into a plurality of first blocks and divides the latterly taken picture into a plurality of second blocks.
3. The mobile terminal of claim 2 , wherein the processor determines if at least one block from the plurality of first blocks matches at least one block from the plurality of second blocks, and if the blocks match, estimates the motion direction of the terminal by arranging the formerly and latterly taken pictures together with matching blocks being used as a reference.
4. The mobile terminal of claim 3 , wherein if the processor determines the at least one block in the first blocks does not match the at least one block in the second blocks, the processor determines which block from the first blocks has a highest correlation with a block from the second blocks, and uses the blocks with the highest correlation as the reference when estimating the motion direction of the terminal.
5. The mobile terminal of claim 2 , wherein the processor selects a reference block in the first blocks and determines whether the reference block is also included in the second blocks, and if the reference block is in both of the first and second blocks, the processor estimates the motion direction of the terminal by arranging the formerly and latterly taken pictures together with reference blocks being used as a reference.
6. The mobile terminal of claim 5 , wherein if the processor determines the reference block is not also included in the second blocks, the processor determines which block from the first blocks has a highest correlation with a block from the second blocks, and uses the blocks with the highest correlation as the reference when estimating the motion direction of the terminal.
7. The mobile terminal of claim 2 , wherein the processor determines whether a common block exists in both of the first and second blocks, and if the common block exists in both of the first and second blocks, the processor determines a position of the common block in both of the first and second blocks, calculates a motion vector indicating a movement of the common block from the position in the first blocks to the position in the second blocks, and estimates the motion direction of the terminal based on the calculated motion vector.
8. The mobile terminal of claim 7 , wherein if the processor determines the common block does not exist in both of the first and second blocks, the processor determines which block from the first blocks has a highest correlation with a block from the second blocks, and uses the blocks with the highest correlation as the reference when estimating the motion direction of the terminal.
9. The mobile terminal of claim 2 , further comprising a memory configured to store the plurality of pictures and information data corresponding to the estimated motion direction.
10. The mobile terminal of claim 9 , wherein the processor executes a prescribed function on the mobile terminal if the extracted information data indicates the prescribed function is to be executed or the processor displays the information data corresponding to the estimated motion direction on the display.
11. The mobile terminal of claim 9 , wherein the extracted information data comprises a plurality of pieces of information, and the processor determines if a piece of the information is currently being displayed on the display, and, if it is determined the piece of information is currently being displayed, displays a next piece of information from the plurality of pieces of information based on the estimated motion direction.
12. The mobile terminal of claim 1 , wherein the camera takes the plurality of pictures consecutively at a constant rate.
13. A method of estimating a motion direction in a mobile communication terminal, comprising:
taking a plurality of pictures with a camera on the terminal;
estimating the motion direction of the terminal using at least some of the plurality of pictures; and
displaying information corresponding to the estimated motion direction.
14. The method of claim 13 , wherein the plurality of pictures comprise at least a formerly taken picture and a latterly taken picture that are consecutively taken, and
wherein the estimating step further comprises dividing the formerly taken picture into a plurality of first blocks and dividing the latterly taken picture into a plurality of second blocks.
15. The method of claim 14 , wherein the estimating step further comprises determining if at least one block from the plurality of first blocks matches at least one block from the plurality of second blocks, and if the blocks match, estimating the motion direction of the terminal by arranging the formerly and latterly taken pictures together with matching blocks being used as a reference.
16. The method of claim 15 , wherein if the estimating step determines the at least one block in the first blocks does not match the at least one block in the second blocks, the estimating step further comprises determining which block from the first blocks has a highest correlation with a block from the second blocks, and using the blocks with the highest correlation as the reference when estimating the motion direction of the terminal.
17. The method of claim 14 , wherein the estimating step further comprises selecting a reference block in the first blocks and determining whether the reference block is also included in the second blocks, and if the reference block is in both of the first and second blocks, the estimating step further comprises estimating the motion direction of the terminal by arranging the formerly and latterly taken pictures together with reference blocks being used as a reference.
18. The method of claim 17 , wherein if the estimating step determines the reference block is not also included in the second blocks, the estimating step further comprises determining which block from the first blocks has a highest correlation with a block from the second blocks, and using the blocks with the highest correlation as the reference when estimating the motion direction of the terminal.
19. The method of claim 14 , wherein the estimating step further comprises determining whether a common block exists in both of the first and second blocks, and if the common block exists in both of the first and second blocks, the estimating step further comprises determining a position of the common block in both of the first and second blocks, calculating a motion vector indicating a movement of the common block from the position in the first blocks to the position in the second blocks, and estimating the motion direction of the terminal based on the calculated motion vector.
20. The method of claim 19 , wherein if the estimating step determines the common block does not exist in both of the first and second blocks, the estimating step further comprises determining which block from the first blocks has a highest correlation with a block from the second blocks, and using the blocks with the highest correlation as the reference when estimating the motion direction of the terminal.
21. The method of claim 14 , further comprising storing in a memory of the terminal the plurality of pictures and information data corresponding to the estimated motion direction.
22. The method of claim 21 , wherein the estimating step further comprises executing a prescribed function on the mobile terminal if the extracted information data indicates the prescribed function is to be executed or displaying the information data corresponding to the estimated motion direction.
23. The method of claim 21 , wherein the extracted information data comprises a plurality of pieces of information, and the estimating step further comprises determining if a piece of the information is currently being displayed, and, if it is determined the piece of information is currently being displayed, displaying a next piece of information from the plurality of pieces of information based on the estimated motion direction.
24. The method of claim 13 , wherein the camera takes the plurality of pictures consecutively at a constant rate.
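The block-matching procedure recited in claims 7 and 19 — locating a common block in two consecutively taken pictures and deriving a motion vector from its change in position — can be sketched as follows. This is a generic full-search illustration under assumptions the claims do not specify: frames are modeled as 2-D lists of grayscale intensities, and the sum of absolute differences (SAD) is used as the correlation criterion, with the lowest SAD treated as the highest-correlation match.

```python
# Generic full-search block matching over two consecutive grayscale frames.
# All names are illustrative; this is not the disclosed implementation.

def sad(frame, fx, fy, block, bsize):
    """Sum of absolute differences between a frame region and a block."""
    return sum(
        abs(frame[fy + j][fx + i] - block[j][i])
        for j in range(bsize) for i in range(bsize)
    )

def motion_vector(prev, curr, bx, by, bsize):
    """Find where the bsize x bsize block at (bx, by) in `prev` moved to in `curr`.

    Returns (dx, dy), the displacement of the best-matching (lowest-SAD)
    block; a terminal would infer its own motion from this displacement.
    """
    block = [row[bx:bx + bsize] for row in prev[by:by + bsize]]
    h, w = len(curr), len(curr[0])
    best = None
    for y in range(h - bsize + 1):          # exhaustive (full) search
        for x in range(w - bsize + 1):
            score = sad(curr, x, y, block, bsize)
            if best is None or score < best[0]:
                best = (score, x - bx, y - by)
    return best[1], best[2]

# Example: a bright 2x2 block shifts one pixel to the right between frames.
prev = [[0] * 4 for _ in range(4)]
curr = [[0] * 4 for _ in range(4)]
for j in (1, 2):
    for i in (0, 1):
        prev[j][i] = 255
    for i in (1, 2):
        curr[j][i] = 255

print(motion_vector(prev, curr, 0, 1, 2))  # (1, 0): block moved one pixel right
```

In practice a real implementation would restrict the search to a window around the original block position and use an early-exit or hierarchical search rather than this exhaustive scan, but the vector computation is the same.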
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020050067204A KR100628101B1 (en) | 2005-07-25 | 2005-07-25 | Mobile telecommunication device having function for inputting letters and method thereby |
KR10-2005-0067204 | 2005-07-25 |
Publications (1)
Publication Number | Publication Date |
---|---|
US20070019734A1 true US20070019734A1 (en) | 2007-01-25 |
Family
ID=37309336
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/491,147 Abandoned US20070019734A1 (en) | 2005-07-25 | 2006-07-24 | Mobile communication terminal for estimating motion direction and method thereof |
Country Status (6)
Country | Link |
---|---|
US (1) | US20070019734A1 (en) |
EP (1) | EP1748388B1 (en) |
JP (1) | JP4611257B2 (en) |
KR (1) | KR100628101B1 (en) |
CN (1) | CN100527896C (en) |
AT (1) | ATE554464T1 (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105657434A (en) * | 2016-01-20 | 2016-06-08 | 同济大学 | Big data aided video transmission method based on digital-analog hybrid |
US10426998B1 (en) * | 2014-04-25 | 2019-10-01 | Arizona Board Of Regents On Behalf Of The University Of Arizona | Portable device for movement and resistance training of the lower extremities |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2015163B1 (en) * | 2007-06-05 | 2013-04-24 | Gerhard Wergen | Method for transferring characters and device for executing such a method |
US9818196B2 (en) | 2014-03-31 | 2017-11-14 | Xiaomi Inc. | Method and device for positioning and navigating |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5973680A (en) * | 1995-02-09 | 1999-10-26 | Nec Corporation | Motion picture retrieval system |
US6564144B1 (en) * | 2002-01-10 | 2003-05-13 | Navigation Technologies Corporation | Method and system using a hand-gesture responsive device for collecting data for a geographic database |
US20040042552A1 (en) * | 2000-07-20 | 2004-03-04 | Dvorkovich Victor Pavlovich | Method and apparatus for determining motion vectors in dynamic images |
US20050033512A1 (en) * | 2003-08-05 | 2005-02-10 | Research In Motion Limited | Mobile device with on-screen optical navigation |
US20050043066A1 (en) * | 2003-08-23 | 2005-02-24 | Sang-Uk Seo | Method and system for controlling notification of call reception in a mobile terminal |
US20050048977A1 (en) * | 2003-08-26 | 2005-03-03 | Motorola, Inc. | System and method to improve WLAN handover behavior and phone battery life when stationary in border cells |
US20050151724A1 (en) * | 2003-12-29 | 2005-07-14 | Chun-Huang Lin | Pointing device and displacement estimation method |
US7301529B2 (en) * | 2004-03-23 | 2007-11-27 | Fujitsu Limited | Context dependent gesture response |
Family Cites Families (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5453800A (en) * | 1991-10-17 | 1995-09-26 | Sony Corporation | Apparatus for judging a hand movement of an image |
JPH10240436A (en) * | 1996-12-26 | 1998-09-11 | Nikon Corp | Information processor and recording medium |
US6288704B1 (en) * | 1999-06-08 | 2001-09-11 | Vega, Vista, Inc. | Motion detection and tracking system to control navigation and display of object viewers |
JP3853542B2 (en) * | 1999-07-26 | 2006-12-06 | パイオニア株式会社 | Image processing apparatus, image processing method, and navigation apparatus |
JP3412592B2 (en) * | 2000-02-08 | 2003-06-03 | 松下電器産業株式会社 | Personal information authentication method |
JP2002259045A (en) * | 2001-02-26 | 2002-09-13 | Ntt Docomo Inc | Method and device for inputting handwritten data, method and device for inputting movement data, and method and device for authenticating individual |
JP4240859B2 (en) * | 2001-09-05 | 2009-03-18 | 株式会社日立製作所 | Portable terminal device and communication system |
WO2004066615A1 (en) * | 2003-01-22 | 2004-08-05 | Nokia Corporation | Image control |
JP2004312194A (en) * | 2003-04-03 | 2004-11-04 | Konica Minolta Photo Imaging Inc | Mobile phone with pointing device function and mobile phone |
JP2004318793A (en) * | 2003-04-17 | 2004-11-11 | Kenichi Horie | Information terminal based on operator's head position |
JP4036168B2 (en) * | 2003-09-09 | 2008-01-23 | 株式会社日立製作所 | mobile phone |
JP2005092839A (en) * | 2003-09-16 | 2005-04-07 | Kenichi Horie | Information terminal utilizing operator's minute portion as base |
JP2005092419A (en) * | 2003-09-16 | 2005-04-07 | Casio Comput Co Ltd | Information processing apparatus and program |
JP2005173877A (en) * | 2003-12-10 | 2005-06-30 | Sony Ericsson Mobilecommunications Japan Inc | Personal digital assistant |
JP4047822B2 (en) * | 2004-02-27 | 2008-02-13 | ソフトバンクモバイル株式会社 | Electronics |
2005
- 2005-07-25 KR KR1020050067204A patent/KR100628101B1/en not_active IP Right Cessation
2006
- 2006-07-24 US US11/491,147 patent/US20070019734A1/en not_active Abandoned
- 2006-07-25 CN CNB2006101085197A patent/CN100527896C/en not_active Expired - Fee Related
- 2006-07-25 EP EP06015521A patent/EP1748388B1/en not_active Not-in-force
- 2006-07-25 AT AT06015521T patent/ATE554464T1/en active
- 2006-07-25 JP JP2006202597A patent/JP4611257B2/en not_active Expired - Fee Related
Also Published As
Publication number | Publication date |
---|---|
EP1748388A3 (en) | 2010-07-07 |
EP1748388A2 (en) | 2007-01-31 |
CN100527896C (en) | 2009-08-12 |
KR100628101B1 (en) | 2006-09-26 |
JP2007035040A (en) | 2007-02-08 |
ATE554464T1 (en) | 2012-05-15 |
CN1905722A (en) | 2007-01-31 |
JP4611257B2 (en) | 2011-01-12 |
EP1748388B1 (en) | 2012-04-18 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: LG ELECTRONICS INC., KOREA, REPUBLIC OF Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:LEE, SEUNG MIN;REEL/FRAME:018123/0974 Effective date: 20060718 |
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |