WO2007000491A1 - Method, device, and program product for visually indicating a selected object, using a device equipped with camera - Google Patents
- Publication number
- WO2007000491A1 (PCT/FI2006/050279)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- text
- subject
- map
- indicator
- Prior art date
Classifications
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/18—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 for optical projection, e.g. combination of mirror and condenser and objective
- G02B27/20—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 for optical projection, e.g. combination of mirror and condenser and objective for imaging minute objects, e.g. light-pointer
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M1/00—Substation equipment, e.g. for use by subscribers
- H04M1/72—Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
- H04M1/724—User interfaces specially adapted for cordless or mobile telephones
- H04M1/72403—User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
- H04M1/72427—User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality for supporting games or graphical animations
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/36—Input/output arrangements for on-board computers
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B17/00—Details of cameras or camera bodies; Accessories therefor
- G03B17/02—Bodies
- G03B17/06—Bodies with exposure meters or other indicators built into body but not connected to other camera members
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B21/00—Projectors or projection-type viewers; Accessories therefor
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/10—Image acquisition
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V30/00—Character recognition; Recognising digital ink; Document-oriented image-based pattern recognition
- G06V30/10—Character recognition
- G06V30/14—Image acquisition
- G06V30/1444—Selective acquisition, locating or processing of specific regions, e.g. highlighted text, fiducial marks or predetermined fields
- G06V30/1456—Selective acquisition, locating or processing of specific regions, e.g. highlighted text, fiducial marks or predetermined fields based on user interactions
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V30/00—Character recognition; Recognising digital ink; Document-oriented image-based pattern recognition
- G06V30/10—Character recognition
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M2250/00—Details of telephonic subscriber devices
- H04M2250/10—Details of telephonic subscriber devices including a GPS signal receiver
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M2250/00—Details of telephonic subscriber devices
- H04M2250/12—Details of telephonic subscriber devices including a sensor for measuring a physical value, e.g. temperature or motion
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M2250/00—Details of telephonic subscriber devices
- H04M2250/52—Details of telephonic subscriber devices including functional features of a camera
Abstract
The invention relates to a method and device for visually indicating a selected object (P) in a desired subject (M) using a device (10) equipped with a camera (16), in which an image is taken of the subject and analysed in order to find the occurrence of the selected object (P) in the image, and the orientation parameters of the indicator are calculated accordingly.
Description
METHOD, DEVICE, AND PROGRAM PRODUCT FOR VISUALLY INDICATING A SELECTED OBJECT, USING A DEVICE EQUIPPED WITH CAMERA
The present invention relates to a method, device, and program product for visually indicating a selected object in a desired subject, in which a bit image is taken of the subject and analysed in order to find the occurrence of the selected object from the bit image.
It is quite difficult to find a name and its related telephone number in a densely written list. Similarly, it is difficult to search for a specific street on a map. Another difficult situation is when someone has a map in their hand, but does not know where they are on the map.
The present invention is intended to create a new method and device for solving the aforementioned problems. The characteristic features of the method according to the invention are stated in Claim 1 and correspondingly the characteristic features of the device are stated in Claim 8. In addition, the characteristics of the program product designed to implement the method are stated in Claim 12.
The invention combines a camera and projector in such a way that some indicator, for example a frame, point, or arrow, can be projected onto the subject photographed by the camera. The subject can be, for example, a list or map. In the method, the camera is first of all used to take an image, which is analysed and, on the basis of the analysis, accentuation is reflected by a projector onto the imaged subject. In this case, OCR (Optical Character Recognition) technology, which is as such known, can be exploited to detect text from a bit image. In OCR scanning according to the invention, the bit positions corresponding to the detected text are also recorded. Equally, it is possible to use other known pattern recognition, as long as the pattern can be defined and it can appear in the subject.
The procedure can operate as follows. Because the camera and projector are oriented in the same direction, i.e. the viewing images point in the same direction, the projection can be created in the same direction in which the detail is seen in the camera. The projected image does not appear in precisely the same location as the detected detail, because the projector cannot be in precisely the same place in the device as the camera is. It is possible to determine how many degrees the position of the image differs from that of the projection and arrange accordingly the angle of the indicator being projected. The distance of the subject from the device can be defined for this correction. If several distances are defined from different sides of the subject, a virtual three-dimensional model of it can be created.
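The angular correction described above can be sketched as a small calculation; the baseline and distance figures below are illustrative assumptions, not values from the patent.

```python
import math

def parallax_correction(baseline_m, distance_m):
    """Angle (in radians) by which the projected indicator must be
    turned so that it lands where the camera sees the detected detail.
    baseline_m is the camera-to-projector lens offset inside the
    device; distance_m is the measured distance to the subject."""
    return math.atan2(baseline_m, distance_m)

# Lenses 3 cm apart, subject 50 cm away: a correction of about 3.4 degrees.
theta = math.degrees(parallax_correction(0.03, 0.50))
```

The farther the subject, the smaller the correction, which is one reason the description also proposes measuring the distance.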
OCR technology can be used to scan typewritten text in the image area into a character form. At the same time, a sufficient number of the bit positions of the detected text are stored.
After this procedure, it is possible to perform text retrieval from the scanned text using selected character strings, for example a name. When a hit is found, the bit position corresponding to it is recorded for later use.
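One minimal sketch of such a store, assuming hypothetical OCR output as (text line, bit position) pairs; the names and positions are illustrative:

```python
# Each detected line of text together with its bit-image position
# (x, y, width, height) -- the positions recorded during OCR scanning.
ocr_hits = [
    ("Anderson  040-1234567", (12, 40, 300, 18)),
    ("Virtanen  050-7654321", (12, 60, 300, 18)),
]

def find_object(query, hits):
    """Return the bit position of the first scanned line containing
    the query string, or None when there is no hit."""
    q = query.lower()
    for text, bit_pos in hits:
        if q in text.lower():
            return bit_pos
    return None

print(find_object("Virtanen", ocr_hits))  # (12, 60, 300, 18)
```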
The user can input text, speech, or other input to the search application. The input methods can be augmented by using scanned text data or images (patterns). When a hit is found from the scanned text, a speech recognition element, for example, can be used to limit the number of alternatives in ambiguous situations.
Use of the search application ranges from searching for contact data in a dense list to finding a specific street on a map.
In the following, the invention is examined with reference to the accompanying drawings, which show some embodiments of the invention.
Figures 1a and 1b show the use of the device according to the invention in determining a location on a map,
Figures 2a and 2b show a way to search for a name from a list,
Figure 3 shows the device being used in levelling,
Figure 4 shows the mutual geometry of the camera and the projector, and
Figure 5 shows a flow chart of a search for a name from a list.
In the figures, the device according to the invention is marked with the reference number 10. It can be a mobile station or a PDA device. In the device 10, there is generally a microprocessor, a program and data memory, a display 14, and a keypad 12 and/or other I/O means. In terms of the present invention, it is essential that there is a camera 16 and a miniature laser projector 20 in the device. In the application of Figures 1a and 1b, there are also GPS positioning means in the device. In the device 10, there is an operating system, in which an application program is installed, which performs imaging and projection, as well as image analysis.
The miniature laser projector shown is made by Symbol Technologies Inc., USA (http://www.symbol.com/products/oem/lpd.htmlftl).
In the application of Figures 1a and 1b, the user has a map M, the position of the imaging of which is input to the application (not shown). This can take place either by entering mechanically the values of the detectable co-ordinates, or even by reading such information directly from the map, with the aid of OCR recognition. In Figure 1a, once the user has started the application, it takes an image of the map, which it stores as a bit image in the memory. On the other hand, the absolute position of the device 10 can be identified using a GPS-positioning device. The application calculates the position data to the bit image, from which it is possible in turn to calculate the parameters for the optical indicator, in this case for the control of the laser projector. According to Figure 1b, the device shows the position "You are here" on the map M, with the aid of a laser dot, square, or other similar mark.
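The position-to-bit-image calculation can be sketched as a linear interpolation, assuming a north-up, roughly rectilinear map whose corner coordinates have been entered or read by OCR; all names and values below are illustrative assumptions.

```python
def gps_to_bit_image(lat, lon, corner_nw, corner_se, width_px, height_px):
    """Interpolate a GPS fix into bit-image pixel coordinates.

    corner_nw / corner_se: (lat, lon) of the map's north-west and
    south-east corners. Assumes a north-up, rectilinear map image.
    """
    lat_nw, lon_nw = corner_nw
    lat_se, lon_se = corner_se
    x = (lon - lon_nw) / (lon_se - lon_nw) * width_px
    y = (lat_nw - lat) / (lat_nw - lat_se) * height_px
    return x, y

# Device at 60.175 N, 24.95 E on a map covering 60.15-60.20 N, 24.90-25.00 E,
# imaged at 640 x 480 pixels: the fix lands near the centre of the bit image.
x, y = gps_to_bit_image(60.175, 24.95, (60.20, 24.90), (60.15, 25.00), 640, 480)
```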
The above embodiment can be adapted, for instance, in such a way that, instead of the GPS positioning, the local street name is entered and, on the other hand, texts, i.e. character strings, are detected from the bit image using OCR technology and are stored with the position data. The input is compared with the OCR-recognized texts, in which case a hit will give the position data of the street in question. With the aid of this, the projection parameters are calculated as above.
In Figures 2a and 2b, a task is resolved, in which a telephone number is sought from a dense list of names L. According to Figure 2a, an image is taken of the list of names and stored as a bit image in the memory. The application can then perform effective OCR recognition and store the names and telephone numbers as character strings, as well as the position in the bit image of each character string. The selected name is given as input, which the application retrieves from the stored character strings. The character string according to the hit also gives the position data, by means of which the control parameters are defined, for showing the laser image (for example, a rectangle) at the name.
In preferred examples, the precision of the projection can be increased by re-imaging the subject after analysis, which retrieves the change in position of the camera immediately prior to projecting. An approximate and thus rapid image analysis is sufficient for this.
A virtual ruler is shown in Figure 3.
When the shapes of the objects are then detected in the field of vision of the camera, it is possible to calculate how a ruler would be positioned on the planar surfaces of the shapes and a ruler can be projected onto the surface. If information on the orientation of the camera is added to this, the ruler BP can be set on the surface either horizontally or vertically, as required. The orientation information is obtained either from the rest of the structure, or by using an angle sensor 22 contained in the device.
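A minimal sketch of the tilt compensation, assuming the angle sensor reports the device's roll in degrees and the ruler is drawn in projector coordinates; all names and values are illustrative.

```python
import math

def ruler_endpoints(tilt_deg, length_px, center=(320, 240)):
    """End points of a projected ruler line, counter-rotated by the
    device tilt so that the line stays horizontal on the surface
    even when the device itself is tilted."""
    a = math.radians(-tilt_deg)
    dx = math.cos(a) * length_px / 2
    dy = math.sin(a) * length_px / 2
    cx, cy = center
    return (cx - dx, cy - dy), (cx + dx, cy + dy)

# Device rolled 10 degrees: the projected ruler is rotated -10 degrees
# in projector coordinates so it remains level on the wall.
p1, p2 = ruler_endpoints(10.0, 200)
```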
The fact that the projector and camera can never be placed at the same point, as can be seen from Figure 4, leads to a certain problem. In the device 10, the lenses of the camera 16 and the projector 20 are at a distance from each other. The camera 16 has an angle of vision 16' while the projector 20 has an uncorrected projection angle 20'. The corrected projection angle 20" strikes the same area in the subject M only at a specific distance. Thus, it is useful to measure the distance, in order to increase the precision of the projection. According to the measured distance, the projection can be rotated so that the centre of projection and the camera's imaging centre coincide. With the aid of the distance measurement, the power of the laser projector can also be adjusted optimally. The projected dot or image can be exploited in the distance measurement. The distance of the subject can be measured with the aid of a simple geometric equation, if the projection angle and imaging angle (from the image positions) and the geometry of the device are known.
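The simple geometric equation mentioned above can be sketched as follows, under the assumption that the projector fires along its own optical axis and the camera measures the angular offset at which the projected dot appears; the numbers are illustrative, not from the patent.

```python
import math

def distance_from_dot(baseline_m, dot_angle_rad):
    """Distance to the subject from the parallax of the projected dot.

    With parallel optical axes, the projector fires straight ahead and
    the camera, offset by baseline_m, sees the dot at dot_angle_rad off
    its own axis, so tan(angle) = baseline / distance.
    """
    return baseline_m / math.tan(dot_angle_rad)

# Lenses 3 cm apart, dot observed atan(0.06) (about 3.4 degrees) off-axis:
d = distance_from_dot(0.03, math.atan(0.06))
print(round(d, 3))  # 0.5
```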
The operation of the device takes place with the aid of an application program arranged on top of the operating system. The flow chart of Figure 5 shows the operation of Figures 2a and 2b. Once the program has started, it branches into an imaging function 100 and an input function 101, in which the user is asked for a name from a displayed list. In the imaging function 100, the user is expected to aim the camera at the list and trigger imaging, in which case the image that is created is stored in the device's memory as a bit image. This is followed by text recognition, stage 102, in which OCR technology is used to search for written characters, i.e. lines of text (or other agreed patterns), from the bit image stored in the memory area. The detected character strings and the bit positions corresponding to them are stored.
After this, the name input is compared with the text in the text lines, stage 103. This is a simple text-search routine. In connection with a hit, the position values corresponding to the hit are also stored. These values are used to define the orientation parameters of the laser projector, stage 104.
Finally, a command is given to the laser projector to project a preselected pattern onto the subject, stage 105.
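The flow of stages 100-105 can be sketched end to end; `capture_bit_image`, `ocr_lines`, and `project_pattern` stand in for hypothetical device primitives that the patent does not name.

```python
def indicate_name(name, capture_bit_image, ocr_lines, project_pattern):
    """Sketch of the Figure 5 flow: image the list, OCR it, search for
    the name, and project a rectangle at the hit's bit position."""
    bit_image = capture_bit_image()              # stage 100: imaging
    hits = ocr_lines(bit_image)                  # stage 102: [(text, pos)]
    for text, bit_pos in hits:                   # stage 103: text search
        if name.lower() in text.lower():
            params = {"target": bit_pos}         # stage 104: orientation
            project_pattern("rectangle", params) # stage 105: project
            return bit_pos
    return None                                  # no hit: nothing projected
```

The name input of stage 101 is passed in as the `name` argument here; in the device it would come from the keypad or speech recognition.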
In an improved variation, before projection the angle position of the subject is detected using approximate and thus rapid optical detection. This is used to correct the orientation parameters, if the device has turned relative to the original imaging position.
The method according to the invention can be implemented using a program product in a device, which comprises a computer with memories and an I/O element. The program product comprises a computer-readable memory medium, in which there are computer-readable program components, which are characterized in that the said components comprise
- a first component to be run programmatically for controlling the camera for taking an image and for storing the image obtained as a bit image of a selected size in a selected memory area of the said memory,
- a second component to be run programmatically for analysing the said bit image in order to find a selected object image from the bit image and for storing the bit positions of the found object image, and
- a third component to be run programmatically for calculating the orientation values from the said bit positions and for transmitting them to the said control unit, in order to orientate the optical indicator according to the found position of the object image.
In one embodiment, the said second program-code component contains
- a routine for entering text describing the object,
- a routine for performing OCR text recognition in order to find text objects from the bit image and at the same time for storing the positions of the found text objects, and
- a routine for comparing the text describing the object with the detected text objects and for picking the bit positions of the found text object for use in the said third component to be performed.
Claims
1. Method for visually indicating a selected object (P) in a desired subject (M) using a device (10) equipped with a camera (16), in which an image is taken of the subject, and is analysed in order to find the occurrence of the selected object (P) from the bit image, characterized in that the device (10) is equipped with an orientation-controllable optical indicator (20), and the projection of the occurrence of the object (P) in the image and according to it the orientation of the indicator (B) directed to the object (P) are defined, on the basis of which definition the object (P) is indicated in the subject (M) using the optical indicator (B).
2. Method according to Claim 1 for finding a position (P) on a map (M), characterized in that the device (10) is equipped with positioning means (GPS), in which case the geographical location of the map (M) is detected with the aid of analysis, according to which positioning data the projection of one's own position (P) and the corresponding orientation data of the optical indicator (B) are calculated.
3. Method according to Claim 1 for finding one's own position on a map, characterized in that information depicting one's own position is given as an object, which is optically detected from the subject formed by the map.
4. Method according to Claim 1 for finding a selected text object from a subject, characterized in that the analysis comprises OCR text recognition, in which case the text objects of the subject are detected while at the same time defining their position data inside the subject and comparing each found text object with the selected text objects, when the orientation data of the optical indicator are defined according to the found text object.
5. Method according to Claim 1 for creating a virtual ruler in a subject formed by a selected surface, characterized in that a set of co-ordinates according to the device are selected and the optical plane (BP) at the selected angle is indicated according to it.
6. Method according to Claim 5, characterized in that the device is equipped with an angle sensor (22) for determining an absolute set of co-ordinates.
7. Method according to any of Claims 1 - 6, characterized in that the optical indicator is a miniature-type laser projector, by means of which a dot, line, or other similar geometrical pattern is indicated in the subject.
8. Device for visually indicating a selected object in a desired subject, which device includes a camera and image analysis means for searching for the selected object from an image, characterized in that the device also includes an orientation-controllable optical indicator with control means, in which case, on the basis of the definition data of the position of the occurrence of the object, the control means are arranged to define the orientation direction of the indicator and accordingly indicate the object visually using the said indicator.
9. Device according to Claim 8 for finding one's own position on a map, characterized in that the device also comprises positioning means (GPS) and the analysis means are arranged to detect the geographical position of the map from the map image and the control means are arranged to define the corresponding data of the optical indicator on the map.
10. Device according to Claim 8 or 9, characterized in that it includes OCR text-recognition means for recognizing text objects from the subject while at the same time defining their position data and means for comparing each found text object with the selected text object and thus for defining the orientation data of the optical indicator.
11. Device according to Claim 8 for creating a virtual ruler in a subject formed by a selected surface, characterized in that the device comprises an angle sensor and control means are arranged to be controlled using the said angle sensor.
12. Device according to any of Claims 8 - 11, characterized in that the optical indicator is a miniature-type laser projector.
13. Program product for implementing the method according to Claim 1 in a device comprising a computer with memories and an I/O element, which program product comprises a computer-readable memory medium, in which computer-readable program components are stored, characterized in that the said components comprise
- a first component to be run programmatically for controlling the camera for taking an image and for storing the image obtained as a bit image of a selected size in a selected memory area of the said memory,
- a second component to be run programmatically for analysing the said bit image in order to find a selected object image from the bit image and for storing the bit positions of the found object image, and
- a third component to be run programmatically for calculating the orientation values from the said bit positions and for transmitting them to the said control unit, in order to orientate the optical indicator according to the found position of the object image.
14. Program product according to Claim 13, characterized in that the said second program-code component contains
- a routine for entering text describing the object,
- a routine for performing OCR text recognition in order to find text objects from the bit image and at the same time for storing the positions of the found text objects, and
- a routine for comparing the text describing the object with the detected text objects and for picking the bit positions of the found text object for use in the said third component to be performed.
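The three program components of Claim 13 and the text-matching routines of Claim 14 can be sketched as follows. This is a minimal illustrative Python sketch, not the patent's implementation: the image size, camera field of view, and all function names are assumptions, and the OCR routine is stubbed with fixed results where a real device would run text recognition over the captured bit image.

```python
# Hypothetical sketch of the claim-13 pipeline: capture a bit image,
# find a selected text object in it (claim 14), and convert its bit
# positions into orientation values for the optical indicator.

IMAGE_W, IMAGE_H = 640, 480   # assumed bit image of a selected size (component 1)
FOV_H, FOV_V = 60.0, 45.0     # assumed camera field of view in degrees

def find_text_objects(bit_image):
    """Component 2 (stubbed OCR): return each detected text object and
    the bit position (x, y) of its occurrence in the image."""
    # A real implementation would run OCR over `bit_image` here.
    return {"Helsinki": (480, 120), "Turku": (150, 300)}

def orientation_from_bits(x, y):
    """Component 3: map bit positions to pan/tilt orientation values,
    assuming the indicator shares the camera's optical axis."""
    pan = (x / IMAGE_W - 0.5) * FOV_H
    tilt = (0.5 - y / IMAGE_H) * FOV_V
    return pan, tilt

def indicate(selected_text, bit_image=None):
    """Compare the entered text with the detected text objects (claim 14)
    and return the orientation values to transmit to the control unit."""
    objects = find_text_objects(bit_image)
    if selected_text not in objects:
        return None   # selected object does not occur in the subject
    return orientation_from_bits(*objects[selected_text])

pan, tilt = indicate("Helsinki")
print(f"pan={pan:+.1f} deg, tilt={tilt:+.1f} deg")
```

With the assumed geometry, a text object right of centre and above centre yields a positive pan and tilt, which the control means would use to steer the laser projector onto the found text.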
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP06764518A EP1896897B1 (en) | 2005-06-29 | 2006-06-26 | Method, device, and program product for visually indicating a selected object, using a device equipped with camera |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
FI20055365 | 2005-06-29 | ||
FI20055365A FI20055365A (en) | 2005-06-29 | 2005-06-29 | Method, device and program product for displaying a selected object visually with a camera-equipped device |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2007000491A1 true WO2007000491A1 (en) | 2007-01-04 |
Family
ID=34778501
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/FI2006/050279 WO2007000491A1 (en) | 2005-06-29 | 2006-06-26 | Method, device, and program product for visually indicating a selected object, using a device equipped with camera |
Country Status (3)
Country | Link |
---|---|
EP (1) | EP1896897B1 (en) |
FI (1) | FI20055365A (en) |
WO (1) | WO2007000491A1 (en) |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE19912254A1 (en) * | 1999-03-18 | 2000-09-21 | Tom Faehrmann | Device for indicating the edges of a picture or scene that is being filmed or shot on camera, has a projector that produces a laser beam that adjusts to the camera image format and focal length to outline the scene being shot |
DE20117201U1 (en) * | 2001-05-12 | 2002-09-19 | Ahrens, Hans-Joachim, 38855 Wernigerode | Mobile phone with projection device |
AU2003260321A1 (en) * | 2002-09-06 | 2004-03-29 | Sony Ericsson Mobile Communications Ab | A portable electronic communications device provided with a projector |
2005
- 2005-06-29: FI FI20055365A patent/FI20055365A/en not_active Application Discontinuation

2006
- 2006-06-26: WO PCT/FI2006/050279 patent/WO2007000491A1/en not_active Application Discontinuation
- 2006-06-26: EP EP06764518A patent/EP1896897B1/en not_active Not-in-force
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE19800447A1 (en) * | 1997-01-08 | 1998-07-09 | Asahi Optical Co Ltd | Camera especially for photogrammetry measurements |
US20020028002A1 (en) * | 1997-03-03 | 2002-03-07 | Whited Keith W. | System and method for storage, retrieval and display of information relating to specimens in marine environments |
DE29920452U1 (en) * | 1999-11-20 | 2000-07-13 | Donndorf, Siegfried, 79114 Freiburg | Camera that has a laser pointer |
EP1453297A1 (en) * | 2000-11-08 | 2004-09-01 | Xerox Corporation | Method and apparatus for indicating a field of view for a document camera |
JP2002369189A (en) * | 2001-06-06 | 2002-12-20 | Ffc:Kk | Camera image display equipment and method |
JP2003005628A (en) * | 2001-06-20 | 2003-01-08 | Mitsubishi Electric Corp | Photograph image processor |
JP2004364067A (en) * | 2003-06-06 | 2004-12-24 | Nippon Telegr & Teleph Corp <Ntt> | Camera with indicator |
Non-Patent Citations (4)
Title |
---|
PATENT ABSTRACTS OF JAPAN vol. 2003, no. 04 2 April 2003 (2003-04-02) * |
PATENT ABSTRACTS OF JAPAN vol. 2003, no. 05 12 May 2003 (2003-05-12) * |
PATENT ABSTRACTS OF JAPAN vol. 2003, no. 12 5 December 2003 (2003-12-05) * |
See also references of EP1896897A4 * |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9118832B2 (en) | 2010-08-17 | 2015-08-25 | Nokia Technologies Oy | Input method |
US10122925B2 (en) | 2010-08-17 | 2018-11-06 | Nokia Technologies Oy | Method, apparatus, and computer program product for capturing image data |
Also Published As
Publication number | Publication date |
---|---|
EP1896897B1 (en) | 2012-04-04 |
EP1896897A1 (en) | 2008-03-12 |
FI20055365A (en) | 2006-12-30 |
FI20055365A0 (en) | 2005-06-29 |
EP1896897A4 (en) | 2010-07-28 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10402956B2 (en) | Image-stitching for dimensioning | |
US9448758B2 (en) | Projecting airplane location specific maintenance history using optical reference points | |
EP2745504B1 (en) | Image projector, image processing method, computer program and recording medium | |
JP3830956B1 (en) | Information output device | |
CN102985789B (en) | Target point recognition method and surveying instrument | |
KR100682960B1 (en) | Straight ruler using laser and method for measuring length and projecting line using the straight ruler | |
KR101606444B1 (en) | Method for providing target point candidates for selecting a target point | |
US20070097381A1 (en) | Hand-size structured-light three-dimensional metrology imaging system and method | |
US20150369593A1 (en) | Orthographic image capture system | |
JP2006520891A (en) | Method and apparatus for image processing in surveying instrument | |
CN111397586B (en) | Measurement system and method for verifying pre-configured target attributes using the same | |
US20170026636A1 (en) | Method for the positionally accurate projection of a mark onto an object, and projection apparatus | |
WO2021117793A1 (en) | Survey system and survey method | |
JP2005509877A (en) | Computer vision system calibration method and system | |
KR101925289B1 (en) | Method and apparatus for identifying location/angle of terminal | |
US11321864B1 (en) | User guided mode for measurement purposes | |
EP1896897B1 (en) | Method, device, and program product for visually indicating a selected object, using a device equipped with camera | |
JP2006331214A (en) | Object identification tag and object identification system using it | |
JP4359083B2 (en) | Surveying system | |
JP2014102183A (en) | Image processing apparatus and image processing system | |
JP7044331B2 (en) | Image processing systems, image processing methods and programs for efficiently inspecting structures such as bridges | |
JP2014071818A (en) | Two-dimensional code reader and two-dimensional code reading method | |
KR101940736B1 (en) | Sketch smart calculator | |
JP2020197495A (en) | Information processing apparatus, measuring device, information processing method, program, system, and method for manufacturing article | |
JP2002286425A (en) | Displacement sensor |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | |
WWE | Wipo information: entry into national phase | Ref document number: 2006764518; Country of ref document: EP |
WWE | Wipo information: entry into national phase | Ref document number: 9589/DELNP/2007; Country of ref document: IN |
NENP | Non-entry into the national phase | Ref country code: DE |
WWW | Wipo information: withdrawn in national office | Ref document number: DE |
WWP | Wipo information: published in national office | Ref document number: 2006764518; Country of ref document: EP |