CN112799525B - Optical navigation auxiliary system - Google Patents
- Publication number
- CN112799525B CN112799525B CN202110115136.7A CN202110115136A CN112799525B CN 112799525 B CN112799525 B CN 112799525B CN 202110115136 A CN202110115136 A CN 202110115136A CN 112799525 B CN112799525 B CN 112799525B
- Authority
- CN
- China
- Prior art keywords
- image
- lattice
- characteristic
- auxiliary
- dot matrix
- Prior art date
- Legal status
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0354—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
- G06F3/03543—Mice or pucks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/038—Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
Abstract
The invention provides an optical navigation auxiliary system arranged in a mouse, comprising a light-emitting unit, an image sensor and a calculation processing unit. The light-emitting unit emits illumination light, which is reflected by a plane to the image sensor; the image sensor acquires images at regular intervals. The calculation processing unit divides the acquired images into a reference frame image and comparison images, extracts the feature lattice in each, and obtains movement data for the mouse by comparing the relative displacement of the feature lattices; it also calibrates the movement data by extracting an auxiliary lattice. The system is further provided with a planar unit capable of correcting accumulated errors. Characteristic values are obtained from the images mathematically, the features of two images are compared in real time, and the reference frame image is replaced once certain conditions are met, which reduces the error of optical navigation; equipping a dedicated planar unit avoids the trouble of manual calibration.
Description
Technical Field
The invention relates to the technical field of mouse positioning, in particular to an optical navigation auxiliary system.
Background
As the most common input device, the mouse plays an important role in the use of a computer. In the course of its development, the navigation device in the mouse has changed from a mechanical rolling ball to optical navigation, so an excellent optical navigation auxiliary system can make the operation of the mouse more comfortable and accurate and reduce the redundant operation of lifting the mouse to reposition it.
Many optical navigation assistance systems have been developed; a search of the prior art turns up the systems disclosed as KR1020140123520A, KR1020120122968A and KR1020120121863A. These comprise a light-emitting unit, an optical navigation chip and a housing. The light-emitting unit provides a light beam that irradiates the surface of a displacement-generating unit. The optical navigation chip includes a sensing array but no optical lens for focusing the reflected beam; the sensing array is disposed opposite the surface of the displacement-generating unit and receives the beam reflected from that surface. The housing has a first surface and a second surface and forms an angle with the optical navigation chip so as to prevent the light beam from being reflected to the sensing array through the first surface of the housing. However, such a system accumulates errors during operation and, over time, needs to be repositioned frequently.
Disclosure of Invention
The invention aims to provide an optical navigation auxiliary system that addresses these defects.
In order to overcome the defects of the prior art, the invention adopts the following technical scheme:
an optical navigation auxiliary system is arranged in a mouse and comprises a light-emitting unit, an image sensor and a calculation processing unit, wherein the light-emitting unit emits illumination light, the illumination light is reflected to the image sensor through a plane, the image sensor acquires images at regular intervals, and the calculation processing unit calculates and processes the acquired images to acquire the moving distance and direction of the mouse;
further, the calculation processing unit takes a first frame image acquired by the image sensor as a reference frame image, takes a subsequently acquired image as a comparison image, and determines the movement of the mouse by comparing the reference frame image and the comparison image;
further, the calculation processing unit analyzes the reference frame image to obtain a feature lattice located in the middle area, the calculation processing unit analyzes the comparison image to obtain the feature lattice, and the movement of the mouse is determined by using the relative displacement of the feature lattice;
further, the calculation processing unit analyzes the reference frame image and the comparison image to respectively obtain an auxiliary lattice, and corrects errors of the direction and the moving distance by using the position relationship between the auxiliary lattice and the feature lattice;
further, when the feature lattice of the comparison image is located outside the middle area, the comparison image is used as a new reference frame image and the feature lattice in the middle area is obtained again;
furthermore, the system also comprises a laser range finder for measuring the distance between the bottom surface of the mouse and the placing plane, and the processing of the image by the computing processing unit is started or ended according to the change of the distance;
the optical navigation auxiliary system further comprises a plane unit, wherein a dot matrix formed by identification points which are uniformly distributed is arranged on the plane unit, the distance between the adjacent identification points is half of the side length of an image obtained when the image sensor is closest to the plane unit, a sub-point is arranged in the middle of each identification point, when the image sensor obtains the image of the plane unit irradiated by the light-emitting unit, the image is processed by the calculation processing unit, the identification points are identified as characteristic dot matrixes, and the sub-points are identified as auxiliary dot matrixes;
further, ID information is arranged around the identification point, the ID information has uniqueness, and the calculation processing unit corrects the accumulated movement error through the identification ID information;
further, the ID information includes the ratio of the distance from the corresponding identification point to the left edge of the planar unit to its distance to the right edge, and the ratio of its distance to the upper edge to its distance to the lower edge.
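Because adjacent identification points are spaced at half the side length of a captured image, some identification points are guaranteed to fall inside every image the sensor acquires. The following is a quick numerical check of that geometric claim; the window size and the sampling grid are illustrative values, not taken from the patent.

```python
import math

def points_in_view(x0, y0, side, pitch):
    """Count grid points (k*pitch, m*pitch) inside a square window of the
    given side length whose lower-left corner is at (x0, y0)."""
    def count_1d(a, length):
        # grid coordinates k*pitch with a <= k*pitch <= a + length
        return math.floor((a + length) / pitch) - math.ceil(a / pitch) + 1
    return count_1d(x0, side) * count_1d(y0, side)

# With pitch = side / 2 (the patent's spacing rule) a window aligned with
# the grid sees a 3 x 3 block of points, and even the worst placement
# still sees a 2 x 2 block:
side = 100.0
worst = min(points_in_view(x / 10, y / 10, side, side / 2)
            for x in range(500) for y in range(500))
```

So at least four identification points remain in every captured image, which is what allows each comparison image to be matched against the reference frame image.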
The beneficial effects obtained by the invention are as follows:
the system determines the direction and the distance of the moving distance by extracting and comparing the characteristic data in the reference frame image and the comparison image, and simultaneously replaces the reference frame image under a certain condition, so that the error is reduced in the calculation process.
Drawings
The invention will be further understood from the following description in conjunction with the accompanying drawings. The components in the figures are not necessarily to scale, emphasis instead being placed upon illustrating the principles of the embodiments. Like reference numerals designate corresponding parts throughout the different views.
Fig. 1 is a schematic flow chart of the system for outputting movement data.
FIG. 2 is a comparison diagram of a reference frame image and a comparison image.
FIG. 3 is a schematic diagram of a feature lattice and an auxiliary lattice.
Fig. 4 is a schematic view of an identification point ID area.
Fig. 5 is a schematic cross-sectional view of a planar unit.
FIG. 6 is a schematic diagram illustrating the effect of error correction.
Detailed Description
In order to make the objects and advantages of the present invention more apparent, the present invention will be further described in detail with reference to the following embodiments; it should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention. Other systems, methods, and/or features of the present embodiments will become apparent to those skilled in the art upon review of the following detailed description. It is intended that all such additional systems, methods, features and advantages be included within this description, be within the scope of the invention, and be protected by the accompanying claims. Additional features of the disclosed embodiments are described in, and will be apparent from, the detailed description that follows.
The same or similar reference numerals in the drawings of the embodiments of the present invention correspond to the same or similar components; in the description of the present invention, it should be understood that if there is an orientation or positional relationship indicated by terms such as "upper", "lower", "left", "right", etc., based on the orientation or positional relationship shown in the drawings, it is only for convenience of description and simplification of description, but it is not indicated or implied that the device or component referred to must have a specific orientation, be constructed and operated in a specific orientation, and therefore, the terms describing the positional relationship in the drawings are only used for illustrative purposes and are not to be construed as limitations of the present patent, and specific meanings of the terms may be understood by those skilled in the art according to specific situations.
The first embodiment.
An optical navigation auxiliary system is arranged in a mouse and comprises a light-emitting unit, an image sensor and a calculation processing unit, wherein the light-emitting unit emits illumination light, the illumination light is reflected to the image sensor through a plane, the image sensor acquires images at regular intervals, and the calculation processing unit calculates and processes the acquired images to acquire the moving distance and direction of the mouse;
the calculation processing unit takes a first frame image acquired by the image sensor as a reference frame image, takes a subsequently acquired image as a comparison image, and determines the movement of the mouse by comparing the reference frame image with the comparison image;
the computing processing unit analyzes the reference frame image to obtain a characteristic dot matrix located in the middle area, analyzes the contrast image to obtain the characteristic dot matrix, and determines the movement of the mouse by using the relative displacement of the characteristic dot matrix;
the calculation processing unit analyzes the reference frame image and the comparison image to respectively obtain an auxiliary dot matrix, and corrects errors of the direction and the moving distance by using the position relation between the auxiliary dot matrix and the characteristic dot matrix;
when the feature lattice of the comparison graph is located outside the middle area, the comparison graph is used as a new reference frame graph, and the feature lattice in the middle area is obtained again;
the system also comprises a laser range finder for measuring the distance between the bottom surface of the mouse and the placing plane and starting or ending the processing of the image by the computing and processing unit according to the change of the distance;
the optical navigation auxiliary system further comprises a plane unit, wherein a dot matrix formed by identification points which are uniformly distributed is arranged on the plane unit, the distance between the adjacent identification points is half of the side length of an image obtained when the image sensor is closest to the plane unit, a sub-point is arranged in the middle of each identification point, when the image sensor obtains the image of the plane unit irradiated by the light-emitting unit, the image is processed by the computing processing unit, the identification points are identified as characteristic dot matrixes, and the sub-points are identified as auxiliary dot matrixes;
ID information is arranged around the identification point, the ID information has uniqueness, and the calculation processing unit corrects the accumulated movement error through the identification ID information;
the ID information includes a ratio of a distance from its corresponding identification point to a left edge of the planar unit to a right edge of the planar unit and a ratio of a distance from the corresponding identification point to an upper edge of the planar unit to a lower edge of the planar unit.
Example two.
An optical navigation auxiliary system is arranged in a mouse and comprises a light-emitting unit, an image sensor and a calculation processing unit, wherein the light-emitting unit emits illumination light, the illumination light is reflected to the image sensor through a plane, the image sensor acquires images at regular intervals, and the calculation processing unit calculates and processes the acquired images to acquire the moving distance and direction of the mouse;
the calculation processing unit takes a first frame image acquired by the image sensor as a reference frame image, takes a subsequently acquired image as a comparison image, and determines the movement of the mouse by comparing the reference frame image with the comparison image;
the computing processing unit analyzes the reference frame image to obtain a characteristic dot matrix located in the middle area, analyzes the contrast image to obtain the characteristic dot matrix, and determines the movement of the mouse by using the relative displacement of the characteristic dot matrix;
the calculation processing unit analyzes the reference frame image and the comparison image to respectively obtain an auxiliary dot matrix, and corrects errors of the direction and the movement distance by using the position relation of the auxiliary dot matrix and the characteristic dot matrix;
when the feature lattice of the comparison graph is located outside the middle area, the comparison graph is used as a new reference frame graph, and the feature lattice in the middle area is obtained again;
the system also comprises a laser range finder for measuring the distance between the bottom surface of the mouse and the placing plane and starting or ending the processing of the image by the computing and processing unit according to the change of the distance;
the optical navigation auxiliary system further comprises a plane unit, wherein a dot matrix formed by identification points which are uniformly distributed is arranged on the plane unit, the distance between the adjacent identification points is half of the side length of an image obtained when the image sensor is closest to the plane unit, a sub-point is arranged in the middle of each identification point, when the image sensor obtains the image of the plane unit irradiated by the light-emitting unit, the image is processed by the computing processing unit, the identification points are identified as characteristic dot matrixes, and the sub-points are identified as auxiliary dot matrixes;
ID information is arranged around the identification point, the ID information has uniqueness, and the calculation processing unit corrects the accumulated movement error through the identification ID information;
the ID information comprises the ratio of the distance from the corresponding identification point to the left edge of the plane unit to the distance from the corresponding identification point to the right edge of the plane unit and the ratio of the distance from the corresponding identification point to the upper edge of the plane unit to the distance from the corresponding identification point to the lower edge of the plane unit;
based on the design, the optical navigation auxiliary system is arranged in a mouse and comprises a light-emitting unit, an image sensor and a calculation processing unit, wherein the light-emitting unit emits illumination light, the illumination light is reflected to the image sensor through a plane, the image sensor acquires images at regular intervals, and the calculation processing unit calculates and processes the acquired images to obtain the moving distance and direction of the mouse;
the calculation processing unit processes the image according to the following steps:
s1, saving the first frame image acquired by the image sensor as a reference frame image;
s2, taking the middle area of the reference frame image as a reference area;
s3, extracting gray values of all pixel points in the reference frame image for characterization, and storing a lattice with the maximum characteristic value in the reference area as a characteristic lattice;
s4, taking the lattice with the second largest characteristic value in the reference area as an auxiliary lattice for storage, calculating the distance between the characteristic lattice and the auxiliary lattice, and taking the connection direction of the characteristic lattice and the auxiliary lattice as a reference direction;
s5, saving the next frame image acquired by the image sensor as a comparison image;
s6, extracting gray values of all pixel points of the comparison graph;
s7, performing characterization processing on all pixel points of the comparison graph, and screening out a dot matrix of which the characteristic value is greater than a threshold value;
s8, calculating the position relation and the characteristic value difference of the screened dot matrixes, matching the position relation and the characteristic value difference with the characteristic dot matrixes and the auxiliary dot matrixes of the reference frame image, and selecting two dot matrixes which accord with the matching result, wherein the dot matrix with a larger characteristic value is the characteristic dot matrix, and the dot matrix with a smaller characteristic value is the auxiliary dot matrix;
s9, connecting the characteristic dot matrix and the auxiliary dot matrix of the comparison graph to obtain a connecting line, and if the connecting line has deviation from the reference direction of the reference frame graph, performing rotation adjustment by taking the center of the comparison graph as a rotation point to enable the connecting line to be consistent with the reference direction;
s10, calculating the distance between the feature lattice and the auxiliary lattice in the contrast image, and taking the ratio of the distance between the feature lattice and the auxiliary lattice in the reference frame image as a scaling adjustment value;
s11, calculating the relative displacement of the characteristic lattice in the reference frame image and the characteristic lattice in the comparison image, and then adjusting the relative displacement by the scaling adjustment value to be used as the output value of the optical navigation, wherein the relative displacement comprises the moving distance and direction;
s12, if the feature lattice of the comparison graph is located in the middle area, jumping to the step S5;
s13, if the feature lattice of the comparison graph is located outside the middle area, the comparison graph is used as a new reference frame graph to be stored, and the step S2 is skipped;
The middle area of the image in the above steps means: if the resolution of the original image is n x n, the middle area is the set of pixel points whose abscissa and ordinate each lie within a central interval of the image; the feature lattice and the auxiliary lattice are both 3 x 3 lattices;
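Steps S1 to S13 can be sketched as the following loop. This is a minimal illustration rather than the patented implementation: lattice extraction and matching (S3, S7, S8) are delegated to a caller-supplied function, the rotation adjustment of S9 is omitted, and the middle area is assumed to be the central half of the image.

```python
import math

def in_middle_area(pos, n):
    # Assumed middle area: the central half of an n x n image.
    lo, hi = n / 4, 3 * n / 4
    return lo <= pos[0] <= hi and lo <= pos[1] <= hi

def navigate(frames, n, find_lattices, output):
    """Run the S1-S13 loop over an iterable of frames.

    find_lattices(frame) -> ((fx, fy), (ax, ay)): centers of the feature
    and auxiliary lattices; output(dx, dy): emits one movement sample.
    """
    ref = None
    for frame in frames:
        lattices = find_lattices(frame)
        if ref is None:                      # S1-S4: store the reference frame
            ref = lattices
            continue
        (feat, aux), (rfeat, raux) = lattices, ref   # S5-S8: matched lattices
        # S10: the ratio of lattice distances gives a scaling adjustment
        d_cmp, d_ref = math.dist(feat, aux), math.dist(rfeat, raux)
        scale = d_ref / d_cmp if d_cmp else 1.0
        # S11: output the scaled relative displacement of the feature lattice
        output((feat[0] - rfeat[0]) * scale, (feat[1] - rfeat[1]) * scale)
        # S12/S13: once the feature lattice leaves the middle area, the
        # comparison frame becomes the new reference frame
        if not in_middle_area(feat, n):
            ref = lattices
```

For example, feeding pre-extracted lattice centers directly (`find_lattices = lambda f: f`) makes the loop emit one displacement per comparison frame and swap the reference frame as soon as the feature lattice drifts out of the central region.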
the method of the characterization processing in the steps S3, S7 is implemented according to the following steps:
s21, counting the number of all pixel points with the same gray value, and taking the gray value of the pixel point with the largest number as the background gray value;
s22, subtracting the background gray value from the gray values of all the pixel points to serve as a new gray value to be stored;
S23, calculating the difference value Z of each non-edge pixel, wherein H(i, j) is the gray value of the pixel point at coordinate (i, j);
S24, calculating the characteristic value Y of each non-edge pixel; the characteristic value of a 3 x 3 lattice is the average value of the characteristic values of the nine pixel points in the lattice.
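The characterization steps S21 to S24 can be sketched as follows. The exact formulas for Z and Y are not reproduced in the text, so the neighbor-difference form of Z and the weighting Y = Z * |gray| below are assumptions; only S21, S22 and the 3 x 3 averaging follow the text directly.

```python
from collections import Counter

def characterize(img):
    """S21-S24 on a list-of-lists of gray values (Z and Y formulas assumed)."""
    h, w = len(img), len(img[0])
    # S21: the most frequent gray value is taken as the background gray value
    background = Counter(v for row in img for v in row).most_common(1)[0][0]
    # S22: subtract the background gray value from every pixel
    g = [[v - background for v in row] for row in img]
    # S23 + S24: for each non-edge pixel, an assumed difference value Z
    # (sum of absolute differences to the four direct neighbors) and an
    # assumed characteristic value Y = Z * |gray|
    y = [[0] * w for _ in range(h)]
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            z = (abs(g[i][j] - g[i - 1][j]) + abs(g[i][j] - g[i + 1][j]) +
                 abs(g[i][j] - g[i][j - 1]) + abs(g[i][j] - g[i][j + 1]))
            y[i][j] = z * abs(g[i][j])
    return y

def lattice_value(y, i, j):
    # The characteristic value of a 3 x 3 lattice centered at (i, j) is the
    # average of the characteristic values of its nine pixel points.
    return sum(y[a][b] for a in range(i - 1, i + 2)
               for b in range(j - 1, j + 2)) / 9.0
```

On a uniform background this scheme yields zero everywhere except around isolated bright or dark points, which is what lets the brightest lattice serve as the feature lattice and the second brightest as the auxiliary lattice.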
Example three.
An optical navigation auxiliary system is arranged in a mouse and comprises a light-emitting unit, an image sensor and a calculation processing unit, wherein the light-emitting unit emits illumination light, the illumination light is reflected to the image sensor through a plane, the image sensor acquires images at regular intervals, and the calculation processing unit calculates and processes the acquired images to acquire the moving distance and direction of the mouse;
the calculation processing unit takes a first frame image acquired by the image sensor as a reference frame image, takes a subsequently acquired image as a comparison image, and determines the movement of the mouse by comparing the reference frame image with the comparison image;
the computing processing unit analyzes the reference frame image to obtain a characteristic dot matrix located in the middle area, analyzes the contrast image to obtain the characteristic dot matrix, and determines the movement of the mouse by using the relative displacement of the characteristic dot matrix;
the calculation processing unit analyzes the reference frame image and the comparison image to respectively obtain an auxiliary dot matrix, and corrects errors of the direction and the movement distance by using the position relation of the auxiliary dot matrix and the characteristic dot matrix;
when the feature lattice of the comparison graph is located outside the middle area, the comparison graph is used as a new reference frame graph, and the feature lattice in the middle area is obtained again;
the system also comprises a laser range finder for measuring the distance between the bottom surface of the mouse and the placing plane and starting or ending the processing of the image by the computing and processing unit according to the change of the distance;
the optical navigation auxiliary system further comprises a plane unit, wherein a dot matrix formed by identification points which are uniformly distributed is arranged on the plane unit, the distance between the adjacent identification points is half of the side length of an image obtained when the image sensor is closest to the plane unit, a sub-point is arranged in the middle of each identification point, when the image sensor obtains the image of the plane unit irradiated by the light-emitting unit, the image is processed by the computing processing unit, the identification points are identified as characteristic dot matrixes, and the sub-points are identified as auxiliary dot matrixes;
ID information is arranged around the identification point, the ID information has uniqueness, and the calculation processing unit corrects the accumulated movement error through the identification ID information;
the ID information comprises the ratio of the distance from the corresponding identification point to the left edge of the plane unit to the distance from the corresponding identification point to the right edge of the plane unit and the ratio of the distance from the corresponding identification point to the upper edge of the plane unit to the distance from the corresponding identification point to the lower edge of the plane unit;
based on the design, the optical navigation auxiliary system is arranged in a mouse and comprises a light-emitting unit, an image sensor and a calculation processing unit, wherein the light-emitting unit emits illumination light, the illumination light is reflected to the image sensor through a plane, the image sensor acquires images at regular intervals, and the calculation processing unit calculates and processes the acquired images to obtain the moving distance and direction of the mouse;
the calculation processing unit processes the image according to the following steps:
s1, saving the first frame image acquired by the image sensor as a reference frame image;
s2, taking the middle area of the reference frame image as a reference area;
s3, extracting gray values of all pixel points in the reference frame image for characterization, and storing a lattice with the maximum characteristic value in the reference area as a characteristic lattice;
s4, storing the lattice with the second largest characteristic value in the reference area as an auxiliary lattice, calculating the distance between the characteristic lattice and the auxiliary lattice, and taking the connection direction of the characteristic lattice and the auxiliary lattice as a reference direction;
s5, saving the next frame image acquired by the image sensor as a comparison image;
s6, extracting gray values of all pixel points of the comparison image;
s7, performing characterization processing on all pixel points of the comparison graph, and screening out a dot matrix of which the characteristic value is greater than a threshold value;
s8, calculating the position relation and the characteristic value difference of the screened dot matrixes, matching the position relation and the characteristic value difference with the characteristic dot matrixes and the auxiliary dot matrixes of the reference frame image, and selecting two dot matrixes which accord with the matching result, wherein the dot matrix with a larger characteristic value is the characteristic dot matrix, and the dot matrix with a smaller characteristic value is the auxiliary dot matrix;
s9, connecting the characteristic dot matrix and the auxiliary dot matrix of the comparison graph to obtain a connecting line, and if the connecting line has deviation from the reference direction of the reference frame graph, performing rotation adjustment by taking the center of the comparison graph as a rotation point to enable the connecting line to be consistent with the reference direction;
s10, calculating the distance between the feature lattice and the auxiliary lattice in the contrast image, and taking the ratio of the distance between the feature lattice and the auxiliary lattice in the reference frame image as a scaling adjustment value;
s11, calculating the relative displacement of the characteristic lattice in the reference frame image and the characteristic lattice in the comparison image, and then adjusting the relative displacement by the scaling adjustment value to be used as the output value of the optical navigation, wherein the relative displacement comprises the moving distance and direction;
s12, if the feature lattice of the comparison graph is located in the middle area, jumping to the step S5;
s13, if the feature lattice of the comparison graph is located outside the middle area, the comparison graph is used as a new reference frame graph to be stored, and the step S2 is skipped;
the middle area of the image in the above steps means: if the resolution of the original image is n x n, the middle area is the set of pixel points whose abscissa and ordinate each lie within a central interval of the image; the feature lattice and the auxiliary lattice are both 3 x 3 lattices;
the method of the characterization processing in the steps S3, S7 is implemented according to the following steps:
s21, counting the number of all pixel points with the same gray value, and taking the gray value of the pixel point with the largest number as the background gray value;
s22, subtracting the background gray value from the gray values of all the pixel points to serve as a new gray value to be stored;
S23, calculating the difference value Z of each non-edge pixel, wherein H(i, j) is the gray value of the pixel point at coordinate (i, j);
S24, calculating the characteristic value Y of each non-edge pixel; the characteristic value of a 3 x 3 lattice is the average value of the characteristic values of the nine pixel points in the lattice;
the optical navigation auxiliary system is also provided with a laser range finder; when the distance between the bottom surface of the mouse and the placing plane detected by the laser range finder exceeds a threshold value, the calculation processing unit ends its image-processing flow, and when the detected distance falls below the threshold, the calculation processing unit restarts processing from step S1;
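The range-finder gating just described can be sketched as a small state machine; the threshold value and the method names below are illustrative assumptions, not from the patent.

```python
class LiftGate:
    """Start or stop image processing from laser range-finder readings."""

    def __init__(self, threshold_mm=2.0):
        self.threshold_mm = threshold_mm
        self.tracking = False

    def update(self, distance_mm):
        """Return 'start', 'stop', or None for one distance sample."""
        if self.tracking and distance_mm > self.threshold_mm:
            self.tracking = False       # mouse lifted: end image processing
            return 'stop'
        if not self.tracking and distance_mm <= self.threshold_mm:
            self.tracking = True        # back on the surface: restart at S1
            return 'start'
        return None                     # no state change
```

Only the transitions matter: a 'start' event restarts the processing flow at step S1, a 'stop' event ends it, and repeated readings on the same side of the threshold do nothing.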
the optical navigation auxiliary system further comprises a planar unit on which a lattice of uniformly distributed identification points is arranged; the distance between adjacent identification points is half the side length of the image obtained when the image sensor is closest to the planar unit, and an auxiliary point is arranged in the middle of each identification point; when the image sensor obtains an image of the planar unit irradiated by the light-emitting unit and the image is characterized, the 3 x 3 lattice centered on the pixel point where an identification point is located is judged to be a feature lattice, and the 3 x 3 lattice centered on the pixel point where an auxiliary point is located is judged to be an auxiliary lattice;
during ordinary use of a mouse, errors accumulate: even if the mouse is moved back to its initial position, the corresponding cursor on the display screen drifts away from its initial position, so over time the mouse must be lifted and repositioned, which disturbs users who operate the mouse frequently.
Embodiment 4.
An optical navigation auxiliary system is arranged in a mouse and comprises a light-emitting unit, an image sensor, and a calculation processing unit, wherein the light-emitting unit emits illumination light, the illumination light is reflected by a plane to the image sensor, the image sensor acquires images at regular intervals, and the calculation processing unit processes the acquired images to obtain the moving distance and direction of the mouse;
the calculation processing unit takes the first frame image acquired by the image sensor as a reference frame image, takes subsequently acquired images as comparison images, and determines the movement of the mouse by comparing the reference frame image with the comparison images;
the calculation processing unit analyzes the reference frame image to obtain a characteristic lattice located in its middle area, analyzes the comparison image to obtain the corresponding characteristic lattice, and determines the movement of the mouse from the relative displacement of the characteristic lattices;
the calculation processing unit analyzes the reference frame image and the comparison image to obtain respective auxiliary lattices, and corrects errors in the direction and the moving distance using the positional relation between the auxiliary lattice and the characteristic lattice;
when the characteristic lattice of the comparison image lies outside the middle area, the comparison image is taken as a new reference frame image and a characteristic lattice within the middle area is obtained again;
the system further comprises a laser range finder which measures the distance between the bottom surface of the mouse and the placing plane and, according to changes in that distance, starts or ends the image processing by the calculation processing unit;
the optical navigation auxiliary system further comprises a plane unit on which a dot matrix of uniformly distributed identification points is arranged; the spacing between adjacent identification points is half the side length of the image obtained when the image sensor is closest to the plane unit, and an auxiliary point is arranged midway between the identification points; when the image sensor captures an image of the plane unit illuminated by the light-emitting unit, the image is processed by the calculation processing unit, the identification points being recognized as characteristic lattices and the auxiliary points as auxiliary lattices;
ID information is arranged around the identification points; each piece of ID information is unique, and the calculation processing unit corrects the accumulated movement error by recognizing the ID information;
the ID information comprises the ratio of the distance from the corresponding identification point to the left edge of the plane unit to its distance to the right edge of the plane unit, and the ratio of its distance to the upper edge of the plane unit to its distance to the lower edge of the plane unit;
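The edge-distance ratios give each identification point an absolute position on the plane unit: since the distances to the left and right edges sum to the plane width, the ratio x fixes the abscissa, and likewise y fixes the ordinate. A minimal sketch under that interpretation (the function and parameter names are illustrative, not from the patent):

```python
def absolute_position(x_ratio, y_ratio, width, height):
    """Recover the absolute coordinates of an identification point from
    its ID ratios, where x_ratio = d_left / d_right and
    y_ratio = d_up / d_down.  Because d_left + d_right = width,
    d_left = width * x_ratio / (1 + x_ratio); the ordinate is analogous."""
    px = width * x_ratio / (1.0 + x_ratio)
    py = height * y_ratio / (1.0 + y_ratio)
    return px, py
```

This is why the ratios let the calculation processing unit cancel accumulated error: a single recognized ID yields an absolute fix, independent of the incremental displacement history.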
based on this design, the optical navigation auxiliary system is arranged in a mouse and comprises a light-emitting unit, an image sensor, and a calculation processing unit, wherein the light-emitting unit emits illumination light, the illumination light is reflected by a plane to the image sensor, the image sensor acquires images at regular intervals, and the calculation processing unit processes the acquired images to obtain the moving distance and direction of the mouse;
the calculation processing unit processes the image according to the following steps:
S1: save the first frame image acquired by the image sensor as the reference frame image;
S2: take the middle area of the reference frame image as the reference area;
S3: extract the gray values of all pixels in the reference frame image for characterization, and save the lattice with the largest characteristic value in the reference area as the characteristic lattice;
S4: save the lattice with the second-largest characteristic value in the reference area as the auxiliary lattice, calculate the distance between the characteristic lattice and the auxiliary lattice, and take the direction of the line connecting them as the reference direction;
S5: save the next frame image acquired by the image sensor as the comparison image;
S6: extract the gray values of all pixels of the comparison image;
S7: perform characterization processing on all pixels of the comparison image, and screen out the lattices whose characteristic value exceeds a threshold value;
S8: calculate the positional relations and characteristic-value differences of the screened lattices, match them against the characteristic lattice and auxiliary lattice of the reference frame image, and select the two lattices that fit the match, the one with the larger characteristic value being the characteristic lattice and the one with the smaller value the auxiliary lattice;
S9: connect the characteristic lattice and the auxiliary lattice of the comparison image to obtain a connecting line; if this line deviates from the reference direction of the reference frame image, rotate the comparison image about its centre until the connecting line coincides with the reference direction;
S10: calculate the distance between the characteristic lattice and the auxiliary lattice in the comparison image, and take the ratio of the corresponding distance in the reference frame image to this distance as the scaling adjustment value;
S11: calculate the relative displacement between the characteristic lattice of the reference frame image and that of the comparison image, adjust it by the scaling adjustment value, and output it as the optical-navigation value; the relative displacement comprises the moving distance and direction;
S12: if the characteristic lattice of the comparison image lies within the middle area, jump to the step S5;
S13: if the characteristic lattice of the comparison image lies outside the middle area, save the comparison image as the new reference frame image and jump to the step S2;
the middle area of an image in the above steps is defined as follows: if the resolution of the original image is n×n, the middle area is the set of pixels whose abscissa and ordinate both lie within the central interval of the image; the characteristic lattice and the auxiliary lattice are both 3×3 lattices;
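The correction in steps S9 to S11 can be sketched as follows, taking lattice-centre coordinates relative to the image centre so that the patent's rotation about the image centre becomes a rotation about the origin; the function name and coordinate convention are assumptions for illustration:

```python
import math

def motion_from_lattices(ref_feat, ref_aux, cmp_feat, cmp_aux):
    """Sketch of steps S9-S11: use the feature/auxiliary lattice pair to
    cancel rotation and scale before reading off the displacement.
    Inputs are (x, y) lattice centres relative to the image centre."""
    # S9: angle between the comparison-image connecting line and the
    # reference direction of the reference frame image
    ref_angle = math.atan2(ref_aux[1] - ref_feat[1], ref_aux[0] - ref_feat[0])
    cmp_angle = math.atan2(cmp_aux[1] - cmp_feat[1], cmp_aux[0] - cmp_feat[0])
    rotation = cmp_angle - ref_angle
    # S10: scaling adjustment value = reference distance / comparison distance
    ref_dist = math.dist(ref_feat, ref_aux)
    cmp_dist = math.dist(cmp_feat, cmp_aux)
    scale = ref_dist / cmp_dist
    # rotate the comparison feature lattice back about the origin (image
    # centre) so the two connecting lines coincide
    cos_r, sin_r = math.cos(-rotation), math.sin(-rotation)
    fx = cmp_feat[0] * cos_r - cmp_feat[1] * sin_r
    fy = cmp_feat[0] * sin_r + cmp_feat[1] * cos_r
    # S11: relative displacement, corrected by the scaling adjustment value
    dx = (fx - ref_feat[0]) * scale
    dy = (fy - ref_feat[1]) * scale
    return dx, dy, rotation, scale
```

The scale factor compensates lift-induced magnification changes: if the mouse rises and the imaged pattern shrinks, the inter-lattice distance shrinks by the same factor, so multiplying the raw displacement by ref_dist / cmp_dist restores the reference-frame scale.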
the characterization processing in the steps S3 and S7 is carried out as follows:
S21: count the pixels sharing each gray value, and take the most frequent gray value as the background gray value;
S22: subtract the background gray value from the gray value of every pixel, and store the result as the new gray value;
S23: calculate the difference value Z of each non-edge pixel:
where H() is the gray value of a pixel and (i, j) are its coordinates;
S24: calculate the characteristic value Y of each non-edge pixel:
the characteristic value of a 3×3 lattice is the average of the characteristic values of the nine pixels in the lattice;
the optical navigation auxiliary system is also provided with a laser range finder; when the distance it measures between the bottom surface of the mouse and the placing plane exceeds a threshold value, the calculation processing unit ends its image-processing flow, and when the measured distance falls back below the threshold value, the calculation processing unit restarts processing from the step S1;
the optical navigation auxiliary system further comprises a plane unit on which a dot matrix of uniformly distributed identification points is arranged; the spacing between adjacent identification points is half the side length of the image obtained when the image sensor is closest to the plane unit, and an auxiliary point is arranged midway between the identification points; when the image sensor captures an image of the plane unit illuminated by the light-emitting unit and the image has been characterized, the 3×3 lattice centred on the pixel containing an identification point is judged to be a characteristic lattice, and the 3×3 lattice centred on the pixel containing an auxiliary point is judged to be an auxiliary lattice;
during ordinary use of a mouse, errors accumulate: even if the mouse is moved back to its initial position, the corresponding cursor on the display screen drifts away from its initial position, so over time the mouse must be lifted and repositioned, which disturbs users who operate the mouse frequently; to solve this problem, ID information is arranged around the identification points, the ID information of each identification point is unique, and the calculation processing unit recognizes the ID information in the images acquired by the image sensor and uses it to calibrate the accumulated error;
the ID information includes the ratio x of the distance from the corresponding identification point to the left edge of the plane unit to its distance to the right edge, and the ratio y of its distance to the upper edge to its distance to the lower edge; the ways of recording x and y include, but are not limited to, the following:
ID areas of 10×10 dot matrices are arranged above, below, to the left of, and to the right of each identification point; the pixels in the ID areas are divided into bright spots and dark spots, and the numbers of bright spots are set so that the ratio of the bright-spot count of the left ID area to that of the right ID area equals x, and the ratio of the bright-spot count of the upper ID area to that of the lower ID area equals y;
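One way to realize this bright-spot-count scheme (the patent states that the recording manner "includes, but is not limited to" this method) is to pick integer bright-spot counts whose quotient best approximates the target ratio. The helper names below are illustrative, and the sketch assumes the ratio is positive and representable within the 100-pixel areas:

```python
from fractions import Fraction

def encode_ratio(ratio, max_spots=100):
    """Choose bright-spot counts (a, b) for two 10x10 ID areas so that
    a / b approximates the target ratio; a 10x10 area holds at most
    max_spots = 100 bright spots."""
    frac = Fraction(ratio).limit_denominator(max_spots)
    a, b = frac.numerator, frac.denominator
    # scale the counts up as far as the areas allow, for read-out robustness
    k = max_spots // max(a, b)
    return a * k, b * k

def decode_ratio(a, b):
    # the reader recovers the ratio as the quotient of bright-spot counts
    return a / b
```

Counting bright spots rather than decoding a positional pattern keeps the read-out tolerant of rotation, since the count of a 10×10 area is invariant under any reordering of its pixels.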
the plane unit comprises three layers: a bottom layer, an information layer, and a smooth layer, the information layer lying between the bottom layer and the smooth layer; the identification points, auxiliary points, and ID areas described above are all arranged in the information layer; the smooth layer is flat and transparent, and the bottom surface is provided with anti-slip texture so that the plane unit does not shift easily on the desktop; the information layer consists of closely adjoining bumps, the bumps are coated in different colors so that the identification points and auxiliary points can be accurately recognized by the calculation processing unit, and the bumps corresponding to the bright spots in the ID areas carry a fluorescent substance for better differentiation.
Although the invention has been described above with reference to various embodiments, it should be understood that many changes and modifications may be made without departing from the scope of the invention. That is, the methods, systems, and devices discussed above are examples. Various configurations may omit, substitute, or add various procedures or components as appropriate. For example, in alternative configurations, the methods may be performed in an order different than that described, and/or various components may be added, omitted, and/or combined. Moreover, features described with respect to certain configurations may be combined in various other configurations, as different aspects and elements of the configurations may be combined in a similar manner. Further, elements therein may be updated as technology evolves, i.e., many of the elements are examples and do not limit the scope of the disclosure or claims.
Specific details are given in the description to provide a thorough understanding of the exemplary configurations including implementations. However, configurations may be practiced without these specific details, for example, well-known circuits, processes, algorithms, structures, and techniques have been shown without unnecessary detail in order to avoid obscuring the configurations. This description provides example configurations only, and does not limit the scope, applicability, or configuration of the claims. Rather, the foregoing description of the configurations will provide those skilled in the art with an enabling description for implementing the described techniques. Various changes may be made in the function and arrangement of elements without departing from the spirit or scope of the disclosure.
In conclusion, the foregoing detailed description is intended to be illustrative rather than limiting; these examples do not limit the scope of the invention. After reading this description, a skilled person can make various changes or modifications to the invention, and such equivalent changes and modifications also fall within the scope of the invention as defined by the claims.
Claims (3)
1. An optical navigation auxiliary system arranged in a mouse, comprising a light-emitting unit, an image sensor, and a calculation processing unit, wherein the light-emitting unit emits illumination light, the illumination light is reflected by a plane to the image sensor, the image sensor acquires images at regular intervals, and the calculation processing unit processes the acquired images to obtain the moving distance and direction of the mouse;
the calculation processing unit takes the first frame image acquired by the image sensor as a reference frame image, takes subsequently acquired images as comparison images, and determines the movement of the mouse by comparing the reference frame image with the comparison images;
the calculation processing unit analyzes the reference frame image to obtain a characteristic lattice located in its middle area, analyzes the comparison image to obtain the corresponding characteristic lattice, and determines the movement of the mouse from the relative displacement of the characteristic lattices;
the calculation processing unit analyzes the reference frame image and the comparison image to obtain respective auxiliary lattices, and corrects errors in the direction and the moving distance using the positional relation between the auxiliary lattice and the characteristic lattice;
when the characteristic lattice of the comparison image lies outside the middle area, the comparison image is taken as a new reference frame image and a characteristic lattice within the middle area is obtained again;
the system further comprises a laser range finder which measures the distance between the bottom surface of the mouse and the placing plane and, according to changes in that distance, starts or ends the image processing by the calculation processing unit;
the optical navigation auxiliary system further comprises a plane unit on which a dot matrix of uniformly distributed identification points is arranged; the spacing between adjacent identification points is half the side length of the image obtained when the image sensor is closest to the plane unit, and an auxiliary point is arranged midway between the identification points; when the image sensor captures an image of the plane unit illuminated by the light-emitting unit, the image is processed by the calculation processing unit, the identification points being recognized as characteristic lattices and the auxiliary points as auxiliary lattices;
the calculation processing unit processes the image according to the following steps:
S1: save the first frame image acquired by the image sensor as the reference frame image;
S2: take the middle area of the reference frame image as the reference area;
S3: extract the gray values of all pixels in the reference frame image for characterization, and save the lattice with the largest characteristic value in the reference area as the characteristic lattice;
S4: save the lattice with the second-largest characteristic value in the reference area as the auxiliary lattice, calculate the distance between the characteristic lattice and the auxiliary lattice, and take the direction of the line connecting them as the reference direction;
S5: save the next frame image acquired by the image sensor as the comparison image;
S6: extract the gray values of all pixels of the comparison image;
S7: perform characterization processing on all pixels of the comparison image, and screen out the lattices whose characteristic value exceeds a threshold value;
S8: calculate the positional relations and characteristic-value differences of the screened lattices, match them against the characteristic lattice and auxiliary lattice of the reference frame image, and select the two lattices that fit the match, the one with the larger characteristic value being the characteristic lattice and the one with the smaller value the auxiliary lattice;
S9: connect the characteristic lattice and the auxiliary lattice of the comparison image to obtain a connecting line; if this line deviates from the reference direction of the reference frame image, rotate the comparison image about its centre until the connecting line coincides with the reference direction;
S10: calculate the distance between the characteristic lattice and the auxiliary lattice in the comparison image, and take the ratio of the corresponding distance in the reference frame image to this distance as the scaling adjustment value;
S11: calculate the relative displacement between the characteristic lattice of the reference frame image and that of the comparison image, adjust it by the scaling adjustment value, and output it as the optical-navigation value; the relative displacement comprises the moving distance and direction;
S12: if the characteristic lattice of the comparison image lies within the middle area, jump to the step S5;
S13: if the characteristic lattice of the comparison image lies outside the middle area, save the comparison image as the new reference frame image and jump to the step S2;
the middle area of an image in the above steps is defined as follows: if the resolution of the original image is n×n, the middle area is the set of pixels whose abscissa and ordinate both lie within the central interval of the image; the characteristic lattice and the auxiliary lattice are both 3×3 lattices;
the characterization processing in the steps S3 and S7 is carried out as follows:
S21: count the pixels sharing each gray value, and take the most frequent gray value as the background gray value;
S22: subtract the background gray value from the gray value of every pixel, and store the result as the new gray value;
S23: calculate the difference value Z of each non-edge pixel:
where H() is the gray value of a pixel and (i, j) are its coordinates;
S24: calculate the characteristic value Y of each non-edge pixel:
and the characteristic value of a 3×3 lattice is the average of the characteristic values of the nine pixels in the lattice.
2. The optical navigation auxiliary system of claim 1, wherein ID information is arranged around the identification points, each piece of ID information is unique, and the calculation processing unit corrects the accumulated movement error by recognizing the ID information.
3. The optical navigation auxiliary system of claim 2, wherein the ID information includes the ratio of the distance from a corresponding identification point to the left edge of the plane unit to its distance to the right edge of the plane unit, and the ratio of its distance to the upper edge of the plane unit to its distance to the lower edge of the plane unit.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110115136.7A CN112799525B (en) | 2021-01-28 | 2021-01-28 | Optical navigation auxiliary system |
Publications (2)
Publication Number | Publication Date |
---|---|
CN112799525A CN112799525A (en) | 2021-05-14 |
CN112799525B true CN112799525B (en) | 2022-08-02 |
Family
ID=75812422
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110115136.7A Active CN112799525B (en) | 2021-01-28 | 2021-01-28 | Optical navigation auxiliary system |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112799525B (en) |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1834878A (en) * | 2005-03-18 | 2006-09-20 | 安捷伦科技有限公司 | Optical navigation system |
CN103105943A (en) * | 2013-01-29 | 2013-05-15 | 广州中国科学院先进技术研究所 | Method of accurately locating displacement of optical mouse |
CN110796259A (en) * | 2018-08-03 | 2020-02-14 | 罗技欧洲公司 | Method and system for determining peripheral device displacement |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
TWI374401B (en) * | 2007-07-09 | 2012-10-11 | Pixart Imaging Inc | Detection method for displacements with sub-pixel accuracy and apparatus using the same |
TWI396116B (en) * | 2009-12-18 | 2013-05-11 | Pixart Imaging Inc | Displacement detecting apparatus and method |
US9269148B2 (en) * | 2012-06-11 | 2016-02-23 | Pixart Imaging Inc. | Displacement detection device |
US10210412B2 (en) * | 2014-03-13 | 2019-02-19 | Pixart Imaging Inc. | Optical encoder capable of identifying absolute positions and operating method thereof |
US9817488B2 (en) * | 2015-01-21 | 2017-11-14 | Pixart Imaging (Penang) Sdn. Bhd. | Optical navigation device and related automatic magnification adjusting method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||