CN101656840B - Wide-angle sensor array module and image correcting method, operating method and application thereof - Google Patents

Wide-angle sensor array module and image correcting method, operating method and application thereof

Info

Publication number
CN101656840B
CN101656840B CN2008101457945A CN200810145794A
Authority
CN
China
Prior art keywords
image
image sensor
angle
wide
transition matrix
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN2008101457945A
Other languages
Chinese (zh)
Other versions
CN101656840A (en)
Inventor
赵子毅
陈信嘉
林志新
古人豪
蔡彰哲
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Pixart Imaging Inc
Original Assignee
Pixart Imaging Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Pixart Imaging Inc filed Critical Pixart Imaging Inc
Priority to CN2008101457945A priority Critical patent/CN101656840B/en
Publication of CN101656840A publication Critical patent/CN101656840A/en
Application granted granted Critical
Publication of CN101656840B publication Critical patent/CN101656840B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Landscapes

  • Image Processing (AREA)

Abstract

The invention discloses a wide-angle sensor array module for generating a combined image. The wide-angle sensor array module comprises a first image sensor, a second image sensor, a storage unit and a processor. The first image sensor is used for acquiring a first image. The second image sensor is used for acquiring a second image and has a relative space relationship with the first image sensor. The storage unit is used for storing at least one conversion matrix, and the conversion matrix is solved according to the relative space relationship between the first image sensor and the second image sensor. The processor is used for combining the first image and the second image to form the combined image by using the conversion matrix. The invention also provides an image correcting method, an operation method and application for the wide-angle sensor array module.

Description

Wide-angle sensor array module and image correcting method, operating method and application thereof
Field of the Invention
The present invention relates to image sensors, and more particularly to a wide-angle sensor array module having multiple sensors, and to an image correction method, an operating method and applications thereof.
Background of the Invention
In the prior art, the field of view of an image sensor can be increased by adopting a wide-angle lens or a fisheye lens; however, such lenses are usually costly. Methods have therefore been proposed in the art that use a plurality of ordinary image sensors to form a broader field of view at reduced cost. For example, U.S. Patent Publication No. 2005/0025313, "Digital imaging system for creating a wide-angle image from multiple narrow angle images", discloses a system comprising a plurality of imaging devices. Each imaging device has a predetermined viewing angle and a mechanism for capturing an image at a selected time, and the viewing angle of each imaging device is set to overlap the viewing angle of the adjacent imaging device. The digital imaging system further comprises a control module that controls the imaging devices to capture their respective images at the same time, and the images captured by the imaging devices are then combined into a wide-viewing-angle image.
How to correctly combine a plurality of individual images into a wide-viewing-angle image is therefore an important problem. The present invention provides a wide-angle sensor array module that combines a plurality of images using transition matrices, wherein the transition matrices represent the relative spatial relationship between the individual imaging devices and are stored in the wide-angle sensor array module in advance. Moreover, if the relative spatial relationship between the imaging devices is known in advance, the viewing angle of each imaging device need not overlap that of the adjacent imaging device, which further increases the combined field of view of the wide-angle sensor array module.
Summary of the Invention
One object of the present invention is to provide a wide-angle sensor array module that uses at least two ordinary image sensors to increase the field of view of the sensor array module, so that no expensive special lens is required in the wide-angle sensor array module and the module cost can be effectively reduced.
Another object of the present invention is to provide an image correction method for a wide-angle sensor array module, which calculates at least one transition matrix in advance according to the relative spatial relationship between at least two image sensors. In addition, if the spatial relationship between the image sensors is known in advance, the viewing angle of each image sensor need not overlap that of the adjacent image sensor, which further increases the overall combined viewing angle.
A further object of the present invention is to provide an image operating method for a wide-angle sensor array module, which uses the transition matrix stored in advance in the wide-angle sensor array module to correctly combine a plurality of images into a wide-viewing-angle image.
A further object of the present invention is to provide an application of the wide-angle sensor array module, in which the wide-angle sensor array module is applied to a pointing and navigation system, so that the field of view of the pointing and navigation system can be effectively increased and the system cost reduced.
To achieve the above objects, the present invention provides a wide-angle sensor array module for generating a combined image. The wide-angle sensor array module comprises a first image sensor, a second image sensor, a memory unit and a processor. The first image sensor is used to capture a first image. The second image sensor is used to capture a second image and has a relative spatial relationship with the first image sensor. The memory unit stores at least one transition matrix, wherein the transition matrix is obtained according to the relative spatial relationship between the first image sensor and the second image sensor. The processor uses the transition matrix to combine the first image and the second image into the combined image.
According to another feature of the present invention, an image correction method for a wide-angle sensor array module is provided, the wide-angle sensor array module comprising a first image sensor and a second image sensor. The image correction method comprises the following steps: providing at least three calibration points, wherein the calibration points form a reference spatial coordinate system and have reference spatial coordinates; capturing, with the first image sensor, at least three of the calibration points to form a first image, wherein the first image defines first image coordinate axes and the calibration points in the first image have first image coordinates; obtaining a first transition matrix according to the reference spatial coordinates and the first image coordinates; capturing, with the second image sensor, at least three of the calibration points to form a second image, wherein the second image defines second image coordinate axes and the calibration points in the second image have second image coordinates; obtaining a second transition matrix according to the reference spatial coordinates and the second image coordinates; and transforming the first image and/or the second image onto a virtual plane using the first transition matrix and/or the second transition matrix.
According to another feature of the present invention, another image correction method for a wide-angle sensor array module is provided, the wide-angle sensor array module comprising a first image sensor and a second image sensor. The image correction method comprises the following steps: determining the relative spatial relationship between the first image sensor and the second image sensor; obtaining at least one transition matrix according to the relative spatial relationship; and transforming an image captured by the first image sensor and/or an image captured by the second image sensor onto a virtual plane using the transition matrix.
According to another feature of the present invention, an operating method for a wide-angle sensor array module is provided, the wide-angle sensor array module comprising a first image sensor, a second image sensor and a memory unit storing at least one transition matrix, the transition matrix being obtained according to the relative spatial relationship between the first image sensor and the second image sensor. The operating method comprises the following steps: capturing a third image with the first image sensor; capturing a fourth image with the second image sensor; transforming the third image and/or the fourth image onto a virtual plane using the transition matrix; determining at least one characteristic parameter of an object image captured on the virtual plane; and transmitting the characteristic parameter to an image display device or a storage device.
According to another feature of the present invention, another operating method for a wide-angle sensor array module is provided, the wide-angle sensor array module comprising a first image sensor, a second image sensor and a memory unit storing at least one transition matrix, the transition matrix being obtained according to the relative spatial relationship between the first image sensor and the second image sensor. The operating method comprises the following steps: capturing a third image with the first image sensor; obtaining a first characteristic parameter of an object image in the third image; capturing a fourth image with the second image sensor; obtaining a second characteristic parameter of an object image in the fourth image; transforming the first characteristic parameter and the second characteristic parameter onto a virtual plane using the transition matrix; and transmitting the characteristic parameters to an image display device or a storage device.
According to another feature of the present invention, a pointing and navigation system is provided, comprising a wide-angle sensor array module, at least one light source and an image display device. The wide-angle sensor array module is used to generate a combined image and has a combined viewing angle, and comprises: a first image sensor for capturing a first image; a second image sensor for capturing a second image, the second image sensor having a relative spatial relationship with the first image sensor; and an image processing unit, which stores a transition matrix and uses the transition matrix to combine the first image and the second image into the combined image, wherein the transition matrix is obtained according to the relative spatial relationship between the first image sensor and the second image sensor. The light source is located within the combined viewing angle. The image display device is coupled to the wide-angle sensor array module and is used to merge the image of the light source into the combined image and to display the combined image and the light source image.
The wide-angle sensor array module of the present invention, together with its operating method and applications, uses at least one transition matrix obtained in advance in a calibration mode to combine a plurality of images, which effectively increases the overall viewing angle of the wide-angle sensor array module and thus its practicality. In addition, since no special lens is required in the wide-angle sensor array module, its cost can be effectively reduced.
Brief Description of the Drawings
Fig. 1a is a schematic diagram of the wide-angle sensor array module of an embodiment of the invention;
Fig. 1b is another schematic diagram of the wide-angle sensor array module of an embodiment of the invention;
Fig. 2 is a flow chart of the operating method of the wide-angle sensor array module of an embodiment of the invention;
Fig. 3 is a flow chart of the calibration mode of the wide-angle sensor array module of an embodiment of the invention;
Fig. 4a is an operational schematic diagram of the calibration mode of the wide-angle sensor array module of an embodiment of the invention;
Fig. 4b is a top view of the operational schematic diagram of Fig. 4a;
Fig. 5a is a schematic diagram of the calibration point images referenced to the reference spatial coordinate axes of the calibration points;
Fig. 5b is a schematic diagram of the captured calibration point images referenced to the first image coordinate axes of the first image sensor;
Fig. 5c is a schematic diagram of the captured calibration point images referenced to the second image coordinate axes of the second image sensor;
Fig. 5d is a schematic diagram of the coordinate transformation of the calibration point images between different coordinate axes;
Fig. 6a is another operational schematic diagram of the calibration mode of the wide-angle sensor array module of an embodiment of the invention, in which 6 calibration points are used for calibration;
Fig. 6b is a schematic diagram of the coordinate transformation of the calibration point images of Fig. 6a between different coordinate axes;
Fig. 7a is another operational schematic diagram of the calibration mode of the wide-angle sensor array module of an embodiment of the invention, in which 8 calibration points are used for calibration;
Fig. 7b is a schematic diagram of the coordinate transformation of the calibration point images of Fig. 7a between different coordinate axes;
Fig. 8a is another operational schematic diagram of the calibration mode of the wide-angle sensor array module of an embodiment of the invention;
Fig. 8b is a top view of the operational schematic diagram of Fig. 8a;
Fig. 9a is a schematic diagram of the calibration point images of Fig. 8a referenced to the reference spatial coordinate axes of the calibration points;
Fig. 9b is a schematic diagram of the calibration point images of Fig. 8a captured with the first image coordinate axes of the first image sensor as reference;
Fig. 9c is a schematic diagram of the calibration point images of Fig. 8a captured with the second image coordinate axes of the second image sensor as reference;
Fig. 10a is a flow chart of the operating mode of the wide-angle sensor array module of an embodiment of the invention;
Fig. 10b is another flow chart of the operating mode of the wide-angle sensor array module of an embodiment of the invention;
Fig. 11a is a schematic diagram of an application of the operating mode of the wide-angle sensor array module of an embodiment of the invention;
Fig. 11b shows the image captured by the first image sensor of the wide-angle sensor array module of Fig. 11a;
Fig. 11c shows the image captured by the second image sensor of the wide-angle sensor array module of Fig. 11a;
Fig. 11d shows the combined image captured by the wide-angle sensor array module of Fig. 11a.
Description of Main Element Symbols
1, 1′ wide-angle sensor array module; 2 image processing unit
21 processor; 22 memory unit
23 transmission interface unit; VA, VA′ combined viewing angle
S1 first image sensor; S2 second image sensor
VA1 first viewing angle; VA2 second viewing angle
P1~P4 calibration points; P1′~P4′ calibration point images
P1″~P4″ calibration point images; n central normal of the combined viewing angle
n1 central normal of the first image sensor
n2 central normal of the second image sensor
θ1 = −Φ lateral angle difference between the central normal of the combined viewing angle and n1
θ2 = Φ lateral angle difference between the central normal of the combined viewing angle and n2
θ vertical angle difference between the z axis of the calibration point spatial coordinates and the image coordinate z axis
31~32 steps; 310~313 steps
3211~3215 steps; 3221~3226 steps
T3, T4 light sources; I3 third image
I4 fourth image; CI combined image
H1, H2 transition matrices
Detailed Description of the Embodiments
In order to make the above and other objects, features and advantages of the present invention more apparent, embodiments of the invention are described in detail below with reference to the accompanying drawings. In the description of this specification, like elements are denoted by the same reference numerals.
Please refer to Figs. 1a and 1b, which show schematic diagrams of wide-angle sensor array modules 1 and 1′ according to embodiments of the invention. The wide-angle sensor array modules 1 and 1′ each comprise a first image sensor S1, a second image sensor S2 and an image processing unit 2, which includes a processor 21, a memory unit 22 and a transmission interface unit 23. In addition, the wide-angle sensor array modules 1 and 1′ may further comprise a plurality of lenses or lens sets (not shown) disposed in front of the first image sensor S1 and the second image sensor S2 to improve the light-receiving efficiency of the image sensors, and the image processing unit 2 may further comprise a power supply unit (not shown) for supplying the power required for the operation of the wide-angle sensor array modules 1 and 1′.
The first and second image sensors S1 and S2 may be, for example, CCD image sensors or CMOS image sensors, but are not limited thereto. The first image sensor S1 is used to capture images and has a first viewing angle VA1. The second image sensor S2 is used to capture images and has a second viewing angle VA2. The wide-angle sensor array modules 1 and 1′ are used to capture combined images and have combined viewing angles VA and VA′ respectively, where VA and VA′ are formed by combining the first viewing angle VA1 and the second viewing angle VA2, so that VA (VA′) > VA1 and VA (VA′) > VA2. In addition, although this embodiment is described with only two image sensors, the wide-angle sensor array modules 1 and 1′ may have more than two image sensors depending on the application.
As shown in Fig. 1a, in the wide-angle sensor array module 1 the first image sensor S1 and the second image sensor S2 are arranged in parallel; that is, there is no angle difference between the central normal n1 of the first viewing angle VA1 of the first image sensor S1, the central normal n2 of the second viewing angle VA2 of the second image sensor S2 and the central normal of the combined viewing angle VA. The first image sensor S1 and the second image sensor S2 are coupled to the image processing unit 2. The processor 21 of the image processing unit 2 is used to process the images captured by the first image sensor S1 and the second image sensor S2, for example image correction, image combination, feature extraction and distortion compensation. The memory unit 22 is used to store the images captured by the first image sensor S1 and the second image sensor S2 and/or the images processed by the processor 21, and to store a plurality of parameters used by the processor 21 to process the captured images, for example the spatial distance between the first image sensor S1 and the second image sensor S2 and the angle θ between the central normals n1, n2 and n (θ = 0° in Fig. 1a). The transmission interface unit 23 is used to transmit the images captured by the wide-angle sensor array modules 1 and 1′ to, for example, an image display device (not shown) for display or a storage device for storage. The present invention uses the image processing unit 2 to process and combine the images captured by the first image sensor S1 and the second image sensor S2, so that the wide-angle sensor array module 1 has a broader combined viewing angle VA.
As shown in Fig. 1b, the wide-angle sensor array module 1′ is an alternative embodiment of the wide-angle sensor array module 1, in which the first image sensor S1 and the second image sensor S2 have an angle difference θ = θ1 + θ2 between them; that is, there is a first angle difference θ1 between the central normal n1 of the first viewing angle VA1 and the central normal of the combined viewing angle, and a second angle difference θ2 between the central normal n2 of the second viewing angle VA2 and the central normal of the combined viewing angle, where the first angle difference θ1 may or may not equal the second angle difference θ2. In this embodiment, the memory unit 22 stores the various parameters of the relative spatial relationship between the first image sensor S1 and the second image sensor S2, for example the spatial distance, the first angle difference θ1 and the second angle difference θ2, and the transition matrices between the images are obtained from these parameters; the processor 21 then uses the transition matrices to form a virtual plane. In this alternative embodiment, the image processing unit 2 likewise processes and combines the images captured by the first image sensor S1 and the second image sensor S2, so that the wide-angle sensor array module 1′ has a broader combined viewing angle VA′, where VA′ > VA. In addition, the wide-angle sensor array modules 1 and 1′ of the present invention may be applied to a pointing and navigation system, for example the pointing and positioning of a light-gun sight in a light-gun game or of a pointer, so as to relax the usage-angle limitations of the pointing and navigation system.
Please refer to Fig. 2, which shows a flow chart of the operating method of the wide-angle sensor array modules 1 and 1′ of an embodiment of the invention, comprising a calibration mode 31 and an operating mode 32. The calibration mode 31 is performed before the wide-angle sensor array modules 1 and 1′ leave the factory or before they are used for the first time; it is used to obtain the various parameters of the relative spatial relationship between the first image sensor S1 and the second image sensor S2, and from these to obtain the transition matrices between the two images so as to form a virtual plane. The operating mode 32 comprises steps such as combining the images captured by the first image sensor S1 and the second image sensor S2 onto the virtual plane and extracting object features from the captured images. The detailed execution of the calibration mode 31 and the operating mode 32 is described in the following paragraphs.
Please refer to Fig. 3, which shows a flow chart of the calibration mode 31 of an embodiment of the invention, comprising the following steps: providing at least three calibration points within the viewing angles of the first image sensor and the second image sensor, wherein the calibration points define reference spatial coordinate axes (step 310); capturing a first image with the first image sensor, the first image having first image coordinate axes (step 3111); obtaining the image coordinates of the calibration points in the first image (step 3112); obtaining a first transition matrix according to the relative spatial relationship between the reference spatial coordinate axes and the first image coordinate axes (step 3113); capturing a second image with the second image sensor, the second image having second image coordinate axes (step 3121); obtaining the image coordinates of the calibration points in the second image (step 3122); obtaining a second transition matrix according to the relative spatial relationship between the reference spatial coordinate axes and the second image coordinate axes (step 3123); and transforming the first image onto a virtual plane using the first transition matrix and/or transforming the second image onto the virtual plane using the second transition matrix (step 313).
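For concreteness, the following Python sketch (not part of the patent; a minimal illustration assuming the transition matrices are estimated by a least-squares fit between the known reference coordinates of the calibration points and their measured image coordinates) shows how steps 3113 and 3123 could look in code.

```python
import numpy as np

def estimate_transition_matrix(ref_points, img_points):
    """Least-squares fit of a 3x3 matrix H with H @ ref ~= img (N >= 3 points)."""
    ref = np.asarray(ref_points, dtype=float)
    img = np.asarray(img_points, dtype=float)
    H_t, *_ = np.linalg.lstsq(ref, img, rcond=None)  # solves ref @ H.T ~= img
    return H_t.T

def calibration_mode(ref_points, first_image_points, second_image_points):
    """Given the reference coordinates of the calibration points and their
    measured coordinates in each sensor's image, return H1 and H2."""
    H1 = estimate_transition_matrix(ref_points, first_image_points)   # step 3113
    H2 = estimate_transition_matrix(ref_points, second_image_points)  # step 3123
    return H1, H2  # stored in the memory unit 22 for use in the operating mode

# Example with three reference points (illustrative values only).
ref = np.array([[1.0, 1.0, 5.0], [-1.0, 1.0, 5.0], [1.0, -1.0, 5.0]])
phi = np.deg2rad(15.0)
R = np.array([[np.cos(-phi), 0, -np.sin(-phi)],
              [0, 1, 0],
              [np.sin(-phi), 0, np.cos(-phi)]])   # the rotation form used later in equation (1)
img1 = ref @ R.T                                  # synthetic first-image coordinates
H1, H2 = calibration_mode(ref, img1, ref)
print(np.allclose(H1, R))                         # True
```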
Please refer to Figs. 4a and 4b; Fig. 4a shows one implementation of the calibration mode 31 of an embodiment of the invention, and Fig. 4b is a top view of Fig. 4a. In this implementation, the wide-angle sensor array module 1′ captures images of 4 calibration points P1~P4 in space as calibration parameters, wherein the calibration points P1~P4 may be, for example, infrared light-emitting diodes or infrared laser diodes, and the calibration points P1~P4 are located in the overlapping region of the first viewing angle VA1 and the second viewing angle VA2 (step 310). The calibration points P1~P4 define reference spatial coordinate axes {x, y, z}; the wide-angle sensor array module 1′ captures images based on spatial coordinate axes {X, Y, Z}, and the first and second image sensors each have their own image coordinate axes, for example the first image sensor has first image coordinate axes {X1, Y, Z1} and the second image sensor has second image coordinate axes {X2, Y, Z2}. In this implementation, there is a lateral angle difference but no vertical angle difference between the first image sensor S1 and the second image sensor S2; for example, the central normal n1 (X1, 0, Z1) of the first image sensor S1 and the z axis of the reference spatial coordinate axes of the calibration points P1~P4 have a first angle difference θ1 = −Φ, and the central normal n2 (X2, 0, Z2) of the second image sensor S2 and the z axis of the reference spatial coordinate axes of the calibration points P1~P4 have a second angle difference θ2 = Φ (Fig. 4b).
Fig. 5a shows the calibration point images referenced to the reference spatial coordinate axes of the calibration points P1~P4. Fig. 5b shows the calibration point images P1′~P4′ (the first image) captured by the first image sensor S1, referenced to the first image coordinate axes (step 3111), from which the image coordinates of each calibration point in the first image can be calculated (step 3112). The transition matrix H1 can then be obtained from the coordinate transformation between the reference spatial coordinate axes and the first image coordinate axes, as in equation (1) (step 3113):
$$H_1 = \begin{bmatrix} \cos(-\Phi) & 0 & -\sin(-\Phi) \\ 0 & 1 & 0 \\ \sin(-\Phi) & 0 & \cos(-\Phi) \end{bmatrix} \qquad (1)$$
That is, if the image coordinates of the calibration points P1~P4 referenced to the reference spatial coordinate axes are known to be (x, y, z), the transition matrix H1 can be used to convert the coordinates (x, y, z) into image coordinates referenced to the first image coordinate axes of the first image sensor S1, as in equation (2):
$$H_1 \times \begin{bmatrix} x \\ y \\ z \end{bmatrix} = \begin{bmatrix} x\cos(-\Phi) - z\sin(-\Phi) \\ y \\ x\sin(-\Phi) + z\cos(-\Phi) \end{bmatrix} \qquad (2)$$
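As a quick numerical check of equations (1) and (2), the sketch below (illustrative only; the angle value is an arbitrary assumption) builds H1 and applies it to a reference-frame point.

```python
import numpy as np

def rotation_about_y(angle_rad):
    """Rotation of the form used in equation (1)."""
    c, s = np.cos(angle_rad), np.sin(angle_rad)
    return np.array([[c, 0.0, -s],
                     [0.0, 1.0, 0.0],
                     [s, 0.0, c]])

phi = np.deg2rad(15.0)             # assumed half-angle between the sensors
H1 = rotation_about_y(-phi)        # equation (1)
p_ref = np.array([1.0, 1.0, 5.0])  # a calibration point in reference coordinates
p_first = H1 @ p_ref               # equation (2): coordinates in the first image frame
print(p_first)
```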
Similarly, Fig. 5c shows the calibration point images P1″~P4″ (the second image) captured by the second image sensor S2, referenced to the second image coordinate axes (step 3121), from which the image coordinates of each calibration point in the second image can be calculated (step 3122). The transition matrix H2 can likewise be obtained from the coordinate transformation between the reference spatial coordinate axes and the second image coordinate axes, as in equation (3) (step 3123):
$$H_2 = \begin{bmatrix} \cos(\Phi) & 0 & -\sin(\Phi) \\ 0 & 1 & 0 \\ \sin(\Phi) & 0 & \cos(\Phi) \end{bmatrix} \qquad (3)$$
That is, if the image coordinates of the calibration points P1~P4 referenced to the reference spatial coordinate axes are known to be (x, y, z), the transition matrix H2 can be used to convert the coordinates (x, y, z) into image coordinates referenced to the second image coordinate axes of the second image sensor S2, as in equation (4):
$$H_2 \times \begin{bmatrix} x \\ y \\ z \end{bmatrix} = \begin{bmatrix} x\cos(\Phi) - z\sin(\Phi) \\ y \\ x\sin(\Phi) + z\cos(\Phi) \end{bmatrix} \qquad (4)$$
The transition matrices H1 and H2 obtained above represent the relative spatial relationship between the first image sensor S1 and the second image sensor S2. Finally, the first transition matrix H1 is used to transform the first image onto a virtual plane and/or the second transition matrix H2 is used to transform the second image onto the virtual plane (step 313), wherein the virtual plane may be referenced to the reference spatial coordinate axes of the calibration points P1~P4 (i.e. the first image and the second image are each transformed onto the reference spatial coordinate axes by their transition matrices), to the first image coordinate axes of the first image sensor S1 (i.e. the second image is transformed onto the first image coordinate axes by the transition matrices), or to the second image coordinate axes of the second image sensor S2 (i.e. the first image is transformed onto the second image coordinate axes by the transition matrices).
In addition, when the relative spatial relationship between the image sensors is unknown, the first transition matrix H1 can be obtained according to equation (2) from the coordinates of the calibration point images P1′~P4′ captured by the first image sensor S1 and the predefined image coordinates of the calibration points P1~P4, for example (x, y, z). Similarly, the second transition matrix H2 can be obtained according to equation (4) from the coordinates of the calibration point images P1″~P4″ captured by the second image sensor S2 and the predefined image coordinates of the calibration points P1~P4.
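One way this estimation step could be realized, assuming the transition matrix is constrained to the pure-rotation form of equations (1) and (3), is an orthogonal Procrustes fit over the point correspondences; the sketch below is ours, not the patent's.

```python
import numpy as np

def fit_transition_matrix(ref_points, img_points):
    """Orthogonal Procrustes fit: the rotation R minimizing ||R @ ref - img||."""
    ref = np.asarray(ref_points, dtype=float).T   # 3 x N
    img = np.asarray(img_points, dtype=float).T   # 3 x N
    U, _, Vt = np.linalg.svd(img @ ref.T)
    if np.linalg.det(U @ Vt) < 0:                 # guard against a reflection
        U[:, -1] *= -1
    return U @ Vt

# Recover H1 from synthetic measurements generated with equation (1).
phi = np.deg2rad(15.0)
H1_true = np.array([[np.cos(-phi), 0, -np.sin(-phi)],
                    [0, 1, 0],
                    [np.sin(-phi), 0, np.cos(-phi)]])
ref_pts = np.array([[1, 1, 5], [-1, 1, 5], [-1, -1, 5], [1, -1, 5]], dtype=float)
img_pts = ref_pts @ H1_true.T        # synthetic "measured" first-image coordinates
print(np.allclose(fit_transition_matrix(ref_pts, img_pts), H1_true))  # True
```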
Please refer to Fig. 5d, which shows an embodiment in which the first image and the second image are transformed onto the virtual plane by the transition matrices. For example, referenced to the reference spatial coordinate axes of the calibration points P1~P4, the calibration point coordinates are P1 = (1, 1), P2 = (−1, 1), P3 = (−1, −1) and P4 = (1, −1). If the relative spatial relationship between the image sensors is known (i.e. the transition matrices are known), these coordinates can be converted through the transition matrix H1 into image coordinates referenced to the first image coordinate axes (lower-left diagram of Fig. 5d), or through the transition matrix H2 into image coordinates referenced to the second image coordinate axes (lower-right diagram of Fig. 5d). In addition, if the first image coordinate axes are taken as the reference, the calibration point images P1″~P4″ can be converted through two transition matrices onto the virtual plane referenced to the first image coordinate axes, corresponding respectively to the calibration point images P1′~P4′; similarly, if the second image coordinate axes are taken as the reference, the calibration point images P1′~P4′ can be converted through two transition matrices onto the virtual plane referenced to the second image coordinate axes, corresponding respectively to the calibration point images P1″~P4″.
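A minimal sketch of this two-step conversion, assuming the rotation matrices of equations (1) and (3): a point expressed in the second image frame is first brought back to the reference frame with the inverse of H2 (its transpose, since H2 is a rotation) and is then mapped into the first image frame with H1. The numeric values are illustrative only.

```python
import numpy as np

phi = np.deg2rad(15.0)   # assumed half-angle between the two sensors

def rot_y(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0, -s], [0, 1, 0], [s, 0, c]])

H1, H2 = rot_y(-phi), rot_y(phi)        # equations (1) and (3)

p_second = np.array([0.2, -0.1, 1.0])   # a point in the second image frame
p_ref = H2.T @ p_second                 # back to the reference frame (H2 is a rotation)
p_first = H1 @ p_ref                    # onto the virtual plane referenced to the first frame
print(p_first)
```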
It should be understood that the number of calibration points is not limited to 4. Since at least 3 calibration points are needed to form the reference spatial coordinate axes, in the present invention at least 3 calibration points must be provided in each of the fields of view of the first viewing angle VA1 and the second viewing angle VA2. Figs. 6a and 6b show an implementation of the calibration mode 31 of the present invention using 6 calibration points, and Figs. 7a and 7b show an implementation using 8 calibration points; since the calculation of the transition matrices in these cases is similar to the implementation using 4 calibration points, it is not repeated here.
Please refer to Figs. 8a and 8b; Fig. 8a shows an alternative implementation of the calibration mode 31 of an embodiment of the invention, and Fig. 8b is a top view of Fig. 8a. In this alternative implementation, the wide-angle sensor array module 1′ likewise captures images of 4 calibration points P1~P4 in space as calibration parameters, wherein the calibration points P1~P4 are located in the overlapping region of the first viewing angle VA1 and the second viewing angle VA2 (step 310). The calibration points P1~P4 define reference spatial coordinate axes {x, y, z}; the wide-angle sensor array module 1′ captures images based on spatial coordinate axes {X, Y, Z}, and the first and second image sensors each have their own image coordinate axes, for example the first image sensor has first image coordinate axes {X1, Y, Z1} and the second image sensor has second image coordinate axes {X2, Y, Z2}. In this alternative implementation, there are both a lateral angle difference and a vertical angle difference between the first image sensor S1 and the second image sensor S2; for example, the central normal n1 (X1, 0, Z1) of the first image sensor S1 and the z axis of the reference spatial coordinate axes of the calibration points P1~P4 have a lateral first angle difference θ1 = −Φ and a vertical angle difference θ, and the central normal n2 (X2, 0, Z2) of the second image sensor S2 and the z axis of the reference spatial coordinate axes of the calibration points P1~P4 have a lateral second angle difference θ2 = Φ and a vertical angle difference θ, wherein the vertical angle difference between the central normal n1 of the first image sensor S1 and the z axis of the reference spatial coordinate axes of the calibration points P1~P4 may also differ from the vertical angle difference between the central normal n2 of the second image sensor S2 and the z axis of the reference spatial coordinate axes.
Please refer to Figs. 9a to 9c. Fig. 9a shows the calibration point images referenced to the reference spatial coordinate axes of the calibration points P1~P4. Fig. 9b shows the calibration point images P1′~P4′ captured by the first image sensor S1, referenced to the first image coordinate axes (step 3111), from which the image coordinates of each calibration point can be obtained (step 3112). The x-axis transition matrix H1x and the y-axis transition matrix H1y can then be obtained from the coordinate transformation between the reference spatial coordinate axes and the first image coordinate axes, as in equations (5) and (6) (step 3113):
$$H_{1x} = \begin{bmatrix} \cos(-\Phi) & 0 & -\sin(-\Phi) \\ 0 & 1 & 0 \\ \sin(-\Phi) & 0 & \cos(-\Phi) \end{bmatrix} \qquad (5)$$
$$H_{1y} = \begin{bmatrix} 1 & 0 & 0 \\ 0 & \cos(\theta) & -\sin(\theta) \\ 0 & \sin(\theta) & \cos(\theta) \end{bmatrix} \qquad (6)$$
The transition matrix H1 between the reference spatial coordinate axes and the first image coordinate axes can then be obtained as in equation (7):
$$H_1 = H_{1x} \times H_{1y} = \begin{bmatrix} \cos(-\Phi) & -\sin(-\Phi)\sin(\theta) & -\sin(-\Phi)\cos(\theta) \\ 0 & \cos(\theta) & -\sin(\theta) \\ \sin(-\Phi) & \cos(-\Phi)\sin(\theta) & \cos(-\Phi)\cos(\theta) \end{bmatrix} \qquad (7)$$
That is, if the image coordinates of the calibration points P1~P4 referenced to the reference spatial coordinate axes are known to be (x, y, z), the transition matrix H1 can be used to convert the coordinates (x, y, z) into image coordinates referenced to the first image coordinate axes of the first image sensor S1, as in equation (8):
$$H_1 \times \begin{bmatrix} x \\ y \\ z \end{bmatrix} = \begin{bmatrix} x\cos(-\Phi) - (y\sin(\theta) + z\cos(\theta))\sin(-\Phi) \\ y\cos(\theta) - z\sin(\theta) \\ x\sin(-\Phi) + (y\sin(\theta) + z\cos(\theta))\cos(-\Phi) \end{bmatrix} \qquad (8)$$
Similarly, Fig. 9c shows the calibration point images P1″~P4″ captured by the second image sensor S2, referenced to the second image coordinate axes (step 3121), from which the coordinates of each calibration point can likewise be obtained (step 3122). The x-axis transition matrix H2x and the y-axis transition matrix H2y can then be obtained from the coordinate transformation between the reference spatial coordinate axes and the second image coordinate axes, as in equations (9) and (10) (step 3123):
$$H_{2x} = \begin{bmatrix} \cos(\Phi) & 0 & -\sin(\Phi) \\ 0 & 1 & 0 \\ \sin(\Phi) & 0 & \cos(\Phi) \end{bmatrix} \qquad (9)$$
$$H_{2y} = \begin{bmatrix} 1 & 0 & 0 \\ 0 & \cos(\theta) & -\sin(\theta) \\ 0 & \sin(\theta) & \cos(\theta) \end{bmatrix} \qquad (10)$$
The transition matrix H2 between the reference spatial coordinate axes and the second image coordinate axes can then be obtained as in equation (11):
$$H_2 = H_{2x} \times H_{2y} = \begin{bmatrix} \cos(\Phi) & -\sin(\Phi)\sin(\theta) & -\sin(\Phi)\cos(\theta) \\ 0 & \cos(\theta) & -\sin(\theta) \\ \sin(\Phi) & \cos(\Phi)\sin(\theta) & \cos(\Phi)\cos(\theta) \end{bmatrix} \qquad (11)$$
That is, if the image coordinates of the calibration points P1~P4 referenced to the reference spatial coordinate axes are known to be (x, y, z), the transition matrix H2 can be used to convert the coordinates (x, y, z) into image coordinates referenced to the second image coordinate axes of the second image sensor S2, as in equation (12):
$$H_2 \times \begin{bmatrix} x \\ y \\ z \end{bmatrix} = \begin{bmatrix} x\cos(\Phi) - (y\sin(\theta) + z\cos(\theta))\sin(\Phi) \\ y\cos(\theta) - z\sin(\theta) \\ x\sin(\Phi) + (y\sin(\theta) + z\cos(\theta))\cos(\Phi) \end{bmatrix} \qquad (12)$$
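As a sanity check of the compositions in equations (7) and (11), the following sketch (with arbitrary illustrative angles) builds the lateral and vertical rotations separately, multiplies them, and compares the product against the closed form of equation (7).

```python
import numpy as np

def rot_y(a):   # lateral rotation, equations (5)/(9)
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0, -s], [0, 1, 0], [s, 0, c]])

def rot_x(a):   # vertical rotation, equations (6)/(10)
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

phi, theta = np.deg2rad(15.0), np.deg2rad(5.0)   # assumed angle differences
H1 = rot_y(-phi) @ rot_x(theta)                  # H1 = H1x x H1y, equation (7)

cF, sF, cT, sT = np.cos(-phi), np.sin(-phi), np.cos(theta), np.sin(theta)
H1_closed = np.array([[cF, -sF * sT, -sF * cT],
                      [0.0, cT, -sT],
                      [sF, cF * sT, cF * cT]])
print(np.allclose(H1, H1_closed))   # True: the product matches the closed form
```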
In addition, when the relative spatial relationship between the image sensors is unknown, the first transition matrix H1 can be obtained according to equation (8) from the coordinates of the calibration point images P1′~P4′ captured by the first image sensor S1 and the predefined image coordinates of the calibration points P1~P4, for example (x, y, z). Similarly, the second transition matrix H2 can be obtained according to equation (12) from the coordinates of the calibration point images P1″~P4″ captured by the second image sensor S2 and the predefined image coordinates of the calibration points P1~P4.
The transition matrices (H1, H2) and the virtual plane obtained after performing the calibration mode 31 are stored in advance in the memory unit 22 of the image processing unit 2 for use by the processor 21 in the operating mode 32.
In another alternative embodiment, the first image sensor S1 and the second image sensor S2 may be mounted on a prefabricated frame (not shown), so that when the first image sensor S1 and the second image sensor S2 are mounted on the frame, the relative spatial relationship between the image sensors is known in advance; for example, the lateral angle difference and/or the vertical angle difference between the central normal of the first image sensor S1 and the central normal of the second image sensor S2 is known beforehand. Therefore, in the calibration mode 31 of this alternative embodiment, it is not necessary to provide at least three calibration points in each of the fields of view of the first image sensor S1 and the second image sensor S2; the transition matrix can be calculated directly from the relative spatial relationship between the image sensors, the virtual plane is obtained using the transition matrix, and finally the transition matrix and the virtual plane are stored in the memory unit 22. In this case, the virtual plane may be referenced to the first image coordinate axes of the first image sensor S1 (i.e. the image captured by the second image sensor S2 can be transformed onto the first image coordinate axes by the transition matrix) or to the second image coordinate axes of the second image sensor S2 (i.e. the image captured by the first image sensor S1 can be transformed onto the second image coordinate axes by the transition matrix).
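For such a frame-mounted module the transition matrices can simply be written down from the known mounting angles instead of being estimated; the sketch below uses illustrative angle values and a helper name of our own choosing.

```python
import numpy as np

def transition_matrix(lateral_angle, vertical_angle):
    """Compose the transition matrix of equation (7)/(11) from the known
    lateral and vertical angle differences of a frame-mounted sensor."""
    cF, sF = np.cos(lateral_angle), np.sin(lateral_angle)
    cT, sT = np.cos(vertical_angle), np.sin(vertical_angle)
    H_lat = np.array([[cF, 0, -sF], [0, 1, 0], [sF, 0, cF]])
    H_ver = np.array([[1, 0, 0], [0, cT, -sT], [0, sT, cT]])
    return H_lat @ H_ver

# Known mounting geometry of a hypothetical frame (illustrative values).
H1 = transition_matrix(np.deg2rad(-15.0), np.deg2rad(5.0))
H2 = transition_matrix(np.deg2rad(+15.0), np.deg2rad(5.0))
# Maps a point from the second image frame onto a virtual plane referenced
# to the first image coordinate axes.
H2_to_first = H1 @ H2.T
```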
Referring again to Fig. 2, as mentioned above, the calibration mode 31 is an initial calibration procedure performed before the wide-angle sensor array modules 1 and 1′ leave the factory or before they are used; it therefore does not need to be repeated during the actual operating phase (operating mode 32). The operating mode 32 is described next.
Please refer to Fig. 10a, which shows a flow chart of the operating mode 32 of the wide-angle sensor array modules 1 and 1′ of an embodiment of the invention, comprising: capturing a third image with the first image sensor (step 3211); capturing a fourth image with the second image sensor (step 3212); combining the third image and the fourth image onto a virtual plane using the transition matrices (step 3213); extracting object features from the combined image (step 3214); and transmitting the object feature parameters (step 3215).
Please refer to Fig. 10a and Figs. 11a to 11d; the operating mode 32 of the wide-angle sensor array modules 1 and 1′ is illustrated below with an application example. Here the wide-angle sensor array module 1′ is applied to a pointing and navigation system, and the first viewing angle VA1 of the first image sensor S1 and the second viewing angle VA2 of the second image sensor S2 contain, for example, light sources T3 and T4 of a light-gun sight or of a pointer, respectively. First, the third image I3 is captured with the first image sensor S1 (step 3211) and the fourth image I4 is captured with the second image sensor S2 (step 3212). The processor 21 then uses the transition matrices stored in advance in the memory unit 22, for example the transition matrices H1 and H2 obtained in the calibration mode 31, to combine the third image I3 and the fourth image I4 onto the virtual plane to form the combined image CI, which has the combined viewing angle VA. As mentioned above, the virtual plane may be referenced to the image coordinate axes of the third image I3 or of the fourth image I4. The processor 21 then extracts the characteristic parameters of the object images in the combined image CI, for example parameters such as the coordinates, area, color, directivity, boundary, number of endpoints and aspect ratio of the objects (step 3214). Finally, the object parameters are transmitted through the transmission interface unit 23 to an image display device or a storage device (not shown), where the image display device may be, for example, a television screen, a projection screen, a game console screen or a computer screen, and it can merge the image coordinates of the light sources T3 and T4 into the combined image CI and display the images of the light sources T3 and T4 together with the combined image CI.
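A compact sketch of this image-level flow is given below; it is not the patent's implementation, and the pinhole back-projection, the focal lengths and the nearest-neighbour sampling are simplifying assumptions. Each output pixel of the virtual plane is traced back through the stored transition matrix to the corresponding sensor pixel, so that the two warped views can then be overlaid into the combined image CI.

```python
import numpy as np

def warp_to_virtual_plane(image, H, focal, out_shape, out_focal):
    """Nearest-neighbour warp of a pinhole image onto a virtual plane whose
    orientation differs from the sensor by the rotation H (no lens distortion)."""
    h_out, w_out = out_shape
    h_in, w_in = image.shape
    out = np.zeros(out_shape, dtype=image.dtype)
    for v in range(h_out):
        for u in range(w_out):
            # Back-project the virtual-plane pixel to a viewing ray ...
            ray = np.array([(u - w_out / 2) / out_focal,
                            (v - h_out / 2) / out_focal,
                            1.0])
            # ... rotate it into the sensor frame (H from the calibration mode) ...
            x, y, z = H @ ray
            if z <= 0:
                continue
            # ... and project it back onto the sensor's image plane.
            ui = int(round(x / z * focal + w_in / 2))
            vi = int(round(y / z * focal + h_in / 2))
            if 0 <= ui < w_in and 0 <= vi < h_in:
                out[v, u] = image[vi, ui]
    return out

# Usage (assumed grayscale frames I3, I4 and calibration matrices H1, H2):
# CI = np.maximum(warp_to_virtual_plane(I3, H1, 400.0, (480, 640), 300.0),
#                 warp_to_virtual_plane(I4, H2, 400.0, (480, 640), 300.0))
```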
Please refer to Fig. 10b and Figs. 11a to 11d, which illustrate another alternative embodiment of the operating mode 32 of the wide-angle sensor array modules 1 and 1′ of an embodiment of the invention, comprising the following steps: capturing a third image with the first image sensor (step 3221); extracting the object features of the third image (step 3222); capturing a fourth image with the second image sensor (step 3223); extracting the object features of the fourth image (step 3224); combining the first and second characteristic parameters onto a virtual plane using the transition matrices (step 3226); and transmitting the object feature parameters (step 3215). The difference between this alternative embodiment and Fig. 10a is that, after the third and fourth images are captured, the processor 21 first extracts the characteristic parameters of the objects in each captured image, and then uses the transition matrices stored in the memory unit 22 to combine the object parameters onto the virtual plane. Finally, the object parameters are likewise transmitted through the transmission interface unit 23 to an image display device or a storage device. In this way, by providing at least two image sensors in the wide-angle sensor array modules 1 and 1′, the present invention can effectively increase the field of view of a pointing and navigation system (see Fig. 11d).
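The feature-level variant can be sketched as follows; find_bright_spot is a toy stand-in for whatever feature extractor the module actually uses, and the focal length and image centre are assumed values. Only the detected light-spot coordinates, not whole images, are pushed through the transition matrices.

```python
import numpy as np

def find_bright_spot(image):
    """Toy feature extractor: pixel coordinates of the brightest pixel."""
    v, u = np.unravel_index(np.argmax(image), image.shape)
    return float(u), float(v)

def spot_to_virtual_plane(uv, H, focal, center):
    """Map a detected spot from a sensor's image plane onto the virtual plane
    referenced to the calibration coordinate axes (H maps reference -> sensor)."""
    u, v = uv
    cx, cy = center
    ray_sensor = np.array([(u - cx) / focal, (v - cy) / focal, 1.0])
    x, y, z = H.T @ ray_sensor          # inverse rotation: sensor frame -> reference frame
    return x / z, y / z                 # normalized coordinates on the virtual plane

# Usage (assumed frames I3, I4 and calibration matrices H1, H2):
# p3 = spot_to_virtual_plane(find_bright_spot(I3), H1, 400.0, (320.0, 240.0))
# p4 = spot_to_virtual_plane(find_bright_spot(I4), H2, 400.0, (320.0, 240.0))
# The two points now share one coordinate system and can be reported together.
```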
In summary, since the field of view of an image sensing module can be increased by using a plurality of image sensors, how to correctly combine a plurality of images becomes an important problem. In view of this, the present invention proposes a wide-angle sensor array module together with an image correction method and an operating method, in which at least one transition matrix is stored in the wide-angle sensor array module in advance and is used during operation to correctly combine the images. In addition, the wide-angle sensor array module of the present invention can be applied to a pointing and navigation system, for example a shooting game, to effectively increase the operating field of view of the pointing and navigation system and to reduce the system cost.
Although the present invention has been disclosed by way of the above preferred embodiments, they are not intended to limit the invention. Any person having ordinary knowledge in the technical field of the invention may make various changes and modifications without departing from the spirit and scope of the invention. The scope of protection of the present invention shall therefore be determined by the scope of the appended claims.

Claims (20)

1. A wide-angle sensor array module for generating a combined image, the wide-angle sensor array module comprising:
a first image sensor for capturing a first image;
a second image sensor for capturing a second image, the second image sensor and the first image sensor having a relative spatial relationship of a lateral angle difference and/or a vertical angle difference;
a memory unit storing at least one transition matrix, wherein the transition matrix is obtained according to the relative spatial relationship between the first image sensor and the second image sensor; and
a processor, which uses the transition matrix to combine the first image and the second image into the combined image.
2. The wide-angle sensor array module according to claim 1, wherein the relative spatial relationship is a lateral included angle between the first image sensor and the second image sensor.
3. The wide-angle sensor array module according to claim 2, wherein the lateral included angle is 2Φ and the transition matrix is:
$$\begin{bmatrix} \cos(-\Phi) & 0 & -\sin(-\Phi) \\ 0 & 1 & 0 \\ \sin(-\Phi) & 0 & \cos(-\Phi) \end{bmatrix} \quad \text{and/or} \quad \begin{bmatrix} \cos(\Phi) & 0 & -\sin(\Phi) \\ 0 & 1 & 0 \\ \sin(\Phi) & 0 & \cos(\Phi) \end{bmatrix}.$$
4. The wide-angle sensor array module according to claim 1, wherein the relative spatial relationship is a lateral included angle and a vertical included angle between the first image sensor and the second image sensor.
5. The wide-angle sensor array module according to claim 4, wherein the lateral included angle is 2Φ, the vertical included angle is θ, and the transition matrix is:
$$\begin{bmatrix} \cos(-\Phi) & -\sin(-\Phi)\sin(\theta) & -\sin(-\Phi)\cos(\theta) \\ 0 & \cos(\theta) & -\sin(\theta) \\ \sin(-\Phi) & \cos(-\Phi)\sin(\theta) & \cos(-\Phi)\cos(\theta) \end{bmatrix} \quad \text{and/or} \quad \begin{bmatrix} \cos(\Phi) & -\sin(\Phi)\sin(\theta) & -\sin(\Phi)\cos(\theta) \\ 0 & \cos(\theta) & -\sin(\theta) \\ \sin(\Phi) & \cos(\Phi)\sin(\theta) & \cos(\Phi)\cos(\theta) \end{bmatrix}.$$
6. The wide-angle sensor array module according to claim 1, further comprising a frame on which the first image sensor and the second image sensor are mounted, so that the first image sensor and the second image sensor have the relative spatial relationship.
7. An image correction method for a wide-angle sensor array module, the wide-angle sensor array module comprising a first image sensor and a second image sensor, the image correction method comprising the following steps:
providing at least three calibration points, wherein the calibration points form a reference spatial coordinate system and have reference spatial coordinates;
capturing, with the first image sensor, at least three of the calibration points to form a first image, wherein the first image defines first image coordinate axes and the calibration points in the first image have first image coordinates;
obtaining a first transition matrix according to the reference spatial coordinates and the first image coordinates;
capturing, with the second image sensor, at least three of the calibration points to form a second image, wherein the second image defines second image coordinate axes and the calibration points in the second image have second image coordinates;
obtaining a second transition matrix according to the reference spatial coordinates and the second image coordinates; and
transforming the first image and/or the second image, using the first transition matrix and/or the second transition matrix, onto a virtual plane referenced to one of the reference spatial coordinate axes, the first image coordinate axes and the second image coordinate axes.
8. The image correction method according to claim 7, wherein the reference spatial coordinate axes, the first image coordinate axes and the second image coordinate axes are three-dimensional Cartesian coordinate systems.
9. The image correction method according to claim 7, wherein, when the virtual plane is based on the reference spatial coordinate axes, the first transition matrix is used to transform the first image onto the virtual plane and the second transition matrix is used to transform the second image onto the virtual plane.
10. The image correction method according to claim 7, wherein, when the virtual plane is based on the first image coordinate axes, the first and second transition matrices are used to transform the second image onto the virtual plane.
11. The image correction method according to claim 7, wherein, when the virtual plane is based on the second image coordinate axes, the first and second transition matrices are used to transform the first image onto the virtual plane.
12. the method for correcting image of a wide-angle sensor array module, described wide-angle sensor array module comprise first imageing sensor and second imageing sensor, described method for correcting image comprises the following steps:
According to the lateral angles difference of described first imageing sensor and described second imageing sensor and/or vertically differential seat angle determine the relative space relation of described first imageing sensor and described second imageing sensor;
Try to achieve at least one transition matrix according to described relative space relation; And
Utilize image transitions that image that described transition matrix gathers described first imageing sensor and/or described second imageing sensor gather to being the virtual plane of benchmark with the reference axis of described first imageing sensor or the reference axis of described second imageing sensor.
13. The image correction method according to claim 12, wherein the viewing angles of the first image sensor and the second image sensor may or may not overlap.
14. the method for operation of a wide-angle sensor array module, described wide-angle sensor array module comprises first imageing sensor, second imageing sensor and stores the memory cell of at least one transition matrix, this transition matrix be the lateral angles difference that has according to described first imageing sensor and described second imageing sensor and/or vertically the relative space relation of differential seat angle try to achieve, described method of operation comprises the following steps:
Gather the 3rd image with described first imageing sensor;
Gather the 4th image with described second imageing sensor;
Utilize described transition matrix with described the 3rd image and/or described the 4th image transitions to being the virtual plane of benchmark with the reference axis of described first imageing sensor or the reference axis of described second imageing sensor; And
Gather at least one characteristic parameter of the subject image of gathering in the described virtual plane.
The combination of one or more in the group that 15. method of operation according to claim 14, wherein said characteristic parameter are the coordinate that is selected from object, area, color, directivity, border, number of endpoint and length-width ratio to be constituted.
16. the method for operation of a wide-angle sensor array module, described wide-angle sensor array module comprises first imageing sensor, second imageing sensor and stores the memory cell of at least one transition matrix, this transition matrix be the lateral angles difference that has according to described first imageing sensor and described second imageing sensor and/or vertically the relative space relation of differential seat angle try to achieve, described method of operation comprises the following steps:
Gather the 3rd image with described first imageing sensor;
Gather first characteristic parameter of the subject image in described the 3rd image;
Gather the 4th image with described second imageing sensor;
Gather second characteristic parameter of the subject image in described the 4th image; And
Utilizing described transition matrix to be transformed into described first characteristic parameter and described second characteristic parameter with the reference axis of described first imageing sensor or the reference axis of described second imageing sensor is the virtual plane of benchmark.
The combination of one or more in the group that 17. method of operation according to claim 16, wherein said first and second characteristic parameters are the coordinate that is selected from object, area, color, directivity, border, number of endpoint and length-width ratio to be constituted.
18. method of operation according to claim 16, the described relative space relation of wherein said first imageing sensor and described second imageing sensor are horizontal direction angle and/or longitudinal direction angle between described first imageing sensor and described second imageing sensor.
19. A pointing and navigation system, comprising:
a wide-angle sensor array module for generating a combined image and having a combined viewing angle, the wide-angle sensor array module comprising:
a first image sensor for capturing a first image;
a second image sensor for capturing a second image, the second image sensor and the first image sensor having a relative spatial relationship of a lateral angle difference and/or a vertical angle difference; and
an image processing unit, which stores a transition matrix and uses the transition matrix to combine the first image and the second image into the combined image, wherein the transition matrix is obtained according to the relative spatial relationship between the first image sensor and the second image sensor;
at least one light source located within the combined viewing angle; and
an image display device coupled to the wide-angle sensor array module, for merging the image coordinates of the light source into the combined image and displaying the combined image and the light source image.
20. The pointing and navigation system according to claim 19, wherein the light source is the sight of a light gun or a pointer.
CN2008101457945A 2008-08-22 2008-08-22 Wide-angle sensor array module and image correcting method, operating method and application thereof Active CN101656840B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN2008101457945A CN101656840B (en) 2008-08-22 2008-08-22 Wide-angle sensor array module and image correcting method, operating method and application thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN2008101457945A CN101656840B (en) 2008-08-22 2008-08-22 Wide-angle sensor array module and image correcting method, operating method and application thereof

Publications (2)

Publication Number Publication Date
CN101656840A CN101656840A (en) 2010-02-24
CN101656840B true CN101656840B (en) 2011-09-28

Family

ID=41710901

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2008101457945A Active CN101656840B (en) 2008-08-22 2008-08-22 Wide-angle sensor array module and image correcting method, operating method and application thereof

Country Status (1)

Country Link
CN (1) CN101656840B (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102782664A (en) * 2010-03-02 2012-11-14 日本电气株式会社 Coordinated operation apparatus, coordinated operation method, coordinated operation control program and apparatus coordination system
WO2012164339A1 (en) * 2011-05-27 2012-12-06 Nokia Corporation Image stitching
US10136054B2 (en) 2012-10-24 2018-11-20 Morpho, Inc. Image processing device for compositing panoramic images, image processing program and recording medium
JP6677474B2 (en) * 2015-09-29 2020-04-08 日立オートモティブシステムズ株式会社 Perimeter recognition device
CN106250839B (en) * 2016-07-27 2019-06-04 徐鹤菲 A kind of iris image perspective correction method, apparatus and mobile terminal
CN107063188A (en) * 2017-05-19 2017-08-18 深圳奥比中光科技有限公司 Big visual angle 3D vision systems
CN113001535B (en) 2019-12-18 2022-11-15 财团法人工业技术研究院 Automatic correction system and method for robot workpiece coordinate system

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1036872A (en) * 1988-03-07 1989-11-01 夏普公司 Interlocked zooming apparatus

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1036872A (en) * 1988-03-07 1989-11-01 夏普公司 Interlocked zooming apparatus

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
JP特开平11-112956A 1999.04.23
Zhou Chuan et al., "A high-precision calibration method for stereo vision systems", Chinese Journal of Scientific Instrument, Vol. 27, No. 6, 2006, pp. 2221-2223, 2297. *

Also Published As

Publication number Publication date
CN101656840A (en) 2010-02-24

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant