CN117545693A - Information processing device, information processing method, and program

Information processing device, information processing method, and program

Info

Publication number
CN117545693A
Authority
CN
China
Prior art keywords
bright
bright point
point
information processing
feature amount
Prior art date
Legal status
Pending
Application number
CN202180099733.3A
Other languages
Chinese (zh)
Inventor
松浦贤太朗
木村学
Current Assignee
Sony Group Corp
Original Assignee
Sony Group Corp
Priority date
Filing date
Publication date
Application filed by Sony Group Corp filed Critical Sony Group Corp
Publication of CN117545693A publication Critical patent/CN117545693A/en
Pending legal-status Critical Current

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64GCOSMONAUTICS; VEHICLES OR EQUIPMENT THEREFOR
    • B64G1/00Cosmonautic vehicles
    • B64G1/22Parts of, or equipment specially adapted for fitting in or to, cosmonautic vehicles
    • B64G1/24Guiding or controlling apparatus, e.g. for attitude control
    • B64G1/36Guiding or controlling apparatus, e.g. for attitude control using sensors, e.g. sun-sensors, horizon sensors

Landscapes

  • Engineering & Computer Science (AREA)
  • Remote Sensing (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Image Analysis (AREA)

Abstract

Provided are a new and improved information processing device, information processing method, and program capable of improving robustness against rotation of a camera when determining a star corresponding to a bright point. An information processing device is provided with: a detection unit that detects a plurality of bright points from an image obtained by photographing the universe with a camera; a calculation unit configured to calculate a first feature amount showing a positional relationship of at least two or more bright points among the plurality of bright points detected by the detection unit; and a determining unit that determines a star corresponding to at least any one of the two or more bright points based on the first feature amount calculated by the calculation unit and a second feature amount, the second feature amount including a central angle in an equatorial coordinate system between stars based on celestial position information of at least two or more stars.

Description

Information processing device, information processing method, and program
Technical Field
The present disclosure relates to an information processing apparatus, an information processing method, and a program.
Background
In recent years, a technique has been developed in which a star corresponding to a bright point included in an image obtained by photographing the universe is specified from the image. For example, Patent Document 1 discloses a technique of calculating, as feature amounts, the angle (a pitch angle) formed between bright points and the vector difference between bright points from a plurality of bright points included in an image, and determining the stars corresponding to the bright points from these feature amounts.
Prior art literature
Patent literature
Patent Document 1: Japanese Patent Laid-Open No. 10-72000
Disclosure of Invention
Technical problem to be solved by the invention
However, in the technique described in Patent Document 1, even when the same bright point is the determination target, the value of the feature amount used to determine the star fluctuates with rotation of the camera, and therefore an estimation function for estimating the amount of rotation of the camera must be used at the same time.
Therefore, in the present disclosure, a new and improved information processing apparatus, information processing method, and program capable of improving the robustness against rotation of a camera when determining a star corresponding to a bright point are proposed.
Solution for solving the technical problems
According to the present disclosure, there is provided an information processing apparatus including: a detection unit that detects a plurality of bright spots from an image obtained by photographing the universe with a camera; a calculation unit configured to calculate a first feature amount showing a positional relationship of at least two or more bright spots among the plurality of bright spots detected by the detection unit; and a determining unit that determines a star corresponding to at least any one of the two or more bright points based on the first feature amount and the second feature amount calculated by the calculating unit, the second feature amount including a central angle between stars in an equatorial coordinate system based on position information of each celestial body of the at least two or more stars.
In addition, according to the present disclosure, there is provided an information processing method performed by a computer, the information processing method including: detecting a plurality of bright spots from an image obtained by photographing the universe with a camera; calculating a first feature quantity showing a positional relationship of at least two or more bright spots among the detected plurality of bright spots; and determining a star corresponding to at least any one of the two or more bright points based on the calculated first feature quantity and second feature quantity, the second feature quantity including a central angle in an equatorial coordinate system between stars based on the celestial body position information of the at least two or more stars.
In addition, according to the present disclosure, there is provided a program that causes a computer to realize the functions of: a detection function of detecting a plurality of bright spots from an image obtained by photographing a universe with a camera; a calculation function of calculating a first feature quantity showing a positional relationship of at least two or more bright spots among the plurality of bright spots detected by the detection function; and a determining function of determining a star corresponding to at least any one of the two or more bright points based on the first feature amount and the second feature amount calculated by the calculating function, the second feature amount including a central angle in an equatorial coordinate system between the stars based on the celestial positional information of the at least two or more stars.
Drawings
Fig. 1 is an explanatory diagram for explaining an example of an information processing system according to the present disclosure.
Fig. 2 is an explanatory diagram for explaining an exemplary functional configuration of the information processing apparatus 10 according to the present disclosure.
Fig. 3 is an explanatory diagram for explaining an example of the database held by the star feature amount storage unit 101.
Fig. 4A is an explanatory diagram for explaining an example of selection of the first bright point according to the present disclosure.
Fig. 4B is an explanatory diagram for explaining an example of selection of the second bright point according to the present disclosure.
Fig. 4C is an explanatory diagram for explaining an example of selection of the third bright point according to the present disclosure.
Fig. 4D is an explanatory diagram for explaining an example of selection of the fourth bright point according to the present disclosure.
Fig. 4E is an explanatory diagram for explaining an example of selection of the fifth bright point according to the present disclosure.
Fig. 5 is an explanatory diagram for explaining an example of the feature quantity calculated by the feature quantity calculating unit 205.
Fig. 6 is an explanatory diagram for explaining an example of the database in which the star group and the feature quantity are associated and held by the star group feature quantity storage unit 105.
Fig. 7 is an explanatory diagram for explaining an example of a method of calculating the great circle distance D1 based on celestial body position information.
Fig. 8 is an explanatory diagram for explaining an example of a method of calculating the formed angle AD1 based on the celestial body position information.
Fig. 9 is an explanatory diagram for explaining an example of operation processing of the information processing apparatus 10 according to the present disclosure.
Fig. 10 is a block diagram showing a hardware configuration of the information processing apparatus 10 according to the present disclosure.
Detailed Description
Preferred embodiments of the present disclosure will be described in detail below with reference to the accompanying drawings. In the present specification and the drawings, constituent elements having substantially the same functional constitution are denoted by the same reference numerals, and duplicate descriptions thereof are omitted.
The description of the "specific embodiment" will be given in terms of the order of items shown below.
1. Summary of information processing System
2. Functional configuration example of information processing apparatus 10
3. Detailed description
3.1. Calculation of feature quantity
3.2. Preparation in advance
3.3. Determination of star
3.4. Attitude estimation
4. Action processing example
5. Example of action and Effect
6. Hardware configuration example
7. Supplement
< 1. Summary of information processing system >
As an embodiment of the present disclosure, a mechanism of improving robustness against rotation of a camera when determining a star corresponding to a bright point is described.
Fig. 1 is an explanatory diagram for explaining an example of an information processing system according to the present disclosure. The information processing system according to the present disclosure includes an artificial satellite 1, a camera 5, and an information processing apparatus 10.
(Artificial satellite 1)
The satellite 1 is a spacecraft that exists outside the Earth's atmosphere. The satellite 1 may be, for example, an orbiting satellite, a geostationary satellite, or a space probe that performs missions outside Earth orbit. The size of the satellite 1 is not limited. For example, the satellite 1 may be a large spacecraft such as the International Space Station or a small spacecraft such as a CanSat.
The satellite 1 further includes a camera 5 and an information processing device 10. In the present description, an example in which the camera 5 and the information processing apparatus 10 are mounted on the satellite 1 separately will be described, but the information processing apparatus 10 may be provided with the camera 5.
(Camera 5)
The camera 5 photographs the universe and acquires an image including the star S. In addition, the camera 5 outputs the acquired image to the information processing apparatus 10. In the following description, the star S in the image obtained by the photographing of the camera 5 may be expressed as a bright point.
The information processing apparatus 10 detects a plurality of bright spots from the image input from the camera 5. Further, the information processing apparatus 10 calculates a feature amount showing a positional relationship of at least two or more bright spots, and determines a star corresponding to at least any one of the two or more bright spots for calculating the feature amount based on the calculated feature amount.
The above describes an example of the information processing system according to the present disclosure. Next, a functional configuration example of the information processing apparatus 10 according to the present disclosure will be described with reference to fig. 2.
< 2. Functional configuration example of information processing apparatus 10 >
Fig. 2 is an explanatory diagram for explaining an exemplary functional configuration of the information processing apparatus 10 according to the present disclosure. As shown in fig. 2, the information processing apparatus 10 according to the present disclosure includes a storage unit 100 and a control unit 200.
(storage section 100)
The storage unit 100 holds software and various data. As shown in fig. 2, the storage unit 100 includes a star feature amount storage unit 101 and a star group feature amount storage unit 105.
{ Star feature amount storage section 101 }
The star feature amount storage 101 holds a database in which stars are associated with celestial position information of those stars. An example of the database held by the star feature amount storage 101 will be described with reference to fig. 3.
Fig. 3 is an explanatory diagram for explaining an example of the database held by the star feature amount storage unit 101. The star feature amount storage unit 101 holds a database extracted from various information included in, for example, the Hipparcos star catalogue. The various information includes, for example, the HIP number and the right ascension and declination of the star represented by each HIP number.
The HIP number is an identification number assigned to each star. The right ascension and declination are values indicating the position of the star in the equatorial coordinate system. In the following description, the right ascension and declination may be collectively referred to as celestial position information.
An example of the database held by the star feature amount storage 101 is described above. Referring again to fig. 2, a description will be given of a functional configuration example of the information processing apparatus 10.
{ Star group feature amount storage section 105 }
The star group feature quantity storage 105 holds a database in which at least two or more stars held in the star feature quantity storage 101 and feature quantities including central angles between the two or more stars in an equatorial coordinate system are associated. Details of the database held in the star feature quantity storage 105 will be described later. The feature value held in the star feature value storage 105 is an example of the second feature value.
(control section 200)
The control unit 200 controls all operations of the information processing apparatus 10 according to the present disclosure. As shown in fig. 2, the control unit 200 includes a bright point detection unit 201, a feature amount calculation unit 205, a star determination unit 209, and a posture estimation unit 213.
{ Bright point detection section 201 }
The bright point detection unit 201 detects a plurality of bright points from an image obtained by photographing the universe with the camera 5. The plurality of bright points may be all of the bright points included in the image, or may be those bright points having a predetermined luminance value or more among all the bright points.
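As a rough illustration of this step (not part of the embodiment itself), the following Python sketch detects bright points by thresholding a grayscale image and taking the centroid of each connected blob; the function name, the fixed threshold, and the use of scipy's labelling are all assumptions made for the example.

```python
import numpy as np
from scipy import ndimage

def detect_bright_points(image, threshold=200):
    """Detect bright points (candidate stars) in a grayscale sky image.
    Returns (x, y) centroids ordered from brightest to dimmest."""
    mask = image >= threshold                               # keep pixels above the luminance threshold
    labels, n = ndimage.label(mask)                         # group adjacent bright pixels into blobs
    if n == 0:
        return []
    idx = range(1, n + 1)
    centroids = ndimage.center_of_mass(image, labels, idx)  # (row, col) centre of each blob
    brightness = ndimage.sum(image, labels, idx)            # total luminance per blob
    order = np.argsort(brightness)[::-1]
    return [(float(centroids[i][1]), float(centroids[i][0])) for i in order]
```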
{ Feature amount calculation section 205 }
The feature amount calculation section 205 calculates, as a first feature amount, a feature amount indicating a positional relationship of at least two or more of the bright points detected by the bright point detection section 201. Details concerning the calculation processing of the feature amount will be described later.
{ Star determination section 209 }
The star determining section 209 determines a star corresponding to at least any one of two or more bright points used for the feature amount calculation section 205 to calculate the feature amount, based on the feature amount calculated by the feature amount calculation section 205 and the feature amount held by the star group feature amount storage section 105.
{ Posture estimation section 213 }
The posture estimating section 213 estimates the posture of the camera 5 based on celestial body position information of the star corresponding to each of the at least three or more bright points determined by the star determining section 209. Details concerning the attitude estimation will be described later.
The functional configuration example of the information processing apparatus 10 according to the present disclosure is described above. Next, details relating to the present disclosure will be described in order with reference to fig. 4 to 12.
< 3. Detailed description >
< 3.1. Calculation of feature quantity >
The feature amount calculation unit 205 calculates a feature amount showing a positional relationship of at least two or more bright points among the plurality of bright points detected by the bright point detection unit 201. Then, the star determining section 209 compares the feature amount calculated by the feature amount calculating section 205 from the image positions of the two or more bright points with the feature amount calculated from the celestial position information of two or more stars (a star group) held in the star group feature amount storage section 105, and determines the star corresponding to each of the two or more bright points based on the result of the comparison.
The selection criterion for selecting two or more bright points from among the plurality of bright points included in the image is arbitrary, but the selection criterion for the two or more stars held as a star group is preferably the same as the selection criterion used by the feature amount calculation section 205 for the two or more bright points. Below, as one example of a selection criterion for at least two or more bright points, an example of selecting five bright points from among the plurality of bright points included in an image will be described with reference to figs. 4A to 4E.
(selection of bright spots)
{ selection of first bright Point }
Fig. 4A is an explanatory diagram for explaining an example of selection of the first bright point according to the present disclosure. In fig. 4A, each circle represents a bright point. The feature amount calculation section 205 selects, as the first bright point, a bright point satisfying a predetermined condition among the plurality of bright points detected by the bright point detection section 201.
For example, the feature amount calculation unit 205 may select, as the first bright point, a bright point P1 closest to the center position in the image I as shown in fig. 4A among all the bright points included in the image I. However, the bright point closest to the center position in the image I is an example of a bright point satisfying the above-described predetermined condition, and the predetermined condition according to the present disclosure is not limited to this example.
For example, the feature amount calculation unit 205 may select, as the bright point (i.e., the first bright point) satisfying the predetermined condition, the bright point having the largest luminance value among all the bright points included in the image I, or may select, as the bright point satisfying the predetermined condition, the bright point closest to the predetermined position in the image I.
{ selection of second Bright Point }
Fig. 4B is an explanatory diagram for explaining an example of selection of the second bright point according to the present disclosure. The feature amount calculating section 205 selects, as the second bright point, the bright point P2 that is closest to the bright point P1 among the bright points in the image I whose distance from the bright point P1 (the first bright point) is equal to or greater than a predetermined value R.
{ selection of third bright Point }
Fig. 4C is an explanatory diagram for explaining an example of selection of the third bright point according to the present disclosure. The feature amount calculating section 205 selects, as the third bright point, the bright point that, after the second bright point, is closest to the first bright point among the bright points existing within a predetermined range from the direction symmetric to the second bright point with the first bright point as the starting point in the image I.
For example, the direction symmetric to the second bright point with the first bright point as the starting point is the direction in which a virtual line forming an angle of 180 degrees with the line segment connecting the first bright point and the second bright point extends from the bright point P1. In this case, as shown in fig. 4C, the feature amount calculating section 205 selects, as the third bright point, the bright point P3 that, after the bright point P2, is closest to the bright point P1 among the bright points included in the range where the angle formed with the line segment connecting the bright point P1 and the bright point P2, with the bright point P1 as the vertex, is from 90° to 270° (i.e., the range on the side where the bright point P2 is not located, of the two ranges divided by the broken line L1).
{ selection of fourth bright Point }
Fig. 4D is an explanatory diagram for explaining an example of selection of the fourth bright point according to the present disclosure. The feature amount calculating section 205 selects, as the fourth bright point, the bright point that, after the third bright point, is closest to the first bright point among the bright points existing within a predetermined range from the direction symmetric to the third bright point with the first bright point as the starting point in the image I.
For example, the direction symmetric to the third bright point with the first bright point as the starting point is the direction in which a virtual line forming an angle of 180 degrees with the line segment connecting the first bright point and the third bright point extends from the bright point P1. In this case, as shown in fig. 4D, the feature amount calculating section 205 selects, as the fourth bright point, the bright point P4 that, after the bright point P3, is closest to the bright point P1 among the bright points included in the range where the angle formed with the line segment connecting the bright point P1 and the bright point P3, with the bright point P1 as the vertex, is from 90° to 270° (i.e., the range on the side where the bright point P3 is not located, of the two ranges divided by the broken line L2).
{ selection of fifth bright Point }
Fig. 4E is an explanatory diagram for explaining an example of selection of the fifth bright point according to the present disclosure. The feature amount calculating section 205 selects, as the fifth bright point, the bright point that, after the fourth bright point, is closest to the first bright point among the bright points existing within a predetermined range from the direction symmetric to the fourth bright point with the first bright point as the starting point in the image I.
For example, the direction symmetric to the fourth bright point with the first bright point as the starting point is the direction in which a virtual line forming an angle of 180 degrees with the line segment connecting the first bright point and the fourth bright point extends from the bright point P1. In this case, as shown in fig. 4E, the feature amount calculating section 205 selects, as the fifth bright point, the bright point P5 that, after the bright point P4, is closest to the bright point P1 among the bright points included in the range where the angle formed with the line segment connecting the bright point P1 and the bright point P4, with the bright point P1 as the vertex, is from 90° to 270° (i.e., the range on the side where the bright point P4 is not located, of the two ranges divided by the broken line L3).
An example of a method of selecting the first to fifth bright points has been described above. This selection method selects, as the second to fifth bright points, bright points located dispersedly in each direction around the first bright point. Accordingly, the star specifying unit 209 described later can specify the stars corresponding to the first to fifth bright points with higher accuracy. Next, the calculation of the feature amounts by the feature amount calculation unit 205 using the first to fifth bright points will be described in detail with reference to fig. 5.
(calculation of feature quantity)
Fig. 5 is an explanatory diagram for explaining an example of the feature amounts calculated by the feature amount calculating unit 205. As described above, the feature amount calculation section 205 according to the present disclosure calculates feature amounts showing a positional relationship of at least two or more bright points. In the following description, the position of a bright point in the image is sometimes expressed as an image position.
For example, the feature amounts calculated by the feature amount calculating section 205 include great circle distances based on the image position of each of the at least two or more bright points, on the assumption that each bright point is a star. More precisely, the great circle distance here is the central angle between the bright points with the camera 5 as the vertex, but for convenience of explanation it is simply expressed as the great circle distance in the present specification.
For example, the feature amount calculation section 205 calculates the great circle distance D1 based on the image position of the bright point P1 and the image position of the bright point P2. In addition, the feature amount calculation section 205 calculates the great circle distance D2 based on the image positions of the bright points P1 and P3, the great circle distance D3 based on the image positions of the bright points P1 and P4, and the great circle distance D4 based on the image positions of the bright points P1 and P5. The feature amount calculation unit 205 may also apply a normalization process when calculating the great circle distances D1 to D4, for example converting their values so that the maximum value becomes 1 and the minimum value becomes 0.
The feature amount calculation unit 205 may calculate angles at which three or more bright spots are formed as feature amounts. For example, the feature quantity calculated by the feature quantity calculating section 205 may include an angle formed by a line segment connecting the first bright point and the second bright point and a line segment connecting the first bright point and the third bright point as an angle formed by three or more bright points. For example, the angle formed by three or more bright spots may be an angle AD1 formed by a line segment connecting the bright spots P1 and P2 and a line segment connecting the bright spots P1 and P3 as shown in fig. 5.
The feature quantity calculated by the feature quantity calculating unit 205 may include an angle formed by a line segment connecting the first bright point and the third bright point and a line segment connecting the first bright point and the fourth bright point as an angle formed by three or more bright points. For example, the angle formed by three or more bright spots may be an angle AD2 formed by a line segment connecting the bright spots P1 and P3 and a line segment connecting the bright spots P1 and P4 as shown in fig. 5.
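A minimal sketch of the image-side feature computation follows. It assumes a calibrated pinhole camera (focal length f in pixels, principal point (cx, cy)) so that image positions can be turned into viewing directions, and it computes the formed angles as planar angles at P1, which is a small-field approximation of the spherical angles used on the catalogue side; none of these modelling choices are stated in the embodiment.

```python
import numpy as np

def to_unit_vector(p, f, cx, cy):
    """Viewing direction of a bright point under the assumed pinhole model."""
    v = np.array([p[0] - cx, p[1] - cy, f], dtype=float)
    return v / np.linalg.norm(v)

def great_circle_distance(p, q, f, cx, cy):
    """Central angle (radians) between two bright points as seen from the camera."""
    u, w = to_unit_vector(p, f, cx, cy), to_unit_vector(q, f, cx, cy)
    return float(np.arccos(np.clip(np.dot(u, w), -1.0, 1.0)))

def image_feature_vector(p1, p2, p3, p4, p5, f, cx, cy):
    """Normalised great circle distances D1-D4 followed by formed angles AD1, AD2."""
    d = np.array([great_circle_distance(p1, q, f, cx, cy) for q in (p2, p3, p4, p5)])
    d = (d - d.min()) / (d.max() - d.min())          # normalise so that max -> 1, min -> 0

    def formed_angle(a, b):
        # Planar angle at P1 between the segments P1-a and P1-b.
        va = np.asarray(a, dtype=float) - np.asarray(p1, dtype=float)
        vb = np.asarray(b, dtype=float) - np.asarray(p1, dtype=float)
        c = np.dot(va, vb) / (np.linalg.norm(va) * np.linalg.norm(vb))
        return float(np.arccos(np.clip(c, -1.0, 1.0)))

    return list(d) + [formed_angle(p2, p3), formed_angle(p3, p4)]  # D1-D4, AD1, AD2
```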
As described above, when five bright spots are selected in total from the first to fifth bright spots, the feature amount calculation unit 205 can calculate the large circle distances D1 to D4 and the formed angles AD1 and AD2 as feature amounts, respectively. Then, the star determiner 209 determines a star corresponding to at least any one of the first to fifth bright points based on each of the feature amounts calculated by the feature amount calculator 205. More specifically, the star determiner 209 determines a star corresponding to at least any one of the first to fifth bright points based on the feature amounts calculated by the feature amount calculator 205 and the feature amounts held in the star group feature amount storage 105. Next, an example of a database in which the star group and the feature quantity are associated and held by the star group feature quantity storage unit 105 will be described.
< 3.2. Preparation in advance >
Fig. 6 is an explanatory diagram for explaining an example of the database in which the star group and the feature quantity are associated and held by the star group feature quantity storage unit 105. As shown in fig. 6, the star group feature quantity storage 105 associates and holds the number C1, the formed angle C2, and the great circle distance C3 of each star included in each star group.
Note that the HIP numbers shown in fig. 3, for example, are input to the numbers C1. In the column of the number N1, the number of the star serving as the reference, that is, the star treated as the first bright point, is input. Then, when the above-described selection criteria for the second to fifth bright points are applied, the star corresponding to the second bright point is input to the number N2, the star corresponding to the third bright point to the number N3, the star corresponding to the fourth bright point to the number N4, and the star corresponding to the fifth bright point to the number N5. Therefore, the star group feature amount storage 105 holds, as the database, the stars corresponding to the second to fifth bright points and the respective feature amounts when a certain star is taken as the first bright point.
As in the case where the feature value calculation unit 205 performs the normalization conversion process while calculating the great circle distance, the great circle distance C3 held by the star group feature value storage unit 105 may be converted such that the maximum value is 1 and the minimum value is 0.
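For illustration, one possible in-memory representation of a row of this database is sketched below; the field names and types are assumptions, not taken from the embodiment.

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class StarGroupEntry:
    """One row of the star-group feature database of fig. 6."""
    numbers: Tuple[int, ...]                    # HIP numbers N1-N5, N1 being the reference star
    formed_angles: Tuple[float, float]          # AD1, AD2 computed from catalogue positions
    great_circle_distances: Tuple[float, ...]   # D1-D4, normalised to [0, 1]
```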
Each feature amount held by the star group feature amount storage 105 is calculated based on the celestial position information (right ascension and declination) of the stars held by the star feature amount storage 101. An example of a method of calculating the feature amounts based on celestial position information will be described below with reference to figs. 7 and 8.
Fig. 7 is an explanatory diagram for explaining an example of a method of calculating the great circle distance D1 based on celestial body position information. In fig. 7, an example of a method of calculating the great circle distance D1 between the star S1 and the star S2 (more precisely, the center angle formed by the arcs connecting the star S1 and the star S2) will be described.
The great circle distance D1 between the star S1 and the star S2 is calculated using the right ascension RA and the declination DE of each of the star S1 and the star S2. For example, let the right ascension of the star S1 be RA1 and its declination be DE1, and let the right ascension of the star S2 be RA2 and its declination be DE2. In this case, the great circle distance D1 between the star S1 and the star S2 can be calculated using the following equation (number 1).
[number 1]
D1 = arccos{ sin(DE1) × sin(DE2) + cos(DE1) × cos(DE2) × cos(RA1 − RA2) } (number 1)
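Expressed as code, the same computation might look as follows (a sketch assuming the right ascension and declination have already been converted to radians; catalogue values are typically given in hours and degrees, so a unit conversion would be needed in practice).

```python
import numpy as np

def catalog_great_circle(ra1, de1, ra2, de2):
    """Central angle between two catalogue stars, per expression (number 1)."""
    cos_d = (np.sin(de1) * np.sin(de2)
             + np.cos(de1) * np.cos(de2) * np.cos(ra1 - ra2))
    return float(np.arccos(np.clip(cos_d, -1.0, 1.0)))
```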
Next, an example of a method of estimating the angle formed by three stars will be described with reference to fig. 8.
Fig. 8 is an explanatory diagram for explaining an example of a method of calculating the formed angle AD1 based on the celestial body position information. In fig. 8, an example of a method of calculating the angle AD1 formed by the line segment connecting the star S1 and the star S2 and the line segment connecting the star S1 and the star S3 will be described.
Using the great circle distance Y_S12 between the star S1 and the star S2, the great circle distance Y_S13 between the star S1 and the star S3, and the great circle distance Y_S23 between the star S2 and the star S3, the formed angle AD1 is calculated. More specifically, the formed angle AD1 is calculated using the following equation (number 2).
[number 2]
AD1 = arccos{ (cos(Y_S23) − cos(Y_S12) × cos(Y_S13)) / (sin(Y_S12) × sin(Y_S13)) } (number 2)
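Assuming the spherical-law-of-cosines form shown above for expression (number 2), a direct transcription into code is the following sketch (angles in radians; the function name is illustrative).

```python
import numpy as np

def catalog_formed_angle(y_s12, y_s13, y_s23):
    """Angle at star S1 of the spherical triangle S1-S2-S3, from its three sides."""
    cos_a = ((np.cos(y_s23) - np.cos(y_s12) * np.cos(y_s13))
             / (np.sin(y_s12) * np.sin(y_s13)))
    return float(np.arccos(np.clip(cos_a, -1.0, 1.0)))
```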
The database held by the star group feature quantity storage unit 105 described above may be prepared in advance on the ground or may be updated appropriately according to the imaging situation in the space.
Next, details of processing performed by the star specifying unit 209 to specify the star corresponding to the bright point will be described.
< 3.3. Determination of star >
The star determining section 209 determines a star corresponding to at least any one of the five bright points used by the feature amount calculation section 205 to calculate the feature amounts, based on the feature amounts calculated by the feature amount calculation section 205 and the feature amounts held by the star group feature amount storage section 105.
For example, the star determining section 209 compares the four great circle distances and two formed angles calculated by the feature amount calculating section 205 from the five bright points with the four great circle distances and two formed angles held for each star group by the star group feature amount storage section 105. Then, the star determining section 209 determines the star group including the stars corresponding to the five bright points based on the result of the comparison, and thereby determines the star corresponding to each of the five bright points.
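The embodiment only states that the two sets of feature amounts are compared; one simple realisation of that comparison is a nearest-neighbour search over the stored feature vectors, sketched below under the assumption that the database is a list of (HIP numbers, feature vector) pairs with the feature vector ordered as D1-D4 followed by AD1, AD2.

```python
import numpy as np

def match_star_group(image_features, star_group_db):
    """Return the HIP numbers of the star group whose stored feature vector is
    closest (Euclidean distance) to the feature vector measured from the image."""
    f = np.asarray(image_features, dtype=float)
    best = min(star_group_db,
               key=lambda entry: np.linalg.norm(np.asarray(entry[1], dtype=float) - f))
    return best[0]   # e.g. (HIP of P1, HIP of P2, ..., HIP of P5)
```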
For example, when external disturbance such as noise occurs in the camera 5, or when an obstacle such as space debris is included in the angle of view, a bright point that should originally appear in the image may not appear. Depending on the influence of such a photographing environment, the star specifying unit 209 may not be able to specify the stars corresponding to all of the five bright points. In that case, the star determining unit 209 may determine a star corresponding to at least any one of the five bright points.
The details of the processing by which the star specifying unit 209 specifies the star corresponding to a bright point have been described above. Next, details of the process of estimating the pose of the camera 5 by the pose estimating unit 213 will be described.
< 3.4. Attitude estimation >
The posture estimating section 213 estimates the posture of the camera 5 based on the celestial body position information of the star corresponding to each of the at least three or more bright points determined by the star determining section 209.
First, the posture estimating unit 213 calculates the Xw, Yw, and Zw coordinates, in the world coordinate system W (Xw, Yw, Zw), of each star identified as the star corresponding to a bright point, by the following expressions (number 3) to (number 5), using the right ascension RA and declination DE of each star.
[number 3]
Xw = −cos(DE) × cos(RA) (number 3)
[number 4]
Yw = −cos(DE) × sin(RA) (number 4)
[number 5]
Zw = −sin(DE) (number 5)
Then, the posture estimating unit 213 sets up, by the following expression (number 6), a relational expression relating the world coordinates (Xw, Yw, Zw) of the star, the posture (R11 to R33) of the camera 5, and the image position (u, v) of the bright point.
[ number 6]
Then, the posture estimating unit 213 obtains a relational expression based on the above expression (number 6) for each bright point whose corresponding star has been determined by the star determining unit 209.
Then, the pose estimation unit 213 estimates the pose (R11 to R33) of the camera 5 based on the obtained at least three or more relational expressions. In this case, the pose estimation unit 213 may estimate the pose (R11 to R33) of the camera 5 based on the at least three or more relational expressions and a statistical technique. The statistical technique may be, for example, the least squares method.
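As a sketch of this step, the following code converts each identified star to a world-frame direction via expressions (number 3) to (number 5) and solves for the rotation matrix (R11 to R33) in the least-squares sense using the SVD solution of Wahba's problem; the use of unit direction vectors and the SVD formulation are illustrative choices standing in for the relational expressions of (number 6), not details stated in the embodiment.

```python
import numpy as np

def star_world_vector(ra, de):
    """World-frame direction of a star from expressions (number 3)-(number 5),
    with right ascension ra and declination de in radians."""
    return np.array([-np.cos(de) * np.cos(ra),
                     -np.cos(de) * np.sin(ra),
                     -np.sin(de)])

def estimate_camera_attitude(world_vecs, camera_vecs):
    """Least-squares rotation R mapping world-frame star directions onto the
    camera-frame directions of the matched bright points (Wahba's problem)."""
    W = np.asarray(world_vecs, dtype=float)    # N x 3, from star_world_vector
    C = np.asarray(camera_vecs, dtype=float)   # N x 3, e.g. from to_unit_vector
    U, _, Vt = np.linalg.svd(C.T @ W)
    d = np.sign(np.linalg.det(U @ Vt))
    return U @ np.diag([1.0, 1.0, d]) @ Vt     # proper rotation, det(R) = +1
```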
The details relating to the present disclosure are described above. Next, an operation processing example of the information processing apparatus 10 according to the present disclosure will be described in order with reference to fig. 9.
< 4. Action processing example >
Fig. 9 is an explanatory diagram for explaining an example of operation processing of the information processing apparatus 10 according to the present disclosure. First, the control unit 200 acquires an image obtained by photographing the universe from the camera 5 (S101).
Then, the bright point detection section 201 detects a plurality of bright points from the image acquired from the camera 5 (S105).
Next, the feature amount calculation unit 205 calculates a feature amount showing a positional relationship of at least two or more bright spots (S109).
Next, the star determiner 209 compares the feature amounts of the at least two or more bright points obtained from the image with the feature amounts of each of the plurality of star groups held in the star group feature amount storage 105 (S113).
Then, the sidereal determining unit 209 determines a sidereal corresponding to the bright point whose feature value has been calculated by the feature value calculating unit 205 based on the comparison result of S113 (S117).
Then, the posture estimating unit 213 estimates the posture of the camera 5 based on the celestial body position information of each of the at least three or more stars determined in S117 (S121), and the information processing device 10 according to the present disclosure ends the processing.
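Stitching together the illustrative helpers sketched in the earlier sections, the whole flow of S101 to S121 could look roughly like this (star_db is assumed to map a HIP number to its (right ascension, declination) in radians; every name here comes from the sketches above, not from the embodiment).

```python
def identify_and_estimate(image, f, cx, cy, star_group_db, star_db):
    """End-to-end sketch of the operation flow S101-S121."""
    points = detect_bright_points(image)                              # S105: detect bright points
    p1, p2, p3, p4, p5 = select_five_points(points, image.shape)      # select five bright points
    features = image_feature_vector(p1, p2, p3, p4, p5, f, cx, cy)    # S109: image-side features
    hip_numbers = match_star_group(features, star_group_db)           # S113, S117: identify stars
    world = [star_world_vector(*star_db[h]) for h in hip_numbers]
    camera = [to_unit_vector(p, f, cx, cy) for p in (p1, p2, p3, p4, p5)]
    return estimate_camera_attitude(world, camera)                    # S121: camera attitude
```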
The operation processing example according to the present disclosure is described above. Next, an example of the operation and effect of the present disclosure will be described.
< 5. Working Effect example >
According to the present disclosure described above, various operational effects can be obtained. For example, the information processing apparatus 10 according to the present disclosure can determine the star corresponding to a bright point with high accuracy, even in a photographing environment in which the posture of the camera 5 can change, by using the great circle distance as a feature amount that is highly robust to rotation and scale of the camera 5. In addition, the size of the feature amount database stored in the star group feature amount storage 105 can be kept to about 500 KB. Thus, the functions of the information processing apparatus 10 can be realized even in a small device such as a microcomputer, which reduces cost. As a result, the information processing apparatus 10 can be mounted on, for example, a small satellite.
In addition, since the feature amount calculation section 205 selects, as the second to fifth bright points, not bright points concentrated in one area of the image but bright points dispersed in the respective directions within the image, the star determination section 209 can determine the stars corresponding to the first to fifth bright points with higher accuracy.
Further, the posture estimating unit 213 according to the present disclosure sets up a relational expression relating the posture of the camera 5 and the image position of a bright point, based on the celestial position information of the star determined to correspond to that bright point. Then, the posture estimating unit 213 estimates the posture of the camera 5 from the relational expressions obtained for each of the three or more bright points determined by the star determining unit 209, using a statistical technique such as the least squares method. Thus, the pose estimation unit 213 can uniquely estimate the pose of the camera 5 without computing candidate values.
< 6. Hardware configuration example >
The embodiments according to the present disclosure are described above. The above-described information processing is realized by cooperation of software and hardware of the information processing apparatus 10 described below.
Fig. 10 is a block diagram showing a hardware configuration of the information processing apparatus 10 according to the present disclosure. The information processing apparatus 10 includes a CPU (Central Processing Unit: central processing unit) 1001, a ROM (Read Only Memory) 1002, a RAM (Random Access Memory: random access Memory) 1003, and a host bus 1004. The information processing apparatus 10 further includes a bridge 1005, an external bus 1006, an interface 1007, an input device 1008, an output device 1010, a storage device (HDD) 1011, a drive 1012, and a communication device 1015.
The CPU 1001 functions as an arithmetic processing device and a control device, and controls the overall operation of the information processing apparatus 10 according to various programs. The CPU 1001 may be a microprocessor. The ROM 1002 stores programs, operation parameters, and the like used by the CPU 1001. The RAM 1003 temporarily stores programs used in the execution of the CPU 1001, parameters that change appropriately during that execution, and the like. These are connected to each other via a host bus 1004 constituted by a CPU bus or the like. The functions of the bright point detection unit 201, the feature amount calculation unit 205, the star determination unit 209, and the like described with reference to fig. 2 can be realized by cooperation of the CPU 1001, the ROM 1002, and the RAM 1003 with software.
The host bus 1004 is connected to an external bus 1006 such as a PCI (Peripheral Component Interconnect/Interface: external device interconnect/Interface) bus through a bridge 1005. It should be noted that the host bus 1004, the bridge 1005, and the external bus 1006 are not necessarily separately configured, and these functions may be mounted on one bus.
The input device 1008 is configured by an input unit for inputting information by a user such as a mouse, a keyboard, a touch panel, a button, a microphone, a switch, and a joystick, an input control circuit that generates an input signal based on an input of the user, and outputs the input signal to the CPU1001, and the like. By operating the input device 1008, the user of the information processing apparatus 10 can input various data or instruct processing operations to the information processing apparatus 10.
The output device 1010 includes, for example, a display device such as a liquid crystal display device, an OLED device, and a lamp. The output device 1010 includes sound output devices such as a speaker and a headphone. The output device 1010 outputs reproduced content, for example. Specifically, the display device displays various information such as reproduced video data in the form of text or pictures. On the other hand, the sound output device converts reproduced sound data or the like into sound and outputs the sound.
The storage device 1011 is a device for data storage. The storage device 1011 may include a storage medium, a recording device for recording data on the storage medium, a reading device for reading data from the storage medium, a deleting device for deleting data recorded on the storage medium, and the like. The storage device 1011 is constituted of, for example, an HDD (Hard Disk Drive). The storage device 1011 drives a hard disk, and stores various data and programs executed by the CPU 1001.
The drive 1012 is a reader/writer for a storage medium, and is built in or externally connected to the information processing apparatus 10. The drive 1012 reads information recorded on the removable storage medium 60 such as a mounted magnetic disk, optical disk, magneto-optical disk, or semiconductor memory, and outputs the information to the RAM1003. In addition, the drive 1012 is also capable of writing information to the removable storage medium 60.
The communication device 1015 is, for example, a communication interface constituted by a communication apparatus or the like for connecting to the network 50. The communication device 1015 may be a wireless LAN communication device, an LTE (Long Term Evolution: long term evolution) communication device, or a wire communication device that performs wired communication.
The hardware configuration examples related to the present disclosure are described above. Next, a description will be given of a supplement to the present disclosure.
< 7. Supplement >
The preferred embodiments of the present disclosure have been described in detail above with reference to the accompanying drawings, but the present disclosure is not limited to this example. As a matter of course, various modifications and corrections can be made by those having ordinary knowledge in the art to which the present disclosure pertains within the scope of the technical idea described in the claims, and these are naturally considered to be within the technical scope of the present disclosure.
For example, in the present specification, an example has been described in which the feature amount calculating section 205 selects five bright points of the first to fifth bright points and calculates the feature amount from the five bright points, but when determining the star corresponding to the bright point, the feature amount calculating section 205 does not necessarily need to select five bright points and calculate the feature amount from the five bright points. For example, when determining the star corresponding to the bright points, the bright points required for calculating the feature quantity may be at least two or more. In addition, when estimating the pose of the camera 5, at least three or more bright spots are required for calculating the feature quantity. The information processing device 10 according to the present disclosure may change the number of selected bright spots and the number of calculated feature amounts as appropriate according to the application and the required accuracy.
In the present description, an example in which the camera 5 and the information processing apparatus 10 are mounted on the satellite 1 is described, but the camera 5 and the information processing apparatus 10 may be mounted on any device on the earth.
The feature amount calculation unit 205 may increase or decrease the number of calculated feature amounts according to how the bright points appear in the image. For example, when the bright points included in the image are blurred, the feature amount calculation section 205 may increase the number of selected bright points and calculate more feature amounts. Thus, even when bright points appear degraded in the image due to the sensitivity or performance of the camera 5, the star specifying unit 209 can specify the stars corresponding to the bright points with higher accuracy.
The steps in the processing of the information processing apparatus 10 in the present specification are not necessarily required to be processed in time series in the order described as a flowchart. For example, the steps in the processing of the information processing apparatus 10 may be processed in a different order from the order described as the flowchart or may be processed in parallel.
Further, a computer program for performing functions equivalent to the respective components of the information processing apparatus 10 can be produced in hardware such as a CPU, a ROM, and a RAM incorporated in the information processing apparatus 10. In addition, a non-transitory storage medium storing the computer program is also provided.
The effects described in the present specification are merely illustrative or exemplary, and are not intended to be limiting. That is, the technology according to the present disclosure can exert the above-described effects, and can also exert other effects obvious to those skilled in the art from the description of the present specification, in addition to or instead of the above-described effects.
The following structures also fall within the technical scope of the present disclosure.
An information processing device is provided with:
a detection unit that detects a plurality of bright spots from an image obtained by photographing the universe with a camera;
a calculation unit configured to calculate a first feature amount showing a positional relationship of at least two or more bright spots among the plurality of bright spots detected by the detection unit; and
and a determining unit configured to determine a star corresponding to at least any one of the two or more bright points based on the first feature amount and the second feature amount calculated by the calculating unit, the second feature amount including a central angle between stars in an equatorial coordinate system based on the celestial positional information of the at least two or more stars.
(2) The information processing apparatus according to the item (1), wherein,
the calculation unit calculates the first feature amount based on an image position of a first bright point, which is a bright point satisfying a predetermined condition among the detected plurality of bright points, and an image position of a second bright point, which is a bright point closest to the image position of the first bright point among bright points whose distance from the image position of the first bright point is equal to or greater than a predetermined value.
(3) The information processing apparatus according to the item (2), wherein,
the first feature quantity includes the center angle between each bright point based on an image position of each of the at least two or more bright points assuming that the bright point is a star.
(4) The information processing apparatus according to the item (2) or the item (3), wherein,
the at least two or more bright points are three or more bright points,
the information processing apparatus further includes a posture estimating unit that estimates a posture of the camera based on the celestial body position information of the star corresponding to each of the three or more bright points determined by the determining unit.
(5) The information processing apparatus according to the item (4), wherein,
the calculating section calculates the first feature amount based on an image position of a third bright point and the image position of the first bright point, the third bright point being a bright point that, next after the second bright point, is closest to the first bright point among bright points existing within a predetermined range from the direction symmetric to the second bright point with the first bright point as a starting point.
(6) The information processing apparatus according to the item (5), wherein,
the first characteristic amount includes angles formed by the three or more bright spots,
The second feature quantity includes angles formed by the three or more stars based on the center angles of the three or more stars.
(7) The information processing apparatus according to the item (6), wherein,
the first feature quantity includes an angle formed by a line segment connecting the first bright point and the second bright point and a line segment connecting the first bright point and the third bright point as an angle formed by the three or more bright points.
(8) The information processing apparatus according to the item (7), wherein,
the three or more bright spots are four or more bright spots,
the calculating section calculates the first feature amount based on an image position of a fourth bright point and the image position of the first bright point, the fourth bright point being a bright point that, next after the third bright point, is closest to the first bright point among bright points existing within a predetermined range from the direction symmetric to the third bright point with the first bright point as a starting point.
(9) The information processing apparatus according to the item (8), wherein,
the first feature quantity includes an angle formed by a line segment connecting the first bright point and the third bright point and a line segment connecting the first bright point and the fourth bright point as angles formed by the three or more bright points.
(10) The information processing apparatus according to the item (9), wherein,
the four or more bright spots are five or more bright spots,
the calculating section calculates the first feature amount based on an image position of a fifth bright point and the image position of the first bright point, the fifth bright point being a bright point that, next after the fourth bright point, is closest to the first bright point among bright points existing within a predetermined range from the direction symmetric to the fourth bright point with the first bright point as a starting point.
(11) The information processing apparatus according to any one of the above (4) to (10), wherein,
the posture estimating unit sets up a relational expression relating the posture of the camera and the image position of a bright point, based on the celestial position information of the star determined by the determining unit to correspond to that bright point, and estimates the posture of the camera, using a statistical technique, from the relational expressions obtained for each of the three or more bright points determined by the determining unit.
(12) An information processing method performed by a computer, the information processing method comprising:
detecting a plurality of bright spots from an image obtained by photographing the universe with a camera;
calculating a first feature quantity showing a positional relationship of at least two or more bright spots among the detected plurality of bright spots; and
And determining a star corresponding to at least any one of the two or more bright points based on the calculated first feature quantity and second feature quantity, wherein the second feature quantity comprises a central angle between stars in an equatorial coordinate system based on the celestial body position information of the at least two or more stars.
(13) A program for causing a computer to realize the functions of:
a detection function of detecting a plurality of bright spots from an image obtained by photographing a universe with a camera;
a calculation function of calculating a first feature quantity showing a positional relationship of at least two or more bright spots among the plurality of bright spots detected by the detection function; and
and a determining function of determining a star corresponding to at least any one of the two or more bright points based on the first feature amount and the second feature amount calculated by the calculating function, the second feature amount including a central angle in an equatorial coordinate system between the stars based on the celestial positional information of the at least two or more stars.
Description of the reference numerals
1. Artificial satellite
5. Camera
10. Information processing apparatus
100. Storage unit
101. Star feature quantity storage unit
105. Star group feature quantity storage unit
200. Control unit
201. Bright spot detecting part
205. Feature quantity calculating unit
209. Fixed star determining part
213. Attitude estimation unit

Claims (13)

1. An information processing device is provided with:
a detection unit that detects a plurality of bright spots from an image obtained by photographing the universe with a camera;
a calculation unit configured to calculate a first feature amount showing a positional relationship of at least two or more bright spots among the plurality of bright spots detected by the detection unit; and
and a determining unit configured to determine a star corresponding to at least any one of the two or more bright points based on the first feature amount and the second feature amount calculated by the calculating unit, the second feature amount including a central angle between stars in an equatorial coordinate system based on the celestial positional information of the at least two or more stars.
2. The information processing apparatus according to claim 1, wherein,
the calculation unit calculates the first feature amount based on an image position of a first bright point, which is a bright point satisfying a predetermined condition among the detected plurality of bright points, and an image position of a second bright point, which is a bright point closest to the image position of the first bright point among bright points whose distance from the image position of the first bright point is equal to or greater than a predetermined value.
3. The information processing apparatus according to claim 2, wherein,
the first feature quantity includes the center angle between each bright point based on an image position of each of the at least two or more bright points assuming that the bright point is a star.
4. The information processing device according to claim 3, wherein
the at least two or more bright points are three or more bright points, and
the information processing device further comprises an attitude estimation unit that estimates an attitude of the camera based on the celestial position information of the star corresponding to each of the three or more bright points determined by the determination unit.
5. The information processing device according to claim 4, wherein
the calculation unit calculates the first feature amount based on an image position of a third bright point, the third bright point being, among bright points lying within a predetermined range from the direction symmetric to that of the second bright point with the first bright point as an origin, the bright point that is next closest to the first bright point after the second bright point.
6. The information processing device according to claim 5, wherein
the first feature amount includes an angle formed by the three or more bright points, and
the second feature amount includes an angle formed by three or more stars, based on the central angles between the three or more stars.
7. The information processing device according to claim 6, wherein
the first feature amount includes, as the angle formed by the three or more bright points, an angle formed by a line segment connecting the first bright point and the second bright point and a line segment connecting the first bright point and the third bright point.
8. The information processing device according to claim 7, wherein
the three or more bright points are four or more bright points, and
the calculation unit calculates the first feature amount based on an image position of a fourth bright point, the fourth bright point being, among bright points lying within a predetermined range from the direction symmetric to that of the third bright point with the first bright point as an origin, the bright point that is next closest to the first bright point after the third bright point.
9. The information processing device according to claim 8, wherein
the first feature amount includes, as an angle formed by the three or more bright points, an angle formed by a line segment connecting the first bright point and the third bright point and a line segment connecting the first bright point and the fourth bright point.
10. The information processing device according to claim 9, wherein
the four or more bright points are five or more bright points, and
the calculation unit calculates the first feature amount based on an image position of a fifth bright point and the image position of the first bright point, the fifth bright point being, among bright points lying within a predetermined range from the direction symmetric to that of the fourth bright point with the first bright point as an origin, the bright point that is next closest to the first bright point after the fourth bright point.
11. The information processing device according to claim 10, wherein
the attitude estimation unit derives, for one bright point, a relational expression between the attitude of the camera and the image position of the bright point, based on the celestial position information of the star determined by the determination unit to correspond to that bright point, and estimates the attitude of the camera by applying a statistical technique to the relational expressions derived for each of the three or more bright points determined by the determination unit.
12. An information processing method performed by a computer, the information processing method comprising:
detecting a plurality of bright points from an image obtained by photographing space with a camera;
calculating a first feature amount indicating a positional relationship of at least two or more bright points among the detected plurality of bright points; and
determining a star corresponding to at least one of the two or more bright points based on the calculated first feature amount and a second feature amount, the second feature amount including a central angle between stars in an equatorial coordinate system based on celestial position information of at least two or more stars.
13. A program for causing a computer to realize the functions of:
a detection function of detecting a plurality of bright points from an image obtained by photographing space with a camera;
a calculation function of calculating a first feature amount indicating a positional relationship of at least two or more bright points among the plurality of bright points detected by the detection function; and
a determination function of determining a star corresponding to at least one of the two or more bright points based on the first feature amount calculated by the calculation function and a second feature amount, the second feature amount including a central angle between stars in an equatorial coordinate system based on celestial position information of at least two or more stars.
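Claims 4 and 11 above leave the "statistical technique" used for attitude estimation unspecified. As one concrete possibility, the sketch below estimates the camera attitude from three or more identified stars by a least-squares fit (an SVD-based solution of Wahba's problem). This particular technique, and every name and numeric value in the example, are assumptions for illustration rather than the claimed method.

```python
# Hedged sketch: SVD least-squares attitude fit from identified stars.
import numpy as np

def radec_to_unit_vector(ra, dec):
    """Unit vector in the equatorial frame from right ascension / declination (rad),
    i.e. the celestial position information of an identified star."""
    return np.array([np.cos(dec) * np.cos(ra),
                     np.cos(dec) * np.sin(ra),
                     np.sin(dec)])

def estimate_attitude(camera_dirs, catalog_dirs):
    """Rotation R (camera <- equatorial) that best maps catalog unit vectors onto
    the measured bright-point unit vectors of three or more identified stars."""
    B = np.zeros((3, 3))
    for v_cam, v_eq in zip(camera_dirs, catalog_dirs):
        B += np.outer(v_cam, v_eq)          # accumulate the attitude profile matrix
    U, _, Vt = np.linalg.svd(B)
    d = np.sign(np.linalg.det(U @ Vt))      # enforce a proper rotation (det = +1)
    return U @ np.diag([1.0, 1.0, d]) @ Vt

# Example with three identified stars (hypothetical right ascension / declination values):
catalog = [radec_to_unit_vector(ra, dec)
           for ra, dec in [(0.1, 0.2), (1.0, -0.3), (2.5, 0.7)]]
true_R = np.eye(3)                          # pretend the camera frame coincides with the equatorial frame
measured = [true_R @ v for v in catalog]
R_est = estimate_attitude(measured, catalog)
```

With noisy measurements, the same fit acts as the statistical averaging step: each identified bright point contributes one direction pair, and the SVD returns the rotation minimizing the summed squared residuals.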

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2021/024481 WO2023275970A1 (en) 2021-06-29 2021-06-29 Information processing device, information processing method, and program

Publications (1)

Publication Number Publication Date
CN117545693A true CN117545693A (en) 2024-02-09

Family

ID=84691591

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202180099733.3A Pending CN117545693A (en) 2021-06-29 2021-06-29 Information processing device, information processing method, and program

Country Status (3)

Country Link
JP (1) JPWO2023275970A1 (en)
CN (1) CN117545693A (en)
WO (1) WO2023275970A1 (en)

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2013235490A (en) * 2012-05-10 2013-11-21 Nikon Corp Astronomical body identification device, astronomical body identification program, and camera
CN103148852B (en) * 2013-03-01 2015-08-12 国家测绘地理信息局卫星测绘应用中心 A kind of method for recognising star map based on directed loop
EP3182067A1 (en) * 2015-12-18 2017-06-21 Universite De Montpellier Method and apparatus for determining spacecraft attitude by tracking stars
CN108469261A (en) * 2018-02-07 2018-08-31 天津大学 A kind of method for recognising star map suitable for boat-carrying ultra-large vision field celestial navigation system

Also Published As

Publication number Publication date
WO2023275970A1 (en) 2023-01-05
JPWO2023275970A1 (en) 2023-01-05

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination