Sovremennye problemy distantsionnogo zondirovaniya Zemli iz kosmosa, 2022, Vol. 19, No. 3, pp. 105-113

Multiresolution in optical navigation systems of space vehicles

V.A. Grishin 1 
1 Space Research Institute RAS, Moscow, Russia
Accepted: 04.05.2022
DOI: 10.21046/2070-7401-2022-19-3-105-113
The work analyzes the possibilities of using multiresolution in optical navigation systems. Such systems provide information support for spacecraft docking in a fully automatic mode. The article notes a certain analogy between the solution of recognition problems in technical vision systems and in wildlife, in particular in the human visual system. The main attention is paid to the use of multiresolution for recognizing a spacecraft over a wide range of distances, as well as for measuring relative coordinates. Multiresolution can significantly reduce the computational cost of image processing during recognition and, most importantly, the number of reference images needed to solve the recognition and relative coordinate measurement problems. This is very important, since preparing reference images for docking with uncooperative spacecraft that are not equipped with special docking targets is a very laborious process. Since multiresolution directly affects the accuracy of relative coordinate measurements, numerical and analytical estimates of the influence of this factor on measurement accuracy are given. The estimates confirm that the deterioration in accuracy at medium and short distances is quite acceptable, while at large distances multiresolution does not degrade measurement accuracy at all.
Keywords: spacecraft navigation, vision systems, operating range
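
As an illustration only (this sketch is not part of the article), the Python fragment below shows one common way multiresolution is realized in vision processing: a Gaussian image pyramid built with OpenCV, together with a hypothetical rule that picks the pyramid level from the estimated range so that the target's apparent size stays close to that of the reference images. The function names, the reference distance and the level-selection rule are assumptions made for illustration.

```python
# Illustrative sketch only; not code from the article. Assumes OpenCV (cv2)
# and NumPy. The level-selection rule and reference distance are hypothetical.
import cv2
import numpy as np


def build_pyramid(frame: np.ndarray, levels: int = 5) -> list:
    """Gaussian pyramid: each level is smoothed and downsampled by 2x."""
    pyramid = [frame]
    for _ in range(levels - 1):
        pyramid.append(cv2.pyrDown(pyramid[-1]))
    return pyramid


def select_level(distance_m: float, reference_distance_m: float = 50.0,
                 max_level: int = 4) -> int:
    """Choose the pyramid level at which the target's apparent size roughly
    matches the reference images prepared for reference_distance_m.

    Apparent size scales as 1/distance and each pyramid level halves the
    resolution, so at long range (>= reference distance) level 0 (full
    resolution) is used, while coarser levels are selected at close range.
    """
    ratio = reference_distance_m / max(distance_m, 1e-6)
    level = int(round(np.log2(max(ratio, 1.0))))
    return min(level, max_level)
```

Under these assumptions a single set of reference images, prepared for one nominal scale, can serve the whole approach trajectory: full resolution is used at long range, so accuracy is not degraded there, while progressively coarser levels are used at medium and short range, at the cost of the modest accuracy loss discussed in the article.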
