
ASTROMETRIC MEASUREMENTS OF SATELLITES WITH DYNAMIC VISION SENSOR

Michał Żołnowski(1), Krzysztof Kamiński(2), Tobias Delbruck(3), Alicja Gąsiorowska(2), Justyna Gołębiewska(2), Marcin Gędek(1), Monika Kamińska(2), Joachim Kruger(2)(4), Mikołaj Krużyński(2), Mikołaj Pieniowski(1), Rafał Reszelewski(1), Krzysztof Romanowski(4), Edwin Wnuk(2), Wojciech Dimitrow(2), Wojciech Mich(4)

(1) 6ROADS Sp z o.o., Godebskiego Street 55a, Kraków, Poland, [email protected]
(2) Astronomical Observatory Institute, Faculty of Physics, Adam Mickiewicz University, Poznań, Poland, [email protected]
(3) Sensors Group, Institute of Neuroinformatics, UZH-ETH Zurich, Winterthurerstr. 190, CH-8057 Zurich, [email protected]
(4) ITTI Sp z o.o., Rubież 46, Poznań, Poland

ABSTRACT

The Dynamic Vision Sensor (DVS) technology, which could potentially be utilized in various SST activities, has been the subject of our research over the last several months. This type of camera records from each pixel an asynchronous stream of brightness-change events, rather than intensity integrated over a selected exposure time. Due to its novel approach to visual data retrieval and processing, the DVS needs evaluation as a potential tool in the SST domain. Several observing methods have been proposed and tested. The key parameters considered in our studies are the camera's timing accuracy and astrometric precision. The paper presents the results of the world's first astrometric measurements of satellites observed with a DVS camera, performed on a GNSS satellite. Furthermore, a series of observations of principally GEO and MEO, but also LEO, targets was conducted. Satellites were observed with a few different optical telescopes to evaluate the DVS camera's performance in different configurations. A complete observing methodology and reduction pipeline has been developed, including a dedicated GNSS clock for event timestamping and dedicated software capable of converting the DVS data stream from the native AEDAT format to a series of FITS images.

1 INTRODUCTION

Dynamic Vision Sensor technology was suggested as an interesting alternative to CCD and CMOS cameras several years ago (see [1] or [4]). Its primary advantage is a very large dynamic range allowing for daytime detections. The principle of operation of DVS cameras also seems to fit satellite survey observations well, with its low latency and sparse, low data rate. On the other hand, the DVS technology is relatively new and prototypical, the sensors are uncooled, and large high-sensitivity sensors with fully developed low-light capabilities are not yet available. In this project we decided to use the first back-side-illuminated hybrid DVS and imaging camera prototype, namely the DAVIS 346BSI developed by the University of Zurich.

2 DVS SPECIFICATIONS

The DVS camera which underwent thorough examination was the DAVIS 346BSI. Its pixel size is 18.5 μm x 18.5 μm, the array size is 6.4 mm x 4.8 mm, the fill factor is 100% and the peak quantum efficiency is 93%. Three copies of the camera were assembled for the tests. Fig. 1 presents the DAVIS 346BSI camera.

Figure 1. DAVIS 346BSI camera.

3 HARDWARE & SOFTWARE PREPARATION

By default, the camera's firmware timestamps each recorded event with microsecond resolution using its internal oscillator, without reference to any external source. The beginning of each recording (event stream) is time-stamped by the PC control software (jaerViewer), which can be referenced to UTC via NTP with a typical accuracy at the level of 10 milliseconds. LEO tracking requires sub-millisecond accuracy, and millisecond accuracy is required for higher orbits. In order to meet those accuracy requirements and to provide a time reference independent of Internet connection delays, external time impulses are periodically injected into the DAVIS 346BSI camera. A dedicated GNSS-based clock has been constructed to generate these impulses with microsecond accuracy. They are recorded by the DAVIS 346BSI camera as so-called "special" events, which are time-stamped in the same way as normal events but can be easily distinguished from them. Two time measurements of the same impulse, one by the GNSS clock and one by the DAVIS 346BSI camera timestamp, enable the conversion of the camera's internal clock to the UTC time scale. Fig. 2 presents the scheme of the DAVIS 346BSI camera event timing system.

Figure 2. DAVIS 346BSI camera event timing system.
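
As a concrete illustration of this conversion, the following minimal Python sketch (an illustration under our own assumptions, not the project's firmware or converter code) maps internal DVS timestamps to UTC by interpolating between paired GNSS pulses; the arrays of pulse UTC times and of the corresponding "special" event timestamps are assumed to have been extracted beforehand.

import numpy as np

def internal_to_utc(event_ts_us, special_ts_us, pulse_utc_s):
    """Convert internal DVS timestamps [us] to UTC seconds of day.

    event_ts_us   -- internal timestamps of the events to convert
    special_ts_us -- internal timestamps of the recorded GNSS "special" events
    pulse_utc_s   -- UTC times (seconds of day) of the same pulses from the GNSS clock
    """
    # Piecewise-linear interpolation between consecutive pulses; this also
    # absorbs any slow drift of the camera's internal oscillator.
    return np.interp(event_ts_us, special_ts_us, pulse_utc_s)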

Subsequently, it was necessary to develop software which would enable the effective use of the data provided by the DAVIS 346BSI cameras with regular astronomical reduction and analysis pipelines. The software, AEDAT Converter, performs two tasks simultaneously. First, it transforms the event data stream from the AEDAT format to FITS images by counting all events in each pixel between selected start and stop times and saving them as integer ADUs. The user is able to select single or multiple time segments and the event types to be saved. The second functionality is to linearly interpolate the time passage between recorded "special" events and convert internal DVS timestamps into UTC for each event individually. Overall, this software allows the user to select how many images each recording is converted into and therefore controls the timing resolution of the resulting image series, down to 1 microsecond, which is the limit of the DVS timestamp precision. Fig. 3 presents the scheme of the process of converting timestamps to UTC.

Figure 3. Conversion of timestamps to UTC.
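
The event-counting step of the conversion can be sketched as follows; this is an illustrative Python reconstruction, not the AEDAT Converter itself, and it assumes the decoded event coordinates and UTC times are already available as NumPy arrays (the 346 x 260 geometry matches the DAVIS 346 sensor).

import numpy as np
from astropy.io import fits

def events_to_fits(x, y, utc_s, t_start, t_stop, filename,
                   width=346, height=260):
    """Count events per pixel between t_start and t_stop and save them as a FITS image."""
    sel = (utc_s >= t_start) & (utc_s < t_stop)
    img = np.zeros((height, width), dtype=np.int32)
    # Each selected event adds one ADU to its pixel.
    np.add.at(img, (y[sel], x[sel]), 1)
    hdu = fits.PrimaryHDU(img)
    hdu.header["EXPTIME"] = (t_stop - t_start, "time segment length [s]")
    hdu.writeto(filename, overwrite=True)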

4 LABORATORY ANALYSIS

A series of laboratory tests with artificial, moving light sources was conducted prior to observations in order to determine a set of optimum DVS camera settings. Two teams using different approaches found slightly different combinations of settings, which were later used during the observations. The results are presented in Tab. 1.

Parameter   Default settings   6Roads Night   6Roads Daylight   AO AMU Night   AO AMU Daylight
DiffBn      5.1n               115.5n         31.7n             51.7n          51.7n
OnBn        194.5n             317.6n         95.7n             244.6n         486.3n
OffBn       124.6p             0              0                 97.9p          49.3p
PrBp        7.9p               6.4p           6.0p              7.9p           7.9p
PrSFBp      8.0p               4.6p           4.3p              8.0p           8.0p
RefrBp      593.6p             5.6n           5.1n              593.6p         593.6p

Table 1. DAVIS 346BSI camera settings used during the night-time and daytime observations.

In order to better understand the DVS camera behaviour in low-light observations, a simple test was conducted to measure the delays of two simultaneous light-source blinks. Using our DAVIS 346BSI camera equipped with a GNSS clock, we recorded blinks of a LED controlled with another GPS receiver, in order to compare the actual time of the LED lighting up with the recorded time. Additionally, by also observing a reflection of the LED from a piece of glass, we had the opportunity to compare the reaction time at about 25x lower illumination, as illustrated in Fig. 4.

Figure 4. Testing setup for the timing of the blinking LED experiment.

Figure 5. DAVIS 346BSI camera recording of a blinking LED and its reflection at three time intervals. In the top image no events are recorded even though a new second has started. In the middle image the first events from the brighter light source are detected, but no events from its reflection. In the bottom image the dimmer light source blink is finally detected.

DVS setup                                  Bright source (direct LED)   Dim source (reflected LED)
Nighttime                                  0.0029s ± 0.0006s            0.015s ± 0.003s
Updated nighttime (On bias 1.32x higher)   0.0046s ± 0.0014s            0.020s ± 0.005s
Daytime (On bias 2.27x lower)              0.0034s ± 0.0006s            0.017s ± 0.004s

Table 2. DAVIS 346BSI delays in recording a blinking LED and its 25x dimmer reflection.

As presented in Fig. 5 and Tab. 2, we detected measurable delays in all recordings, but the 25x dimmer light source had a significantly larger delay. This result shows that the DVS camera time bias is mostly negligible in the case of GEO satellite observations, but significant for low-orbit targets. It also shows that the DVS camera can have different time biases for simultaneously observed objects if their brightness differs significantly, which can be difficult to correct for during analysis.
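
For illustration, the delay values of Tab. 2 can be estimated from the converted data with a simple calculation like the Python sketch below (hypothetical arrays, not the code actually used): the delay is the difference between the first "on" event recorded inside the source aperture and the GPS-logged LED turn-on time.

import numpy as np

def detection_delay(event_utc_s, led_on_utc_s):
    """Delay between a GPS-timed LED turn-on and the first DVS "on" event after it.

    event_utc_s  -- UTC times of "on" events inside the source aperture
    led_on_utc_s -- UTC time at which the LED was switched on (from the GPS receiver)
    """
    later = event_utc_s[event_utc_s >= led_on_utc_s]
    return later.min() - led_on_utc_s if later.size else np.nan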

5 DVS OBSERVING METHODOLOGY

In both satellite survey and tracking observations, the DAVIS 346BSI camera struggles to detect stationary objects with a 0.3 m telescope unless their magnitude is brighter than about 8. New observing techniques were developed in order to overcome this limitation.

5.1 Rotating shutter

The fundamental rule of observations performed with an event-based sensor is the need for changes in the field of view. Tracked objects, such as stars or satellites, trigger events only because of scintillation or tracking errors, and thus have low SNR compared to objects that move.

The initial solution devised by the project's members was to employ a regular astronomical focuser to defocus the stellar images and then rapidly refocus them. Rapid refocusing would create an effect similar to the reappearance of all objects in the field of view. However, this method proved to be inefficient.

Further tests consisted of quickly covering and uncovering the end of the telescope, which was aimed towards the sky. The dramatic increase in flux from zero, when the telescope was covered, to full aperture when it was opened, triggered a clearly visible cascade of star-caused events. Eventually, a commercial 5-position colour filter wheel produced by ASI ZWO was modified to periodically cover and evenly uncover the DAVIS chip. Subsequently, it was installed between the camera and the telescope.

Figure 6. The modified CFW with modified shutter blades, prior to attachment to the camera.

The filter wheel was modified to spin continuously at a user-selectable rate using an Arduino board. During observations, the wheel acts as a rotating shutter, similar to the shutters used with meteor cameras. Although the initial results of this method were promising, it was noted that better darkening of the shutter's interior was necessary, as stray light significantly worsened the overall performance. Therefore, the entire interior of the wheel's chamber as well as the connector were covered with black material, which improved the results.

Further evaluation of the rotating shutter method revealed that it improves the limiting magnitude for stationary objects to a large extent, although not quite to the same level as telescope movement. It is worth noting that the test was performed during full Moon, so the bright background could have had a negative impact on the results.

Figure 7. Observations of the Alcyone star region in the Pleiades with the shutter method. The events were triggered just after the blade of the shutter-like wheel uncovered the DAVIS surface. The brightness of the stars: (a) 2.85 m, (b) 8.52 m, (c) 6.29 m, (d) 8.97 m, (e) 8.71 m.

Figure 8. Observations of the Alcyone star region in the Pleiades during slewing of the mount with an angular velocity of 240 arcsec/sec. Visible are waves of ON and afterwards OFF events generated by the movement toward the lower-right corner. The brightness of the stars: (a) 2.85 m, (b) 8.52 m, (c) 6.29 m, (d) 8.97 m, (e) 8.71 m.

5.2 Epicycle tracking

Typically, satellite tracking is based directly on the satellite's ephemeris. When the ephemeris is sufficiently accurate and the telescope's equatorial mount is accurate enough, the target satellite is kept constantly on the same camera pixels. In tracking with the DVS camera this significantly reduced the limiting magnitude (by about 3-4 mag), and thus a different approach has been developed.

The satellite ephemeris was augmented with a circular motion, with a period of 6 seconds and a radius ranging from 6 to 15 arcseconds. As a result, the telescope was constantly wobbling around the ephemeris position of the satellite and the satellite image was moving in the field of view with constant velocity. By manipulating the radius and period it was possible to set the angular speed at which the satellite moved during tracking. It was found that the best results can be obtained when the radius is 6 arcsec and the angular speed of the satellite is about 6 arcsec/sec.
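
A minimal Python sketch of the offsets that such an augmented ephemeris would contain is given below (a hypothetical helper using the 6 arcsec radius and 6 s period quoted above); the actual mount command generation is hardware specific and not part of this sketch.

import numpy as np

def epicycle_offsets(t_s, dec_deg, radius_arcsec=6.0, period_s=6.0):
    """Return (dRA, dDec) offsets in arcsec to add to the ephemeris at time t_s."""
    phase = 2.0 * np.pi * t_s / period_s
    d_dec = radius_arcsec * np.sin(phase)
    # Divide by cos(Dec) so the circle keeps a constant on-sky radius.
    d_ra = radius_arcsec * np.cos(phase) / np.cos(np.radians(dec_deg))
    return d_ra, d_dec

# Resulting on-sky angular speed: 2*pi*radius/period, i.e. about 6.3 arcsec/s
# for the values above, consistent with the ~6 arcsec/sec quoted in the text.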

The drawback of this method is that it introduces complex, wave-like trails for reference stars, instead of simple, linearly elongated images. However, this issue can be solved by using a relatively long rotation period and dividing the data into time steps small enough for the curvature of the stellar trails to become undetectable. Moreover, the AO AMU astrometric software is capable of processing targets of any shape.

Figure 9. An example of "epicycle tracking" with a radius of 15 arcsec. This image shows "on" events registered while tracking GPS satellite no. 45854. The telescope is tracking in such a way that the satellite is making small circles (slightly deformed because of cos(Dec) and telescope inertia). A single star passing through the field of view is also visible.

Figure 10. A comparison of observations of the same satellite made at the same time with the DAVIS 346BSI camera using different tracking modes. All images represent "on" events recorded during 1 sec. The top image presents a satellite survey observation with the telescope in sidereal tracking mode; a GPS satellite traveling at about 45 arcsec/sec is clearly visible. The middle image presents a satellite tracking observation; the satellite is only barely detectable and only in some images (one of the best is presented here). The bottom image presents the "epicycle tracking" mode; the satellite is clearly visible, at a level comparable with survey mode, and a field star is also recorded.

6 OBSERVATIONAL TESTS

6.1 GPS satellite tracking

In order to comprehensively compare DVS and sCMOS camera performance in satellite tracking, a dedicated observation campaign was conducted between January and March 2021. The observations were carried out using two identical 12.5 inch f/8 telescopes. One of the telescopes was equipped with a DAVIS 346BSI camera, the other with an Andor Zyla 5.5 camera. Both cameras were connected to dedicated GNSS clocks for accurate image timing.

The software used for astrometry required at least 4 reference stars for a reliable solution. However, due to the narrow field of view of the DAVIS 346BSI camera, and because the satellite magnitudes were rarely bright enough for detection even when moving in the field of view, it was nearly impossible to meet this requirement using regular satellite tracking. As a result, the two novel techniques developed by 6ROADS and AO AMU, described in Section 5, were employed, as well as numerous small incremental improvements in the reduction and analysis software.

The most successful observations were conducted on the 3rd of March 2021. The observed target was a GPS satellite (NORAD 40294, COSPAR 2014-068A). The telescope equipped with the DAVIS 346BSI camera collected events from 22:33:45 to 22:54:57 UTC. The data stream was converted to 2472 FITS images, each containing the "on" events only and covering a timespan of 0.5 s.

Figure 11. An example of a GPS satellite tracking observation with the DAVIS 346BSI camera and CDK12.5 f/8 telescope. The image is composed of "on" events recorded during 0.5 seconds. The satellite is not clearly visible here.

Figure 12. An example of a GPS (40294) satellite tracking observation with the DAVIS 346BSI camera and CDK12.5 f/8 telescope. The image is composed of "on" events recorded during 6.0 seconds and is presented only to show where the satellite was observed (inside the red circle). The telescope epicycle tracking, with the satellite constantly making small (R = 6 arcsec) circles, is visible.

The converted data was then analysed with the Poznań Satellite Software Tools. Astrometric data were automatically extracted from 8 FITS files obtained from the DAVIS 346BSI and from 992 FITS files obtained from the Andor Zyla. The difference reflects the significantly smaller field of view of the DAVIS 346BSI camera and its approximately 2 mag worse limiting magnitude. Rarely were more than 3 stars with SNR higher than 2 observed simultaneously, which the software needs in order to identify the observed field. Fig. 13 presents a comparison of selected astrometric positions derived using the DAVIS 346BSI and Zyla cameras.

Figure 13. Comparison of selected astrometric positions of the GPS (40294) satellite derived on 08-03-2021 using the DAVIS 346BSI and Zyla cameras.

All astrometric positions derived with both cameras were compared with the high-accuracy SP3 ephemeris provided by the GNSS operator. In order to derive both the systematic and random shifts in RA and Dec, the differences were analysed statistically. Moreover, the differences along and across the satellite orbit were calculated in order to better understand the nature of the observed inaccuracies. Systematic differences along the orbit were associated primarily with image or event timing errors, and random differences across the orbit were associated with the overall astrometric accuracy. Large systematic differences across the satellite's orbit should not be observed unless there is an error in the data reduction or in the ephemeris; small differences might be present due to, for example, non-central target placement in the images or systematic asymmetries in stellar images caused by imperfect collimation, field rotation, etc.
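
The decomposition can be illustrated with the following Python sketch (a reconstruction under our own assumptions, not the Poznań Satellite Software Tools code), which projects RA/Dec residuals onto unit vectors along and across the apparent trajectory.

import numpy as np

def along_across(d_ra_arcsec, d_dec_arcsec, ra_rate, dec_rate, dec_deg):
    """Project RA/Dec residuals onto the along- and across-track directions.

    ra_rate, dec_rate -- apparent RA/Dec rates of the satellite (any common unit)
    """
    # Work in tangent-plane components (RA scaled by cos Dec).
    dx = d_ra_arcsec * np.cos(np.radians(dec_deg))
    vx = ra_rate * np.cos(np.radians(dec_deg))
    v = np.hypot(vx, dec_rate)
    ux, uy = vx / v, dec_rate / v           # unit vector along the trajectory
    along = dx * ux + d_dec_arcsec * uy     # signed along-track residual
    across = -dx * uy + d_dec_arcsec * ux   # signed cross-track residual
    return along, across

# A time bias then follows as the along-track residual divided by the angular
# rate, e.g. 5 arcsec / (40 arcsec/s) = 0.125 s, matching the ~125 ms discussed below.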

Figure 14. Distribution of errors in RA and Dec for the GPS (40294) satellite observed with the DAVIS 346BSI camera.

Figure 15. Distribution of errors along and across the trajectory of the GPS (40294) satellite observed with the DAVIS 346BSI camera.

As seen in Fig. 14, the data obtained using the DAVIS contains small but systematic deviations from the ephemeris positions. In the direction across the satellite trajectory (Fig. 15) it can be noticed that the astrometric uncertainty is at the level of 0.6 arcsec, which can be considered a very good result. Moreover, no significant systematic deviation across the orbit can be noticed. All of the systematic errors are oriented along the trajectory of the satellite, which indicates that they are most likely connected with event timing errors. Almost all of the measured points are concentrated around a -125 ms time bias. With the satellite's angular velocity of about 40 arcsec/sec this corresponds to a shift of 5 arcsec, or about 3 pixels on the DAVIS camera. Several potential explanations of this shift have been considered. According to the first one, the shift could be attributed to the slow reaction time of the DAVIS camera for very dim targets, such as the observed satellite. This effect was measured in the laboratory experiments and was expected to produce a time shift at the level of about 20 ms for targets that were brighter than the satellite. Since the camera reaction time decreases with increasing target brightness, it can be suspected that in the case of a nearly undetectable light source the delay will be significantly larger. This effect also affects all reference stars and creates flux-dependent distortions in the relative X,Y positions of stars and satellites.

Another factor which could potentially have an impact on the shift is the asymmetry of the images of the reference stars. Each recorded moving object shows a clearly visible "comet tail", produced by pixels which trigger "on" events with significantly longer than usual delays. It seems that pixel triggering increases the probability of producing random events of the same polarity immediately afterwards. The asymmetry was partially mitigated by assigning higher weights to pixels with more events, but we suspect that some residuals might still influence the results. It is hard to estimate the order of magnitude of this effect; however, if we assume that it can shift an image by 1 to 2 pixels, then it is slightly too small to explain the -125 ms time bias. The final hypothesis was that the DAVIS 346BSI camera has an intrinsic time bias, which is considered probable, as most cameras used in satellite tracking exhibit time biases at the level of tens or hundreds of milliseconds. It was noticed multiple times that the camera control software displayed a warning saying that "a non monotonic event has been detected" continuously throughout the observations. This could indicate that there were certain timing issues related to the camera's firmware and hardware.

A similar analysis was performed for the images of the same satellite obtained simultaneously using the Andor Zyla camera on an identical telescope. Its result shows a time bias of -18 ms and an astrometric uncertainty at the level of 0.15 arcsec (see Fig. 16 and 17).

Figure 16. Distribution of errors in RA and Dec for the GPS satellite observed with the Andor Zyla 5.5 camera.

Figure 17. Distribution of errors along and across the trajectory of the GPS satellite observed with the Andor Zyla 5.5 camera.

Figure 18. Top-left: image composed of "on" events and the cross-section line of a star trail recorded with the DAVIS 346BSI camera. Top-right: distribution of "on" events along the cross-section line; a trail of "on" events is visible to the right. Bottom: image and cross-section of a star trail recorded with the Zyla camera.

Overall, the results for the DAVIS 346BSI camera show an acceptable and repeatable time bias and an astrometric accuracy on par with our expectations, given that the Andor Zyla had a significantly better limiting magnitude and more reference stars.

In order to further confirm the results of the observations, the orbital elements of the observed GPS satellite were determined using data from both cameras. The software used for this process was the GEODYN 2 package. The number of astrometric measurements with the Andor Zyla was about 100 times larger than with the DAVIS 346BSI camera. As all data points in GEODYN 2 have the same weights, the influence of the DAVIS data points on the resulting orbit is very low. Due to this fact, the results should be interpreted only as a relative comparison of the DAVIS 346BSI measurements with respect to the Zyla.

Figure 19. Orbital fit residuals in RA.

Figure 20. Orbital fit residuals in Dec.

In Fig. 19 and 20, the blue dots representing the measurements from the DAVIS 346BSI are offset by a few arcsec because of its larger time bias. We can also see a larger scatter because of its larger astrometric uncertainty, resulting primarily from the lower number of reference stars.

6.2 Ajisai satellite

Ajisai is an experimental satellite in low Earth orbit designed specifically for optical monitoring with ground-based laser stations and telescopes (NORAD 16908, COSPAR 1986-061A). It is a rotating sphere covered mostly with mirrors and retroreflectors. The satellite was clearly visible to the DVS during specular reflections of sunlight.

The Ajisai satellite was observed with the DAVIS 346BSI camera on a 12.5 inch f/8 telescope. With this setup, the satellite was observed twice: on the 20th of November 2020 for 120 seconds and on the 13th of February 2021 for 125 seconds. A third observation of the Ajisai satellite was performed using a 12" Newton telescope and lasted 44 seconds. The event streams from the observations were converted into FITS images using 10 millisecond time steps in order to obtain high time resolution for this object. The primary objective was to determine the brightness variation period.

Aperture photometry was performed on each FITS image to measure the "on" event rate. It was at the level of 0 most of the time; however, during blinks of the satellite the rate increased significantly. The event rate changes over 2 minutes of observations are presented in Fig. 21 and the flux rate changes of "on" events are presented in Fig. 22.

Figure 21. Event rate changes of "on" events from the Ajisai satellite during 2 minutes of observation on the 20th of November 2020.

Figure 22. Flux rate changes of "on" events from the Ajisai satellite during 3 seconds of observation on the 20th of November 2020.

As seen in our data, the blinks are clearly visible and they occur in a repeating pattern corresponding to the location of the mirrors on the surface of the rotating satellite body. The data was analysed using a Fourier periodogram with frequencies of up to 2 Hz.

Figure 23. Fourier periodogram up to a frequency of 172800 c/d (2.0 Hz) for the observations on the 20th of November 2020.
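
For illustration, a period search of this kind can be sketched in Python using astropy's Lomb-Scargle periodogram as a stand-in for the Fourier periodogram actually used; the time and event-rate arrays are assumed to come from the aperture photometry described above.

import numpy as np
from astropy.timeseries import LombScargle

def flash_frequency(t_s, on_rate, f_max_hz=2.0):
    """Return the strongest brightness-variation frequency in Hz and cycles/day."""
    freq = np.linspace(0.01, f_max_hz, 200000)   # search grid up to 2 Hz
    power = LombScargle(t_s, on_rate).power(freq)
    f_best = freq[np.argmax(power)]
    return f_best, 86400.0 * f_best              # 2 Hz corresponds to 172800 c/d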

The regularly distributed peaks visible in Fig. 23 are harmonics of the main frequency. After further inspection of the event curves phased with the most prominent frequencies, we concluded that the true frequency was 34885 ± 20.7 c/d, as it produced the lowest scatter of data points.

A similar analysis was carried out for the subsequent observations. The results are presented in Tab. 3. We also compared our determinations with the long-term ephemeris published in [3]. Fig. 24 shows a very close agreement between the expectation and our determinations, proving that a DVS camera can be a valuable tool for period determination of fast-rotating satellites.

UTC date     observed spin period [s]   uncertainty [s]   years after launch   ephemeris spin period [s]
2020-11-20   2.4788                     0.0014            34.275               2.4785
2021-02-13   2.4878                     0.0019            34.508               2.4871
2021-02-15   2.4870                     0.0021            34.513               2.4872

Table 3. Ajisai rotation periods determined in this work.

Figure 24. The three rotation periods of the Ajisai satellite determined during this project, showing a clear sign of growth, as expected from the ephemeris (dashed line) from [3].

7 CONCLUSIONS

The possibilities of present-day DVS cameras in ground-based observations of Earth satellites, as concluded from testing the DAVIS 346BSI, are currently limited in several areas. As opposed to modern CCD and CMOS cameras, the sizes of the available detectors are very small. Currently there are not many specimens of the camera available off the shelf, so it is not easy to find a camera with a pixel size matching a particular optical system. Moreover, because DVS cameras are not cooled, their low-light sensitivity is limited.

As a result of the various tests performed on the DAVIS 346BSI camera, certain prerequisites have emerged which must be fulfilled in order to perform survey and tracking observations with the DVS. Firstly, the DVS camera timing has to be related to the UTC time scale. This is possible by feeding the camera every few seconds with an electronic rectangular impulse generated by a GNSS-based clock. Thanks to this solution, the conversion of internal timestamps to UTC is possible. The second prerequisite is the conversion of the data format generated by DVS cameras (AEDAT) into the FITS file format, so that the data can be analysed with regular astronomical photometric or astrometric software. For this purpose, the AEDAT Converter software was developed within the project.

Once the aforementioned requirements were fulfilled, it was possible to perform several tests, which showed that the limiting magnitude of the DVS is worse by about 1 mag with respect to modern sCMOS sensors used on the same telescope at angular velocities > 0.01°/s. At velocities smaller than 0.01°/s the difference increases to about 3 mag. Based on the tests, the time bias of the DAVIS 346BSI camera is at least several milliseconds for bright targets and several tens of milliseconds for dim targets. Moreover, the results of the GNSS observations indicated that the time bias can be even larger, reaching up to about 125 milliseconds. In addition, each recorded moving target produces a "comet tail", which makes the image analysis more challenging.

It was also noted that only objects brighter than about 8 mag are visible during tracking observations, while targets down to about 12.5 mag are detectable at angular velocities between 0.001°/s and 0.01°/s. Specific software or hardware modifications can mitigate this problem, as the rotating shutter and variable telescope tracking methods provided satisfying results.

Another important finding is that when the DAVIS 346BSI camera is combined with a typical astronomical telescope, astrometry is impossible in most cases due to the lack of reference stars, which is caused mainly by the small sensor size and the reduced limiting magnitude. The daytime limiting magnitude of the DAVIS 346BSI camera is about 2-3 mag, which is worse than in the case of most modern CMOS cameras. This makes daytime survey or tracking observations with the DAVIS 346BSI camera (with the exception of the several brightest satellites) practically impossible.

Satellite tracking is additionally hampered by the dependence of the DAVIS 346BSI camera's time bias on the target brightness. In this type of observation, the reference stars move in the field of view with substantial angular velocities and each of them has a different delay in the DVS recording depending on its magnitude. This may cause a deformation in the relative stellar positions that would have to be corrected. Therefore, when a fast-moving object is observed, an additional step in the astrometric analysis is necessary for DVS sensors compared to regular CCD or CMOS cameras. However, this effect is probably significant only in the case of tracking the fastest MEO and LEO targets.

What is worth emphasizing is the fact that the limiting magnitude of the DAVIS 346BSI camera is worse by only 1-1.5 mag with respect to modern CMOS cameras in satellite survey observations. It is possible that with future low-noise cameras, on-the-fly detection and analysis of objects passing through the field of view will be achievable thanks to the low data bandwidth of DVS cameras.

The time bias in the astrometric measurements of a GNSS satellite performed with the DVS camera was at the level of ~125 ms, which is larger than the ~4 ms reported in the literature [2] and the ~20 ms found in our blinking LED experiments.

The overall conclusion is that the current generation of DVS cameras is a demanding tool, still at an early stage of development considering its potential astronomical use. The cameras were not developed to work under very low light conditions. Perhaps with the exception of the Ajisai satellite rotation period determination, observations with the DAVIS 346BSI camera had no advantage over a CMOS camera. In order to improve the results of DVS cameras and their poor astrometric success rate, either a much larger sensor size or much faster optical systems would be necessary. Several improvements were deemed beneficial for the DVS to be used in the future in its most promising area, satellite survey observations. Those include using fast optical systems; implementing a cooling module, thus significantly reducing the generation of noise events; lowering the detection threshold from 10% to at least 3%; introducing improvements to the event stream conversion to FITS images, or direct analysis of the AEDAT stream; and finally reducing the pixel size to match the camera properly with fast optical systems.

Regardless of the rather early stage of technological readiness of the DAVIS 346BSI camera, it is easy to see applications where this technology is worth considering. DVS technology might be useful in adaptive optics as a wavefront sensor camera. Another interesting prospect for the use of DVS technology in the future is the recording of lunar flashes, which are rapid changes in the brightness of the Moon's surface caused by meteorite strikes.

Considering the presented method of converting the DVS data stream to 2D images (Fig. 25), it is obvious that we do not utilize its full survey potential. In the case of survey observations it would be beneficial to combine events into images in the direction parallel to the target's motion vector (Fig. 26). This method is equivalent to the shift-and-add (synthetic tracking) method used, among others, in very fast NEO observations. However, in contrast to shift-and-add, with DVS cameras it is possible to work with individual pixels and better separate them from objects and background noise. Obviously, this method would have to be complemented with a searching algorithm that would identify the passage and calculate its direction.
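
A minimal Python sketch of this adaptive stacking idea is given below; it is only an illustration of the concept, with an assumed, externally supplied target velocity in pixel coordinates, and it does not include the searching algorithm itself.

import numpy as np

def stack_along_motion(x, y, t_s, vx, vy, t0, width=346, height=260):
    """Accumulate events after shifting them back along an assumed target velocity (pixel/s)."""
    xs = np.rint(x - vx * (t_s - t0)).astype(int)
    ys = np.rint(y - vy * (t_s - t0)).astype(int)
    ok = (xs >= 0) & (xs < width) & (ys >= 0) & (ys < height)
    img = np.zeros((height, width), dtype=np.int32)
    # Events belonging to the moving object pile up in just a few pixels.
    np.add.at(img, (ys[ok], xs[ok]), 1)
    return img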

Figure 25. Standard method of converting the DAVIS event stream to a 2D image. Events are selected between two chosen moments of time and are subsequently stacked onto the XY plane along the time axis.

Figure 26. Adaptive event stacking, in which events are combined into images along the target trail. A much higher signal-to-noise ratio is possible since all events corresponding to a single object are grouped together in just a few pixels.

8 REFERENCES

1. Cohen, G., Afshar, S., Schaik, A. et al. (2017). "Event-based sensing for space situational awareness", Proceedings of the 18th Advanced Maui Optical and Space Surveillance Technologies Conference, 19-22 September 2017.

2. Delbruck, Hu, He (2020). "V2E: From video frames to realistic DVS event camera streams", arXiv e-prints.

3. Kudak, V. (2017). "Studying of the own rotation period changes of satellite "Ajisai" on the interval 1986-2017", Scientific Herald of Uzhhorod University, Series Physics.

4. Żołnowski, M., Delbruck, T., Kamiński, K. et al. (2019). "Observational evaluation of event cameras performance in optical space surveillance", ESA NEO SST Conference.