# LISA Instrument issues

https://gitlab.in2p3.fr/lisa-simulation/instrument/-/issues

## Implement realistic noise characteristics for the MOC time correlation

Issue #135 · Jan Niklas Reinhardt · 2023-09-27
https://gitlab.in2p3.fr/lisa-simulation/instrument/-/issues/135

The MOC time correlation is a measurement of the SCET deviation from TCB. The accuracy of this measurement is required to be better than 0.1 ms. Currently, the noise of this measurement is simulated as white, which might not be realistic.
1. Assuming the on-board equipment is suitably designed, the limiting factor of the MOC time correlation is the uncertainty in the one-way light travel time from the spacecraft to the ground station (determined from orbit determination). As a first step towards more realistic noise, we could therefore use the ASD of the along-track component of the orbit determination (see LISA Ground Tracking).
2. We obtained MOC time correlation files for GAIA. It should be checked whether we can deduce noise characteristics from them, and whether they can be transferred to the LISA case.

## Investigate possibility of time chunking

Issue #106 · Jean-Baptiste Bayle · 2022-10-14
https://gitlab.in2p3.fr/lisa-simulation/instrument/-/issues/106

LISA Instrument is currently not suited for long simulations (more than a few months) because of memory pressure. However, we need to run year-long simulations, and therefore need to investigate solutions. One of them is to use time chunking and memory optimizations (releasing memory once it is no longer needed by downstream processes); Dask is a well-known framework that can help. A nice side effect is (hopefully) a speed-up from parallel execution (Dask is essentially a task scheduler), as well as compatibility with most computing clusters.
We should investigate the use of Dask, i.e.:
* Investigate potential show-stoppers
* Evaluate what needs to be changed and developed specifically for Dask
* Build a demonstrator to show that it can work (so as not to waste resources)
* Make sure that we have the resources (maybe see if we can have students on this), divide up the work, and do it
* Test the result and evaluate the performance
Using Dask probably means that we have to move away from the `ForEachObject` API and use plain Dask arrays (with Numpy arrays under the hood; the high-level interface is the same). This has been discussed at length (and investigated) in #34 and !46. We should reuse various elements of what was discussed and implemented there.
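To make the memory argument concrete, here is a framework-free sketch of the time-chunking pattern: the simulation is produced and consumed one fixed-size chunk at a time, so only one chunk is ever resident in memory (Dask would build the same pattern automatically as a lazy task graph). All names, quantities, and sizes below are illustrative, not the actual LISA Instrument API.

```python
# Framework-free sketch of time chunking: process a long simulation in
# fixed-size chunks so only one chunk is resident at a time.
# (Dask would schedule the same pattern as a lazy task graph.)
# All names and sizes are illustrative, not the LISA Instrument API.
import numpy as np

FS = 4.0              # sampling rate [Hz] (illustrative)
N_TOTAL = 1_000_000   # total number of samples to simulate
CHUNK = 100_000       # samples kept in memory at once

def simulate_chunk(size, rng):
    """Hypothetical per-chunk simulation: two noise streams combined."""
    laser = rng.standard_normal(size)
    readout = rng.standard_normal(size)
    return laser + 0.1 * readout

def chunked_rms(n_total, chunk, seed=0):
    """Accumulate a running sum of squares chunk by chunk."""
    rng = np.random.default_rng(seed)
    sumsq = 0.0
    for start in range(0, n_total, chunk):
        size = min(chunk, n_total - start)
        data = simulate_chunk(size, rng)
        sumsq += float(np.sum(data ** 2))
        # `data` goes out of scope here, so its memory can be released
    return float(np.sqrt(sumsq / n_total))

print(chunked_rms(N_TOTAL, CHUNK))
```

With Dask, the loop and the accumulator disappear: each measurement becomes a chunked `dask.array`, downstream combinations stay lazy, and a final reduction triggers chunk-wise (optionally parallel) execution.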
@mastaa @ohartwig

## Handle decomposing into offsets and fluctuations more consistently

Issue #79 · Martin Staab · 2023-09-29
https://gitlab.in2p3.fr/lisa-simulation/instrument/-/issues/79

At the moment, we split up the calculation and propagation of some quantities (e.g., laser beams or clock noise) into an offset and a fluctuation part. Their important properties are:
* offsets: slow variations (out of band), usually large in value
* fluctuations: in-band signal, usually small, which enables Taylor expansion/simplification of some terms
We should probably think about doing this more consistently throughout the simulator. Examples are some of the noise processes whose spectral shapes diverge as $`f \to 0`$. For long simulations, those noise time series can blow up in numerical value. This has the disadvantage that we lose numerical precision in band, and that some of the assumptions might not hold anymore (which is probably even more critical).
A solution could be to treat all (or at least more) quantities in decomposed form. To achieve that, we could also include the out-of-band part of any noise (not only deterministic drifts) in the offset part. This would also mean that any computation performed on the total variable needs to implement side-by-side processing of offsets and fluctuations.
As an example, I experimented with integration (cumulative sum), since this operation is numerically unstable and causes large drifts when applied to white noise. The operation reads:
```math
\begin{aligned}
y[n] &= y[n-1] + \Delta t x[n] \\
&= y^o[n-1] + y^\epsilon[n-1] + \Delta t x^o[n] + \Delta t x^\epsilon[n] \\
&= y^o[n-1] + (1 - \hat\omega_\mathrm{sat} + \hat\omega_\mathrm{sat})y^\epsilon[n-1] + \Delta t x^o[n] + \Delta t x^\epsilon[n] \\
y^o[n] &= y^o[n-1] + \hat\omega_\mathrm{sat}y^\epsilon[n-1] + \Delta t x^o[n] \\
y^\epsilon[n] &= (1 - \hat\omega_\mathrm{sat})y^\epsilon[n-1] + \Delta t x^\epsilon[n]
\end{aligned}
```
In the last step, I wrote down how to generate the integrated output $`y[n]`$ from the input $`x[n]`$ in decomposed form. Here, $`y^\epsilon[n]`$ is a well-behaved time series that does not diverge for long times.
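The last two recursions can be implemented directly. A minimal NumPy sketch follows; note that the choice $`\hat\omega_\mathrm{sat} = 2\pi f_\mathrm{sat} \Delta t`$ is an assumption made here for illustration (the derivation fixes only the structure, not the value of the update constant). By construction, the sum of the two components reproduces the ordinary cumulative sum exactly, while the fluctuation part behaves as a leaky integrator and stays bounded.

```python
# Sketch of the decomposed cumulative sum above. The update constant
# `w_sat` (the derivation's ω̂_sat) is assumed to be 2π f_sat Δt; its
# exact value only moves the crossover frequency between components.
import numpy as np

def decomposed_cumsum(x_off, x_fluc, dt, f_sat):
    """Integrate x = x_off + x_fluc, keeping offsets and fluctuations split.

    y_off + y_fluc equals the ordinary cumulative sum of dt * x exactly,
    but y_fluc is a leaky integrator with cutoff f_sat and stays bounded.
    """
    w_sat = 2 * np.pi * f_sat * dt  # assumed definition of ω̂_sat
    y_off = np.empty_like(x_off)
    y_fluc = np.empty_like(x_fluc)
    yo, ye = 0.0, 0.0
    for n in range(len(x_off)):
        # Simultaneous update: new yo uses the previous ye
        yo, ye = (yo + w_sat * ye + dt * x_off[n],
                  (1 - w_sat) * ye + dt * x_fluc[n])
        y_off[n], y_fluc[n] = yo, ye
    return y_off, y_fluc

# Consistency check: the recombined output matches a plain cumulative sum
rng = np.random.default_rng(1)
x_fluc = rng.standard_normal(10_000)  # white in-band noise
x_off = np.full(10_000, 1e-3)         # slow deterministic drift
dt, f_sat = 0.25, 1e-4
y_off, y_fluc = decomposed_cumsum(x_off, x_fluc, dt, f_sat)
total = np.cumsum(dt * (x_off + x_fluc))
print(np.max(np.abs(y_off + y_fluc - total)))  # differences only from rounding
```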
![image](/uploads/9c207c321b3290e5d4c11dc7eae6748f/image.png)
You can see that the cutoff (saturation at around 0.1 mHz) is reproduced well in the two components: the offset part accounts for all frequencies below $`f_\mathrm{sat}`$, and the fluctuation part for all frequencies above. There is some mixing, though, and we need to investigate whether this is of any importance.

## Use properties to transform `ForEachObject` inputs

Issue #47 · Jean-Baptiste Bayle · 2023-01-17
https://gitlab.in2p3.fr/lisa-simulation/instrument/-/issues/47

We currently convert parameters given to the `Instrument`'s init method (which can be strings for default or pre-defined values, scalar values shared between all MOSAs or spacecraft, dictionaries, or `ForEachObject` instances) to `ForEachObject` instances.
This breaks when the user modifies the `Instrument` instance after initialization.
Two solutions:
* We can use properties to make sure that those attributes are correctly set at all times (performing verifications and conversions, maybe even reading and interpolating files).
  * Verifications, conversions, and other pre-processing steps are performed multiple times if one changes a parameter repeatedly before running the simulation.
  * On the other hand, parameters are valid at all times (we check them when they are set), so users don't have to run a simulation to see error messages.
  * We can prevent users from mutating parameters once the simulation has been run (and so prevent potential mismatches between simulation results and the parameters stored on the `Instrument` instance).
* Or we can delay those steps to simulation time, and only keep the raw parametrization until `simulate()` is called.
  * Verifications, conversions, and other pre-processing steps are performed only once, before the simulation.
  * Initialization and parametrization are computationally inexpensive; all the heavy lifting happens in `simulate()`.
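The first solution can be sketched with a plain Python property that validates and converts on every assignment, so the attribute is always well-formed. The attribute name `laser_asds`, the `MOSAS` list, and the conversion rules below are illustrative stand-ins, not the real `ForEachObject` API.

```python
# Sketch of solution 1: a property that validates and converts on
# assignment, so the attribute is always a well-formed per-MOSA mapping.
# `MOSAS`, `laser_asds`, and the rules are illustrative, not the real API.
MOSAS = ["12", "23", "31", "13", "32", "21"]

class Instrument:
    def __init__(self, laser_asds=30e-15):
        self.laser_asds = laser_asds  # goes through the property setter

    @property
    def laser_asds(self):
        return self._laser_asds

    @laser_asds.setter
    def laser_asds(self, value):
        # Scalar: broadcast to all MOSAs; dict: check keys; else reject
        if isinstance(value, (int, float)):
            self._laser_asds = {mosa: float(value) for mosa in MOSAS}
        elif isinstance(value, dict):
            if set(value) != set(MOSAS):
                raise ValueError(f"expected one value per MOSA {MOSAS}")
            self._laser_asds = {k: float(v) for k, v in value.items()}
        else:
            raise TypeError(f"unsupported laser_asds: {value!r}")

instrument = Instrument()
instrument.laser_asds = 15e-15      # validated immediately, not at simulate()
print(instrument.laser_asds["12"])  # -> 1.5e-14
```

The same setter is the natural place to hook the "freeze after `simulate()`" behavior: it could raise if the simulation has already been run.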