# BL2009

In this example, we analyse the simulated datasets created by Balan & Lahav (2009), hereafter BL2009. There are two datasets, with one and two simulated planets respectively, which were used to test the ExoFit package.

## Simulated data

Let's take a quick look at the datasets.

```
import numpy as np
import matplotlib.pyplot as plt

files = ['BL2009_dataset1.kms.rv', 'BL2009_dataset2.kms.rv']
titles = ['1 planet', '2 planets']
fig, axs = plt.subplots(1, 2, sharey=True, sharex=True, figsize=(12, 8))
for i, file in enumerate(files):
    # each file has columns: time, RV, RV uncertainty
    axs[i].errorbar(*np.loadtxt(file).T, fmt='ko', ms=2)
    axs[i].set(xlabel='Time [days]', ylabel='RV [km/s]', title=titles[i])
```

The two simulated Keplerian signals are already clearly visible.

## Defining and running the model

The `kima_setup` file sets the main options for the model, which we keep as similar as possible to the one used by BL2009. We use a standard sum-of-Keplerians model (no GP) and no linear trend. The number of planets in the model can be fixed to 1 or 2 depending on the dataset; here we show the results for one planet, with the corresponding constructor call shown below.
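In the Python interface, these options correspond to the `RVmodel` constructor arguments (reproduced from the full input files in the Afterword):

```
from pykima.model import RVmodel

# free number of planets (fix=False), up to a maximum of one (npmax=1)
model = RVmodel(fix=False, npmax=1)
```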

Inside the `RVmodel()` constructor, we define the priors for the model parameters to be the same as those used by BL2009 (see their Table 1).

```
Cprior = make_prior<Uniform>(-2000, 2000); // m/s
Jprior = make_prior<ModifiedLogUniform>(1.0, 2000.); // m/s
auto conditional = planets.get_conditional_prior();
conditional->Pprior = make_prior<LogUniform>(0.2, 15E3); // days
conditional->Kprior = make_prior<ModifiedLogUniform>(1.0, 2E3); // m/s
conditional->eprior = make_prior<Uniform>(0., 1.);
conditional->phiprior = make_prior<Uniform>(0.0, 2*M_PI);
conditional->wprior = make_prior<Uniform>(0.0, 2*M_PI);
```

The equivalent definitions in the Python interface:

```
import numpy as np
# priors from pykima.priors or from scipy.stats can be used interchangeably
from pykima.priors import ModifiedLogUniform
from scipy.stats import loguniform, uniform

# note: scipy's uniform(loc, scale) spans [loc, loc + scale]
model.priors.C = uniform(-2000, 4000)           # U(-2000, 2000) m/s
model.priors.J = ModifiedLogUniform(1.0, 2000)  # m/s
model.priors.P = loguniform(0.2, 15e3)          # days
model.priors.K = ModifiedLogUniform(1.0, 2000)  # m/s
model.priors.e = uniform(0.0, 1.0)
model.priors.phi = uniform(0, 2 * np.pi)
model.priors.w = uniform(0, 2 * np.pi)
```
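A note on `ModifiedLogUniform(knee, upper)`, used here for the jitter and the semi-amplitude: this is a modified Jeffreys prior, with density p(x) ∝ 1/(x + knee), which is approximately uniform below the knee and log-uniform above it, so it remains proper while allowing values down to zero.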

Let's use the `kima-run` script to run the model. With the default options, this example takes a couple of minutes to finish.
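From the example directory, the run is launched simply with:

```
kima-run
```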

```
compiling... done! (in 0.0 sec)
starting kima
# Loaded 34 data points from file BL2009_dataset1.kms.rv
# Multiplied all RVs by 1000; units are now m/s.
# Using 4 threads.
# Target compression factor between levels = 2.7182818284590451
# Seeding random number generators. First seed = 1666013588.
# Generating 8 particles from the prior...done.
# Creating level 1 with log likelihood = -891.122642053.
# Creating level 2 with log likelihood = -286.661345353.
# Creating level 3 with log likelihood = -253.513259938.
# Creating level 4 with log likelihood = -227.751852197.
# Creating level 5 with log likelihood = -208.451018596.
# Creating level 6 with log likelihood = -195.228397785.
# Creating level 7 with log likelihood = -185.950386451.
# Creating level 8 with log likelihood = -180.488580257.
# Creating level 9 with log likelihood = -178.011258759.
# Creating level 10 with log likelihood = -177.112052933.
# Saving particle to disk. N = 50.
# Creating level 11 with log likelihood = -176.663097391.
# Creating level 12 with log likelihood = -175.651826656.
# Creating level 13 with log likelihood = -173.134802829.
# Creating level 14 with log likelihood = -170.158191476.
# Creating level 15 with log likelihood = -167.372555099.
# Creating level 16 with log likelihood = -163.550209729.
# Creating level 17 with log likelihood = -158.428673742.
# Saving particle to disk. N = 100.
# Creating level 18 with log likelihood = -151.064008807.
# Creating level 19 with log likelihood = -146.687940483.
# Creating level 20 with log likelihood = -141.32745059.
# Creating level 21 with log likelihood = -136.415132047.
# Creating level 22 with log likelihood = -130.685504493.
# Creating level 23 with log likelihood = -126.089838518.
# Saving particle to disk. N = 150.
# Creating level 24 with log likelihood = -122.150628834.
# Creating level 25 with log likelihood = -118.111186811.
# Creating level 26 with log likelihood = -114.23818351.
# Creating level 27 with log likelihood = -107.385989866.
# Saving particle to disk. N = 200.
# Creating level 28 with log likelihood = -104.81818415.
# Creating level 29 with log likelihood = -102.536153057.
# Creating level 30 with log likelihood = -101.191957333.
# Saving particle to disk. N = 250.
# Creating level 31 with log likelihood = -99.8738829371.
# Creating level 32 with log likelihood = -97.9408195493.
# Creating level 33 with log likelihood = -96.5847833624.
# Creating level 34 with log likelihood = -93.2164715797.
# Saving particle to disk. N = 300.
# Creating level 35 with log likelihood = -90.5763148385.
# Creating level 36 with log likelihood = -86.3863763453.
# Creating level 37 with log likelihood = -82.2523443743.
# Creating level 38 with log likelihood = -80.0106103198.
# Creating level 39 with log likelihood = -77.6659632352.
# Saving particle to disk. N = 350.
# Creating level 40 with log likelihood = -77.0490085212.
# Creating level 41 with log likelihood = -75.2083501598.
# Creating level 42 with log likelihood = -73.3882200208.
# Saving particle to disk. N = 400.
# Creating level 43 with log likelihood = -71.5826596616.
# Saving particle to disk. N = 450.
# Creating level 44 with log likelihood = -70.3151006406.
# Creating level 45 with log likelihood = -69.4421111417.
# Saving particle to disk. N = 500.
# Creating level 46 with log likelihood = -68.9703654728.
# Creating level 47 with log likelihood = -68.5923853132.
# Saving particle to disk. N = 550.
# Creating level 48 with log likelihood = -68.1931468208.
# Creating level 49 with log likelihood = -67.9873688991.
# Creating level 50 with log likelihood = -67.7043720957.
# Saving particle to disk. N = 600.
# Creating level 51 with log likelihood = -67.4108570507.
# Creating level 52 with log likelihood = -67.1743097482.
# Creating level 53 with log likelihood = -67.0437169066.
# Saving particle to disk. N = 650.
# Creating level 54 with log likelihood = -66.9031504067.
# Creating level 55 with log likelihood = -66.8162153201.
# Saving particle to disk. N = 700.
# Creating level 56 with log likelihood = -66.7351516121.
# Creating level 57 with log likelihood = -66.6861308171.
# Creating level 58 with log likelihood = -66.6475767406.
# Saving particle to disk. N = 750.
# Saving particle to disk. N = 800.
# Creating level 59 with log likelihood = -66.6438667836.
# Creating level 60 with log likelihood = -66.5999952828.
# Creating level 61 with log likelihood = -66.5078991253.
# Saving particle to disk. N = 850.
# Creating level 62 with log likelihood = -66.4725988221.
# Creating level 63 with log likelihood = -66.449873787.
# Saving particle to disk. N = 900.
# Creating level 64 with log likelihood = -66.4417064907.
# Creating level 65 with log likelihood = -66.4334722182.
# Creating level 66 with log likelihood = -66.3804016247.
# Saving particle to disk. N = 950.
# Creating level 67 with log likelihood = -66.3647141342.
# Saving particle to disk. N = 1000.
# Creating level 68 with log likelihood = -66.3621243431.
# Saving particle to disk. N = 1050.
# Creating level 69 with log likelihood = -66.3562249902.
# Creating level 70 with log likelihood = -66.3511886558.
# Creating level 71 with log likelihood = -66.3461667773.
# Creating level 72 with log likelihood = -66.3292436207.
# Saving particle to disk. N = 1100.
# Creating level 73 with log likelihood = -66.3208195054.
# Creating level 74 with log likelihood = -66.3184282695.
# Creating level 75 with log likelihood = -66.3159936681.
# Saving particle to disk. N = 1150.
# Creating level 76 with log likelihood = -66.3138913132.
# Creating level 77 with log likelihood = -66.3132000391.
# Saving particle to disk. N = 1200.
# Creating level 78 with log likelihood = -66.3128331906.
# Creating level 79 with log likelihood = -66.3123046892.
# Creating level 80 with log likelihood = -66.3111048207.
# Creating level 81 with log likelihood = -66.3098328302.
# Saving particle to disk. N = 1250.
# Creating level 82 with log likelihood = -66.3039742294.
# Done creating levels.
# Saving particle to disk. N = 1300.
# Saving particle to disk. N = 1350.
# ...
# Saving particle to disk. N = 5000.
kima job finished, took 17.74 seconds
```

## Looking at the results

With the help of the `pykima` package, let's load the results.
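A minimal sketch of the loading step (assuming pykima's `showresults` helper, which also prints the summary shown below):

```
import pykima as pk

# load the results stored in the current directory
res = pk.showresults()
```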

```
log(Z) = -105.45338674108966
Information = 34.84920858343051 nats.
Effective sample size = 642.9016675567867
[kima TIP] Use the 'kima-template' script to create a new bare-bones directory.
```
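These are the diffusive nested sampling estimates of the log evidence, the information (the Kullback-Leibler divergence from the prior to the posterior, in nats), and the effective number of posterior samples.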

We can start by looking at the posterior for the orbital periods, which shows a very clear peak around 700 days. The joint posteriors with the semi-amplitude and the eccentricity show a relatively eccentric signal with a semi-amplitude of about 60 m/s.
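These plots can be made with pykima's plotting helpers; a minimal sketch, assuming the numbered `make_plot` methods (treat these names as assumptions and check the API of your installed version):

```
# posterior for the orbital period(s)
res.make_plot2()

# joint posteriors for period, semi-amplitude, and eccentricity
res.make_plot3()
```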

The fit itself looks very good and compares well to the true simulated signal, which we can compute from the parameters given by BL2009:

```
import numpy as np
import pykima as pk

# a convenience function to get an adequate array of times
tt = res._get_tt()

# parameters from BL2009
orbpar1 = (
    700,               # orbital period [days]
    60,                # semi-amplitude [m/s]
    0.38,              # eccentricity
    3.10 + np.pi / 2,  # BL2009 provide ϖ = 3.10
    0.67 * np.pi,      # the parameter λ = 0.67 is not defined in BL2009
    res.M0_epoch,      # mean anomaly at the epoch (here, the first time)
)
V = 12.0  # systematic velocity

# the simulated planet signal
planet1 = V + pk.keplerian(tt, *orbpar1)
```

```
fig = res.plot_random_samples()
ax = fig.axes[0]
ax.plot(tt, planet1, 'r', label='truth')
ax.legend();
```

## Afterword

#### Package versions

```
Last updated: Wed Oct 19 2022
Python implementation: CPython
Python version : 3.9.7
IPython version : 8.2.0
pykima: 0.1.33
```

#### Input files

The C++ `kima_setup` file:

```
#include "kima.h"

RVmodel::RVmodel() : fix(false), npmax(1)
{
    // priors as in Balan & Lahav (2009, DOI: 10.1111/j.1365-2966.2008.14385.x)
    Cprior = make_prior<Uniform>(-2000, 2000); // m/s
    Jprior = make_prior<ModifiedLogUniform>(1.0, 2000.); // m/s

    auto conditional = planets.get_conditional_prior();
    conditional->Pprior = make_prior<LogUniform>(0.2, 15E3); // days
    conditional->Kprior = make_prior<ModifiedLogUniform>(1.0, 2E3); // m/s
    conditional->eprior = make_prior<Uniform>(0., 1.);
    conditional->phiprior = make_prior<Uniform>(0.0, 2*M_PI);
    conditional->wprior = make_prior<Uniform>(0.0, 2*M_PI);
}

int main(int argc, char** argv)
{
    datafile = "BL2009_dataset1.kms.rv";
    load(datafile, "kms", 0);

    Sampler<RVmodel> sampler = setup<RVmodel>(argc, argv);
    sampler.run(50);
    return 0;
}
```

The equivalent Python model definition:

```
import numpy as np
from pykima.model import RVmodel
# priors from pykima.priors or from scipy.stats can be used interchangeably
from pykima.priors import ModifiedLogUniform
from scipy.stats import loguniform, uniform

model = RVmodel(fix=False, npmax=1)

# priors as in Balan & Lahav (2009, DOI: 10.1111/j.1365-2966.2008.14385.x)
# note: scipy's uniform(loc, scale) spans [loc, loc + scale]
model.priors.C = uniform(-2000, 4000)           # U(-2000, 2000) m/s
model.priors.J = ModifiedLogUniform(1.0, 2000)  # m/s
model.priors.P = loguniform(0.2, 15e3)          # days
model.priors.K = ModifiedLogUniform(1.0, 2000)  # m/s
# the following priors are the default ones, so could be left out
model.priors.e = uniform(0.0, 1.0)
model.priors.phi = uniform(0, 2 * np.pi)
model.priors.w = uniform(0, 2 * np.pi)

model.units = 'kms'
model.data = 'BL2009_dataset1.kms.rv'
model.save()

# run the model
model.run()
# and load the results
res = model.results
```

The DNest4 `OPTIONS` file:

```
# File containing parameters for DNest4
# Put comments at the top, or at line ends
2 # Number of particles
5000 # new level interval
2000 # save interval
100 # threadSteps: number of steps each thread does independently before communication
0 # maximum number of levels
10 # Backtracking scale length (lambda)
100 # Strength of effect to force histogram to equal push (beta)
5000 # Maximum number of saves (0 = infinite)
# (optional) samples file
# (optional) sample_info file
# (optional) levels file
```