Trends

Boilerplate
import numpy as np
import matplotlib.pyplot as plt
import pykima as pk

This example showcases the linear, quadratic, and cubic RV trends in kima, using a simulated dataset with a known (quadratic) trend.

Generate data

Let's create a very simple simulated dataset.

# almost the answer to the Ultimate Question of Life, the Universe, and Everything
np.random.seed(41)

vsys  = 11.1  # [m/s]
slope = 0.7   # [m/s/yr]
quadr = 0.15  # [m/s/yr²]

# transform yr --> days
slope_msd = slope / 365.25
quadr_msd2 = quadr / 365.25**2

truths = [quadr_msd2, slope_msd, vsys]
jit = 0.3

N = 46

t = np.sort(np.random.uniform(55256, 58507, N))
tmiddle = t[0] + 0.5 * t.ptp()

rv = np.polyval([quadr_msd2, slope_msd, vsys], t - tmiddle)
srv = np.random.uniform(0.6, 0.9, t.size)
err = np.random.normal(0, np.hypot(np.median(srv), jit), t.size)
rv += err

which looks like this

tt = np.linspace(t[0], t[-1], 1000)
fig, ax = plt.subplots()
ax.errorbar(t, rv, srv, fmt='ko', label='data')
ax.plot(tt, np.polyval([quadr_msd2, slope_msd, vsys], tt - tmiddle), label='truth')
ax.set(xlabel='Time [days]', ylabel='RV [m/s]')
ax.legend();

[figure: simulated RVs with error bars and the true quadratic trend]

Save the data to a file

np.savetxt('d1.txt', np.c_[t, rv, srv], fmt='%6.3f')

Defining and running the model

The kima_setup file sets the options for the model. Everything is default except for

const bool trend = true;
const int degree = 3;

or, in the Python interface,

model.trend = True
model.degree = 3

which allows for up to a cubic trend in the model. The number of planets is fixed to 0, and all priors take their default values.
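With degree = 3, the systemic velocity plus trend part of the model is a cubic polynomial in time, referenced to the middle of the observed time span. A minimal NumPy sketch of this evaluation (the function and coefficient names are illustrative, not kima's internal API):

```python
import numpy as np

def trend_model(t, vsys, slope, quadr, cubic, tmiddle):
    """Systemic velocity plus polynomial trend, evaluated
    relative to the middle of the time span."""
    dt = t - tmiddle
    return vsys + slope * dt + quadr * dt**2 + cubic * dt**3

# with cubic = 0 this reduces to the quadratic used to simulate the data
t = np.linspace(0.0, 100.0, 5)
print(trend_model(t, vsys=11.1, slope=0.7 / 365.25,
                  quadr=0.15 / 365.25**2, cubic=0.0, tmiddle=50.0))
```

Referencing the polynomial to the middle of the time span reduces the correlation between the fitted coefficients.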

Let's use the kima-run script to run the model.
We use a fixed seed in order to reproduce the results. With the default options, this example takes only a few seconds to finish.

!kima-run -s 12345
compiling... done! (in 4.4 sec)
starting kima
# Loaded 46 data points from file d1.txt
# Using 4 threads.
# Target compression factor between levels = 2.7182818284590451
# Seeding random number generators. First seed = 12345.
# Generating 8 particles from the prior...done.
# Creating level 1 with log likelihood = -126.545553779.
# Creating level 2 with log likelihood = -104.738012103.
# Creating level 3 with log likelihood = -94.522777143.
# Creating level 4 with log likelihood = -84.5251787125.
# Creating level 5 with log likelihood = -74.8541178692.
# Creating level 6 with log likelihood = -70.0747008839.
# Creating level 7 with log likelihood = -65.2720849945.
# Creating level 8 with log likelihood = -60.8531864646.
# Creating level 9 with log likelihood = -59.6306649841.
# Creating level 10 with log likelihood = -58.3727146053.
# Creating level 11 with log likelihood = -58.0145272523.
# Creating level 12 with log likelihood = -57.0976475517.
# Saving particle to disk. N = 100.
# Creating level 13 with log likelihood = -56.6417440767.
# Creating level 14 with log likelihood = -56.2831759831.
# Creating level 15 with log likelihood = -55.6820625113.
# Creating level 16 with log likelihood = -55.408953947.
# Creating level 17 with log likelihood = -55.2333839767.
# Creating level 18 with log likelihood = -55.1120759943.
# Creating level 19 with log likelihood = -55.1001934961.
# Creating level 20 with log likelihood = -54.984109184.
# Creating level 21 with log likelihood = -54.9291857994.
# Creating level 22 with log likelihood = -54.9116204981.
# Saving particle to disk. N = 200.
# Creating level 23 with log likelihood = -54.9020579865.
# Creating level 24 with log likelihood = -54.8878593289.
# Creating level 25 with log likelihood = -54.849415315.
# Creating level 26 with log likelihood = -54.8334420202.
# Creating level 27 with log likelihood = -54.804964525.
# Creating level 28 with log likelihood = -54.766342534.
# Creating level 29 with log likelihood = -54.7574286219.
# Saving particle to disk. N = 300.
# Creating level 30 with log likelihood = -54.7468658426.
# Creating level 31 with log likelihood = -54.7340414385.
# Saving particle to disk. N = 400.
# Creating level 32 with log likelihood = -54.7329051185.
# Creating level 33 with log likelihood = -54.7321082941.
# Creating level 34 with log likelihood = -54.732015564.
# Saving particle to disk. N = 500.
# Creating level 35 with log likelihood = -54.7313560301.
# Creating level 36 with log likelihood = -54.7306395066.
# Creating level 37 with log likelihood = -54.729973747.
# Saving particle to disk. N = 600.
# Creating level 38 with log likelihood = -54.7298519163.
# Creating level 39 with log likelihood = -54.729791538.
# Creating level 40 with log likelihood = -54.7291716915.
# Creating level 41 with log likelihood = -54.7280557806.
# Creating level 42 with log likelihood = -54.726896811.
# Creating level 43 with log likelihood = -54.7262979072.
# Saving particle to disk. N = 700.
# Creating level 44 with log likelihood = -54.726092495.
# Creating level 45 with log likelihood = -54.7253421268.
# Creating level 46 with log likelihood = -54.7253042354.
# Creating level 47 with log likelihood = -54.725248964.
# Saving particle to disk. N = 800.
# Creating level 48 with log likelihood = -54.7252356242.
# Creating level 49 with log likelihood = -54.7252353052.
# Done creating levels.
# Saving particle to disk. N = 900.
# Saving particle to disk. N = 1000.
# Saving particle to disk. N = 1100.
...
# Saving particle to disk. N = 10000.
kima job finished, took 13.60 seconds

Looking at the results

With the help of the pykima package, let's load the results.

res = pk.load()
log(Z) = -65.66342647189445
Information = 8.862777761920654 nats.
Effective sample size = 1999.1290060648694

We got about 2000 posterior samples.
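The effective sample size reported above is commonly defined, for DNest4-style samplers, as the exponential of the entropy of the normalized posterior weights. A sketch of that definition (not pykima's exact code):

```python
import numpy as np

def effective_sample_size(log_weights):
    """exp(entropy) of the normalized weights: equals the number of
    samples when weights are uniform, and 1 when one sample dominates."""
    logw = log_weights - np.max(log_weights)  # shift for numerical stability
    w = np.exp(logw)
    w /= w.sum()
    return np.exp(-np.sum(w * np.log(w + 1e-300)))

# uniform weights over 2000 samples give an ESS of (approximately) 2000
print(effective_sample_size(np.zeros(2000)))
```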

We can look at the posterior distributions for the trend parameters, which we can compare to the true values

fig = res.hist_trend();
fig.set_size_inches((10, 8))

axs = fig.axes
axs[0].axvline(slope, color='r')
axs[1].axvline(quadr, color='r');

[figure: posterior histograms for the trend coefficients, with the true values marked in red]

We recover the true values for the slope and the quadratic coefficient, and the cubic coefficient is consistent with zero. The fit is also close to the original curve.

fig = res.plot_random_samples(ncurves=100)
ax = fig.axes[0]
ax.plot(tt, np.polyval(truths, tt - tmiddle), 'r', label='truth')
ax.legend();

[figure: 100 random posterior samples of the model over the data, with the true curve in red]
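To quote numbers rather than plots, the posterior samples can be summarized with a median and percentiles, converting the coefficients back to per-year units. A sketch using a placeholder slope_samples array (the actual attribute holding the trend samples depends on the pykima version):

```python
import numpy as np

def summarize(samples_msd, power=1):
    """Median and 68% interval, converted from m/s/day**power to m/s/yr**power."""
    samples = np.asarray(samples_msd) * 365.25**power
    lo, med, hi = np.percentile(samples, [16, 50, 84])
    return med, med - lo, hi - med

# placeholder: pretend slope samples centered on the true value, in m/s/day
rng = np.random.default_rng(0)
slope_samples = rng.normal(0.7 / 365.25, 0.05 / 365.25, 2000)
med, minus, plus = summarize(slope_samples, power=1)
print(f"slope = {med:.2f} -{minus:.2f} +{plus:.2f} m/s/yr")
```

The same function with power=2 would convert the quadratic coefficient from m/s/day² to m/s/yr².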


Afterword

Package versions

%load_ext watermark
%watermark -n -u -v -p pykima
Last updated: Wed Oct 19 2022

Python implementation: CPython
Python version       : 3.9.7
IPython version      : 8.2.0

pykima: 0.1.33

Input files

kima_setup.cpp

#include "kima.h"

RVmodel::RVmodel():fix(true),npmax(0)
{
    trend = true;
    degree = 3;

    // all default priors
}


int main(int argc, char** argv)
{
    datafile = "d1.txt";

    load(datafile, "ms", 0);

    Sampler<RVmodel> sampler = setup<RVmodel>(argc, argv);
    sampler.run(100);

    return 0;
}
Or, with the Python interface:

from pykima.model import RVmodel

model = RVmodel(fix=True, npmax=0)

model.trend = True
model.degree = 3

model.data = 'd1.txt'

model.save()
model.run()
OPTIONS

# File containing parameters for DNest4
# Put comments at the top, or at line ends
2   # Number of particles
1000    # new level interval
200 # save interval
100 # threadSteps: number of steps each thread does independently before communication
0   # maximum number of levels
10  # Backtracking scale length (lambda)
100 # Strength of effect to force histogram to equal push (beta)
10000   # Maximum number of saves (0 = infinite)
    # (optional) samples file
    # (optional) sample_info file
    # (optional) levels file

Last update: 2022-11-15