niklas@web: ~/Playground

🎰 Playground   TAG

This is a place where I’m playing around with different org features and the org-publish function.

✒ LaTeX

\(\LaTeX\) support: \(\varphi = \sum_{i} a_i^2 + b_i^2\).

Markov Chains

Definition

A sequence of random variables \(X_1, \dots, X_T\) that satisfies the Markov property: \[ P(X_t \mid X_1, \dots, X_{t-1}) = P(X_t \mid X_{t-1})\] where

  • Time indices \(t\) are discrete
  • Assume that the random variables \(X_t\) are discrete

Joint distribution: \[ P(X_1 = i_1, \dots, X_T = i_T) = P(X_1 = i_1) \prod_{t=1}^{T-1} P(X_{t+1} = i_{t+1} \mid X_t = i_t) \]
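For intuition, here is a minimal Python sketch of this structure on a toy two-state, three-step chain (all probabilities below are made-up values): it builds the full joint from the factorization and checks that \(P(X_3 \mid X_1, X_2)\) does not depend on \(X_1\).

import numpy as np

# Toy chain: 2 states, 3 time steps; all numbers are made up.
pi = np.array([0.6, 0.4])                  # P(X_1 = i)
A  = np.array([[0.9, 0.1], [0.2, 0.8]])    # P(X_{t+1} = j | X_t = i)

# Build the full joint P(X_1, X_2, X_3) from the factorization above.
joint = pi[:, None, None] * A[:, :, None] * A[None, :, :]

# P(X_3 | X_1, X_2) must not depend on X_1.
cond = joint / joint.sum(axis=2, keepdims=True)
print(np.allclose(cond[0], cond[1]))  # True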

General Case

\[ P(X_1 = i) = \pi_i \\ P(X_{t+1} = j \mid X_t = i) = A_{ij}^{(t+1)} \]

where \(\pi \in \mathbb{R}^K\) is a prior probability on the initial state and \(A^{(t)} \in \mathbb{R}^{K \times K}\) are the transition matrices.

Thus, we have a joint probability of \[ P(X_1=i_1, \dots, X_T=i_T) = \pi_{i_1} \times A_{i_1,i_2}^{(2)} \times \dots \times A^{(T)}_{i_{T-1}, i_T} \]
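A quick sketch of this joint probability in Python, with \(K = 2\), \(T = 3\), and made-up toy values for \(\pi\) and the per-step matrices \(A^{(2)}, A^{(3)}\):

import numpy as np

pi = np.array([0.6, 0.4])                   # prior on the initial state
A2 = np.array([[0.9, 0.1], [0.2, 0.8]])     # A^(2)
A3 = np.array([[0.5, 0.5], [0.3, 0.7]])     # A^(3)

def joint_prob(states, pi, transitions):
    """pi_{i_1} * A^(2)_{i_1 i_2} * ... * A^(T)_{i_{T-1} i_T}."""
    p = pi[states[0]]
    for A, (i, j) in zip(transitions, zip(states, states[1:])):
        p *= A[i, j]
    return p

print(joint_prob([0, 0, 1], pi, [A2, A3]))  # 0.6 * 0.9 * 0.5 = 0.27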

Stationary Case

To simplify, assume a time-homogeneous or stationary Markov Chain: \[ P(X_1 = i) = \pi_i \\ P(X_{t+1} = j \mid X_t = i) = A_{ij} \]

The transition matrix \(A^{(t)} = A\) does not depend on \(t\).
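With a single transition matrix, simulating the chain is straightforward; a small sketch, again with made-up values for \(\pi\) and \(A\):

import numpy as np

rng = np.random.default_rng(0)
pi = np.array([0.6, 0.4])                   # initial distribution
A  = np.array([[0.9, 0.1], [0.2, 0.8]])     # time-independent transition matrix

def sample_chain(T, pi, A, rng):
    """Draw X_1 ~ pi, then X_{t+1} ~ A[X_t, :] for t = 1, ..., T-1."""
    states = [rng.choice(len(pi), p=pi)]
    for _ in range(T - 1):
        states.append(rng.choice(A.shape[1], p=A[states[-1]]))
    return states

print(sample_chain(10, pi, A, rng))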

🔗 Links

Links can be written in plain text: http://www.niklasbuehler.com, or formatted: Home.

💻 Code

Bash with text output:

echo "A"
A

Python with image output:

import matplotlib
import matplotlib.pyplot as plt

# Create a small figure and plot some dummy data.
fig = plt.figure(figsize=(3, 2))
plt.plot([1, 3, 2])
fig.tight_layout()

# Save the figure and return its path so org-mode can embed the image.
fname = "res/img/myfig.png"
plt.savefig(fname)
fname # return this to org-mode

[Figure: res/img/myfig.png]

🧮 Automatic Spreadsheet formatting

| Name         |     Grade | ECTS | Weighted Grade |
|--------------+-----------+------+----------------|
| Spanish B1.1 |       1.0 |    3 |             3. |
| MLRG         |       1.3 |    6 |            7.8 |
| TN           |       2.3 |    5 |           11.5 |
|--------------+-----------+------+----------------|
| Total        | 1.5333333 |   14 |      1.5928571 |
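The same numbers can be reproduced outside the spreadsheet; a small Python sketch using the values from the table above:

courses = [("Spanish B1.1", 1.0, 3), ("MLRG", 1.3, 6), ("TN", 2.3, 5)]

ects_total = sum(e for _, _, e in courses)                      # 14
mean_grade = sum(g for _, g, _ in courses) / len(courses)       # 1.5333...
weighted = sum(g * e for _, g, e in courses) / ects_total       # 1.5928...

print(mean_grade, ects_total, weighted)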

Column View

  • Item 1
  • Item 2
  • Item 3