
I'm simulating a 2D Ornstein-Uhlenbeck process (Langevin equation for velocity), and I'm interested in computing the power spectral density (PSD) of the vector-valued velocity process.

Following the Wiener–Khinchin theorem, I compute the velocity autocorrelation function (VACF) defined as:

$$ R(\tau) = \frac{1}{T - \tau} \int_0^{T - \tau} \langle \mathbf{v}(t) \cdot \mathbf{v}(t + \tau) \rangle \, dt $$

Then I apply the Fast Fourier Transform (FFT) to obtain the PSD. However, since the VACF is only computed for non-negative lags $\tau \geq 0$, I need to mirror it manually to make it symmetric before applying np.fft.fft.

That mirroring step feels like a workaround, and I'm wondering:

  • Is there a correct or more natural way to compute the PSD of a vector-valued process without having to explicitly mirror the VACF?
  • Or alternatively, is there a shortcut that computes the PSD directly from time series without computing the VACF at all?

Below is my working example. I calculate the VACF as the average dot product over time for all trajectories, and then mirror it to compute the PSD via FFT.

1. Simulation of 2D Langevin trajectories

import numpy as np
import matplotlib.pyplot as plt
import scipy.signal

np.random.seed(0)
n = 10_000      # Number of time steps
dt = .1         # Time step size (delta t)
dim = 2         # Dimension of the velocity vector (2D: x and y)
N = 10          # Ensemble size
gamma = 1       # Damping coefficient in Langevin equation
sigma = 1       # Noise intensity

dW = np.random.normal(scale=sigma * np.sqrt(dt), size=(n, dim, N)) # Gaussian noise
v = np.zeros((n, dim, N)) # Velocity - shape (time, dimension, trajectories)
r = np.zeros((n, dim, N)) # Position - shape (time, dimension, trajectories)

# Integrate the Langevin equation using the Euler-Maruyama method
for i in range(n - 1):
    v[i + 1] = v[i] - gamma * v[i] * dt + dW[i]  # Update velocity: damping + noise
    r[i + 1] = r[i] + v[i] * dt                  # Update position: Euler integration of velocity
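
As a quick sanity check (a minimal sketch based on the analytical OU stationary variance, $\sigma^2 / (2\gamma)$ per velocity component), the simulated velocities can be compared against theory; small deviations of order $\Delta t$ from the Euler–Maruyama discretization are expected:

# Sanity check (sketch): empirical stationary variance per component vs. the analytical OU value
var_empirical = v[n // 2:].var(axis=(0, 2))   # discard the transient, average over time and ensemble
var_theory = sigma**2 / (2 * gamma)           # per-component stationary variance of the OU process
print(var_empirical, var_theory)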

2. VACF and PSD computation

# Computes the VACF: ⟨v(t) · v(t + τ)⟩
def manual_vacf(v, lag):
    vacf = []                           # VACF for the ensemble
    for i in range(v.shape[-1]):        # Loop over trajectories
        v_ = v[..., i]                  # Select trajectory i (shape: [time, dim])
        vacf_ = np.empty(lag)           # VACF of current trajectory
        for lag_ in range(1, lag + 1):  # Loop over lag values
            v1, v2 = v_[:-lag_], v_[lag_:] # Vectors at t and t+lag
            v1v2 = (v1 - v1.mean(axis=0)) * (v2 - v2.mean(axis=0)) # Remove the mean and take the element-wise product
            v1_dot_v2 = np.sum(v1v2, axis=1)      # Dot product v(t) · v(t + τ)
            vacf_[lag_ - 1] = np.mean(v1_dot_v2)  # Time average
        vacf.append(vacf_)
    return np.mean(vacf, axis=0)  # Ensemble average

# Mirrors the VACF and computes the PSD using the Wiener–Khinchin theorem
def psd_wiener_khinchin(vacf, lag, dt):
    vacf_mirror = np.hstack([np.flip(vacf), vacf])  # Extend VACF to negative lags
    ft = np.fft.fft(vacf_mirror)[:lag]              # FFT and keep only non-negative frequencies
    freq = np.fft.fftfreq(2 * lag, dt)[:lag]        # Frequency values
    psd = np.abs(ft) * dt                           # PSD (magnitude × dt)
    return psd, freq

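# NOTE: theoretical_vacf / theoretical_psd were not shown in this snippet; the definitions
# below are one plausible choice, taken from the analytical OU results
# (per component: R(tau) = sigma**2 / (2 * gamma) * exp(-gamma * |tau|)),
# and may differ from the originals.
def theoretical_vacf(tau, gamma, sigma, dim=2):
    return dim * sigma**2 / (2 * gamma) * np.exp(-gamma * np.abs(tau))

def theoretical_psd(freq, gamma, sigma, dim=2):
    omega = 2 * np.pi * freq                       # angular frequency
    return dim * sigma**2 / (gamma**2 + omega**2)  # Fourier transform of the exponential VACF
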
# === vacf ===
lag_time = 4
lag = int(lag_time / dt)
tau = dt * np.arange(lag)
vacf_manual = manual_vacf(v, lag)
tau_theory = dt * np.arange(-lag, lag)
vacf_theory = theoretical_vacf(tau_theory, gamma, sigma)

# === psd ===
psd_manual, freq_manual = psd_wiener_khinchin(vacf_manual, lag, dt)
psd_theory = theoretical_psd(freq_manual, gamma, sigma)

3. Plotting

fig, axs = plt.subplots(1, 3, figsize=(10, 2))

# first 3 simulated trajectories
for i in range(3):
    axs[0].plot(*r[..., i].T)
axs[0].axis('equal')
axs[0].set_xlabel('x')
axs[0].set_ylabel('y')

# vacf
axs[1].plot(tau, vacf_manual)
axs[1].plot(tau_theory, vacf_theory)
axs[1].set_xlim(-lag_time, lag_time)
axs[1].set_xlabel('lag time')
axs[1].set_ylabel('VACF')

# psd
axs[2].plot(freq_manual, psd_manual, label='simulation')
axs[2].plot(freq_manual, psd_theory, label='theory')
axs[2].set_yscale('log')
axs[2].set_xscale('log')
axs[2].set_xlabel('frequency')
axs[2].set_ylabel('PSD')
axs[2].legend()

plt.show()

[figure: simulated trajectories, VACF vs. theory, and PSD vs. theory]

  • Consider asking this at the SignalProcessing StackExchange. Commented Jun 1 at 14:31
  • I managed to get something similar to what you want using Welch's method. But can you please also include the theoretical_psd() function in your MRE? Commented Jun 2 at 7:35

1 Answer


I'll post my answer here using Welch's method; it would be nice to compare it with the theoretical PSD that you have.

The approach:

from scipy.signal import welch

fs = 1 / dt     # sampling frequency from the time step
nperseg = 256   # <- the lower, the smoother, since even more PSD segments get averaged
psd_list = []   # one PSD per component per trajectory
for j in range(N):           # for each ensemble member
    for d in range(dim):     # for x and y
        freq_welch, Pxx = welch(v[:, d, j], fs=fs, nperseg=nperseg)  # Welch's method on each column
        psd_list.append(Pxx)                                         # collect the result
psd_welch = np.mean(psd_list, axis=0)  # average over components and trajectories
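
For reference, a minimal way to overlay this against the Wiener–Khinchin and theoretical curves (assuming psd_manual, freq_manual and psd_theory from your snippet are still in scope):

plt.loglog(freq_manual, psd_manual, label='Wiener–Khinchin')
plt.loglog(freq_manual, psd_theory, label='theory')
plt.loglog(freq_welch, psd_welch, label='Welch')
plt.xlabel('frequency')
plt.ylabel('PSD')
plt.legend()
plt.show()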

Results so far:

[figure: Wiener–Khinchin PSD vs. Welch PSD]


Note: picking a good value for nperseg is always a bit of trial and error, but you can make an educated guess if you keep the following in mind:

  • The lower nperseg is, the smoother the PSD looks, since more segments get averaged. But this also means a coarser frequency resolution, so some of the energy gets spread into neighbouring frequencies (spectral leakage).
  • If, on the other hand, you value a higher frequency resolution and a more accurate representation of the values, then a higher nperseg should do the trick (see the sketch below).
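
A rough way to see this trade-off (a sketch, reusing n and dt from your snippet and assuming welch's default 50% overlap) is to print the frequency-bin spacing and the number of segments that get averaged for a few candidate values:

fs = 1 / dt
for nperseg in (128, 256, 1024):                      # candidate segment lengths
    df = fs / nperseg                                 # frequency-bin spacing of the Welch PSD
    n_segments = (n - nperseg) // (nperseg // 2) + 1  # segments averaged with 50% overlap
    print(f"nperseg={nperseg}: df={df:.4f}, ~{n_segments} segments averaged")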
