A PyMC-inspired probabilistic programming library for Bayesian inference in JavaScript. Built on TensorFlow.js with automatic differentiation support for efficient MCMC sampling.
JSMC brings the power of Bayesian statistical modeling to JavaScript, providing an intuitive API similar to PyMC for defining probabilistic models as Directed Acyclic Graphs (DAGs) and performing inference using Markov Chain Monte Carlo methods.
- PyMC-like DAG structure: Define models by connecting distributions in a directed acyclic graph
- TensorFlow.js integration: Automatic differentiation for gradient-based samplers
- Multiple MCMC samplers: Metropolis-Hastings and Hamiltonian Monte Carlo
- Rich distribution library: Normal, Uniform, Beta, Gamma, Bernoulli, and more
- Gaussian Processes: Non-parametric regression with multiple kernel functions (RBF, Matérn, Periodic)
- Posterior predictions: Generate predictions with uncertainty from MCMC samples
- Model persistence: Save and load traces and model configurations to JSON
- Trace analysis utilities: Summary statistics, effective sample size, convergence diagnostics
- Hierarchical models: Support for multilevel Bayesian models
- Browser compatible: Run in Node.js or in the browser (including ObservableHQ)
```bash
npm install @tangent.to/mc
```

In the browser or ObservableHQ:

```js
import { Model, Normal, MetropolisHastings } from "npm:@tangent.to/mc";

// or
mc = import("https://cdn.jsdelivr.net/npm/@tangent.to/mc/src/browser.js")
```

Or add to your package.json:

```json
{
  "dependencies": {
    "@tangent.to/mc": "^0.2.0"
  }
}
```

Here's a simple Bayesian linear regression example:
```js
import { Model, Normal, Uniform, MetropolisHastings, printSummary } from '@tangent.to/mc';

// Example data (replace with your own observations)
const x = [1, 2, 3, 4, 5];
const y = [2.3, 3.8, 6.1, 8.2, 9.7];

// Create model
const model = new Model('linear_regression');
// Define priors (PyMC-like syntax)
const alpha = new Normal(0, 10, 'alpha');
const beta = new Normal(0, 10, 'beta');
const sigma = new Uniform(0.01, 5, 'sigma');
model.addVariable('alpha', alpha);
model.addVariable('beta', beta);
model.addVariable('sigma', sigma);
// Define likelihood (connecting distributions in a DAG)
model.logProb = function (params) {
  let logProb = alpha.logProb(params.alpha)
    .add(beta.logProb(params.beta))
    .add(sigma.logProb(params.sigma));

  // Add likelihood for observations
  for (let i = 0; i < x.length; i++) {
    const mu = params.alpha + params.beta * x[i];
    const likelihood = new Normal(mu, params.sigma);
    logProb = logProb.add(likelihood.logProb(y[i]));
  }
  return logProb;
};
// Run MCMC sampling from explicit starting values
const initialValues = { alpha: 0, beta: 0, sigma: 1 };
const sampler = new MetropolisHastings(0.5);
const trace = sampler.sample(model, initialValues, 1000, 500, 1);
// Analyze results
printSummary(trace);
```

Like PyMC, JSMC uses a Directed Acyclic Graph (DAG) structure to represent probabilistic models. Variables can depend on other variables, creating a natural flow from priors through transformations to likelihoods:

```js
// Hyperpriors
const mu_global = new Normal(0, 10);
const sigma_global = new Uniform(0, 5);
// Group-level parameters (depend on hyperpriors)
const mu_group = new Normal(mu_global, sigma_global);
// Observations (depend on group parameters)
const y = new Normal(mu_group, sigma_obs);
```

JSMC provides a rich set of probability distributions:
- Normal: `new Normal(mu, sigma)` - Gaussian distribution
- Uniform: `new Uniform(lower, upper)` - Uniform distribution
- Beta: `new Beta(alpha, beta)` - Beta distribution (for probabilities)
- Gamma: `new Gamma(alpha, beta)` - Gamma distribution (for positive values)
- Bernoulli: `new Bernoulli(p)` - Binary outcomes

All distributions support:

- `logProb(value)` - Compute log probability density/mass
- `sample(shape)` - Generate random samples
- `mean()` - Get the distribution mean
- `variance()` - Get the distribution variance
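For instance, this shared interface can be exercised directly. This is only a sketch: the format of the `sample` shape argument is assumed, and returned values may be TensorFlow.js tensors rather than plain numbers.

```js
import { Normal } from '@tangent.to/mc';

// Unit normal; the third argument is the variable name used in traces
const theta = new Normal(0, 1, 'theta');

const draws = theta.sample([1000]);   // 1,000 random draws (shape format assumed)
const lp = theta.logProb(0.5);        // log density at 0.5
console.log(theta.mean(), theta.variance());
```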
JSMC includes a full implementation of Gaussian Processes for non-parametric regression:
```js
import { GaussianProcess, RBF, Matern32 } from '@tangent.to/mc';

// Create GP with RBF kernel (lengthscale = 1.0, variance = 1.0)
const kernel = new RBF(1.0, 1.0);
const gp = new GaussianProcess(0, kernel, 0.01); // mean function, kernel, noise variance

// Fit to data
gp.fit(X_train, y_train);

// Make predictions (second argument requests predictive standard deviations)
const predictions = gp.predict(X_test, true);
// Returns: { mean: [...], std: [...] }

// Sample 5 functions from the posterior
const posteriorSamples = gp.samplePosterior(X_test, 5);
```

Available Kernels:
- RBF (Squared Exponential): Smooth, infinitely differentiable functions
- Matern32: Less smooth than RBF, once differentiable
- Matern52: Middle ground between Matern32 and RBF
- Periodic: For periodic/seasonal patterns
- Linear: For linear trends
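Swapping kernels only changes the kernel object handed to the GP. A minimal sketch, assuming `Matern32` takes the same (lengthscale, variance) arguments as `RBF`:

```js
import { GaussianProcess, Matern32 } from '@tangent.to/mc';

// A rougher prior over functions than the RBF kernel
const maternKernel = new Matern32(1.0, 1.0);   // lengthscale, variance (assumed order)
const gpMatern = new GaussianProcess(0, maternKernel, 0.01);
gpMatern.fit(X_train, y_train);
```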
Generate posterior predictive samples for new data:
```js
// Define prediction function over the new inputs x_new
const predictFn = (params) => {
  // one prediction per new input value
  return x_new.map((xi) => params.alpha + params.beta * xi);
};

// Get posterior predictions with uncertainty (95% credible interval)
const predictions = model.predictPosteriorSummary(trace, predictFn, 0.95);
// Returns: { mean: [...], lower: [...], upper: [...] }
```

Save and load model states and traces:
```js
import { saveTrace, loadTrace, saveModelState, exportTraceForBrowser } from '@tangent.to/mc';

// Save trace to JSON
saveTrace(trace, 'trace.json');

// Load trace
const loadedTrace = loadTrace('trace.json');

// Save complete model state
saveModelState(model, trace, 'model_state.json');

// Export for browser (no filesystem)
const jsonString = exportTraceForBrowser(trace);
```

Metropolis-Hastings is a simple but effective random-walk sampler:
```js
const sampler = new MetropolisHastings(proposalStd);
const trace = sampler.sample(model, initialValues, nSamples, burnIn, thin);
```

Parameters:

- `proposalStd`: Standard deviation of the Gaussian proposal distribution
- `nSamples`: Number of samples to collect
- `burnIn`: Number of initial samples to discard
- `thin`: Keep every nth sample

Best for: Simple models, initial exploration
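The documented `tuneProposal(acceptanceRate)` method can be used to adjust the proposal between runs. The snippet below is only a sketch: the `acceptanceRate` field on the returned trace is a hypothetical name used for illustration, and the way `tuneProposal` reacts to it is assumed rather than taken from the source.

```js
// Pilot run to gauge the acceptance rate
const mh = new MetropolisHastings(0.5);
const pilot = mh.sample(model, { alpha: 0, beta: 0, sigma: 1 }, 500, 100, 1);

// 'acceptanceRate' is a hypothetical field name; check how your trace reports it
mh.tuneProposal(pilot.acceptanceRate);

// Full run with the tuned proposal
const tunedTrace = mh.sample(model, { alpha: 0, beta: 0, sigma: 1 }, 2000, 1000, 1);
```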
Hamiltonian Monte Carlo (HMC) is a gradient-based sampler that uses automatic differentiation:

```js
const sampler = new HamiltonianMC(stepSize, nSteps);
const trace = sampler.sample(model, initialValues, nSamples, burnIn, thin);
```

Parameters:

- `stepSize`: Leapfrog integration step size (epsilon)
- `nSteps`: Number of leapfrog steps (L)

Best for: Complex models with many parameters, faster convergence
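For example, reusing the linear-regression model from the quick start (a sketch; the step size and number of leapfrog steps are illustrative starting points, not tuned values):

```js
import { HamiltonianMC } from '@tangent.to/mc';

// stepSize = 0.01, nSteps = 10 leapfrog steps per proposal
const hmc = new HamiltonianMC(0.01, 10);
const hmcTrace = hmc.sample(model, { alpha: 0, beta: 0, sigma: 1 }, 2000, 1000, 1);
```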
JSMC provides utilities for analyzing MCMC samples:
```js
import { summarize, effectiveSampleSize, gelmanRubin, printSummary } from '@tangent.to/mc';

// Print comprehensive summary
printSummary(trace);

// Get statistics for a variable
const stats = summarize(trace.trace.alpha);
// Returns: { mean, median, std, variance, hdi_2_5, hdi_97_5, n }

// Compute effective sample size
const ess = effectiveSampleSize(trace.trace.alpha);

// Check convergence with multiple chains
const rHat = gelmanRubin([chain1.alpha, chain2.alpha, chain3.alpha]);
```

The examples/ directory contains complete working examples:
- `node examples/linear_regression.js` - Demonstrates basic Bayesian linear regression with normal priors.
- `node examples/logistic_regression.js` - Binary classification with a logistic link function.
- `node examples/hierarchical_model.js` - Multilevel model with partial pooling across groups, showcasing complex DAG structures.
- `node examples/gaussian_process.js` - Non-parametric regression using Gaussian Processes with different kernels and uncertainty quantification.
```js
const model = new Model(name)
```

Methods:

- `addVariable(name, distribution, observed)` - Add a variable to the model
- `getVariable(name)` - Retrieve a variable
- `logProb(params)` - Compute log probability
- `logProbAndGradient(params)` - Compute log prob and gradients
- `samplePrior(nSamples)` - Sample from prior distributions
- `getFreeVariableNames()` - Get unobserved variable names
- `summary()` - Print model structure
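For instance, prior draws and gradient evaluations use these methods directly. The snippet below is only a sketch: the return shapes of `samplePrior` and `logProbAndGradient` are assumptions, not taken from the source.

```js
// Draw 100 joint samples from the priors of all free variables
// (assumed return format: one entry per free variable)
const priorDraws = model.samplePrior(100);

// Log joint density and its gradients at a specific parameter setting
// (assumed return format: { logProb, gradients })
const { logProb, gradients } = model.logProbAndGradient({ alpha: 0, beta: 1, sigma: 1 });
```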
All distributions inherit from the base Distribution class:
```js
class Distribution {
  logProb(value)    // Log probability
  sample(shape)     // Generate samples
  observe(data)     // Set observed data
  mean()            // Distribution mean
  variance()        // Distribution variance
}
```

The samplers share a common interface:

```js
class MetropolisHastings {
  constructor(proposalStd)
  sample(model, initialValues, nSamples, burnIn, thin)
  tuneProposal(acceptanceRate)
}

class HamiltonianMC {
  constructor(stepSize, nSteps)
  sample(model, initialValues, nSamples, burnIn, thin)
}
```

JSMC works seamlessly in browser environments, including ObservableHQ notebooks:
```js
// In Observable, import from npm
jsmc = import("https://cdn.jsdelivr.net/npm/@tangent.to/mc/src/browser.js")

// Use it!
{
  const { Model, Normal, MetropolisHastings } = jsmc;
  // ... define and run your model
}
```

Key differences in browser:
- Uses `@tensorflow/tfjs` instead of `@tensorflow/tfjs-node`
- File I/O functions (`saveTrace`, `loadTrace`) are not available
- Use `exportTraceForBrowser()` and download as JSON instead (see the sketch below)
- Slightly slower than Node.js, but enables interactive visualization
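A small download helper could look like this (a sketch; only `exportTraceForBrowser` comes from the library, the rest is standard DOM code):

```js
import { exportTraceForBrowser } from '@tangent.to/mc';

// Serialize the trace and trigger a JSON download in the browser
const json = exportTraceForBrowser(trace);
const blob = new Blob([json], { type: 'application/json' });
const a = document.createElement('a');
a.href = URL.createObjectURL(blob);
a.download = 'trace.json';
a.click();
URL.revokeObjectURL(a.href);
```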
See docs/OBSERVABLE.md for detailed Observable examples and best practices.
JSMC leverages TensorFlow.js for:
- Automatic differentiation: Essential for gradient-based samplers like HMC
- Efficient tensor operations: Fast computation of log probabilities
- GPU acceleration: Optional GPU support for large-scale models
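As a standalone illustration of what TensorFlow.js autodiff provides (plain TF.js here, not the JSMC API): the gradient of a standard normal log-density is -x, which `tf.grad` recovers automatically.

```js
import * as tf from '@tensorflow/tfjs';

// log N(x | 0, 1) = -0.5 * x^2 - 0.5 * log(2*pi)
const logProb = (x) => x.square().mul(-0.5).sub(0.5 * Math.log(2 * Math.PI));

// d/dx log N(x | 0, 1) = -x; at x = 1 this prints approximately -1
const dLogProb = tf.grad(logProb);
dLogProb(tf.scalar(1.0)).print();
```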
| Feature | PyMC | JSMC |
|---|---|---|
| Language | Python | JavaScript |
| Backend | Aesara/JAX | TensorFlow.js |
| DAG Structure | Yes | Yes |
| MCMC Samplers | NUTS, HMC, MH | HMC, MH |
| Variational Inference | Yes | Planned |
| GPU Support | Yes | Yes (via TF.js) |
- Tune sampler parameters:
  - MH: Aim for a 20-40% acceptance rate by adjusting `proposalStd`
  - HMC: Start with a small `stepSize` (~0.01) and a moderate `nSteps` (~10)
- Use appropriate burn-in: Discard at least 500-1000 initial samples
- Check convergence (see the sketch below):
  - Visual inspection of trace plots
  - R-hat < 1.1 for multiple chains
  - Effective sample size > 100 per chain
- Hierarchical models: Use HMC for faster convergence with many parameters
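For the convergence checks above, a multi-chain run might look like this. This is only a sketch: it reuses the `sampler` and `model` from the quick start and assumes each returned trace exposes per-variable samples under `trace.alpha`, as in the analysis section.

```js
import { gelmanRubin, effectiveSampleSize } from '@tangent.to/mc';

// Three chains from dispersed starting values
const starts = [
  { alpha: -1, beta: 0, sigma: 1 },
  { alpha: 0, beta: 1, sigma: 2 },
  { alpha: 1, beta: -1, sigma: 0.5 },
];
const chains = starts.map((init) => sampler.sample(model, init, 2000, 1000, 1));

// R-hat close to 1 (< 1.1) and ESS > 100 per chain suggest convergence
const rHat = gelmanRubin(chains.map((c) => c.trace.alpha));
const ess = chains.map((c) => effectiveSampleSize(c.trace.alpha));
```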
```bash
# Clone repository
git clone https://github.com/tangent-to/mc.git
cd mc

# Install dependencies
npm install

# Run examples
npm run example

# Run tests
npm test
```

Contributions are welcome! Please feel free to submit issues and pull requests.
Apache-2.0
Completed in v0.2.0:
- Gaussian Processes with multiple kernels
- Posterior predictive sampling
- Model persistence (save/load)
- Browser/Observable support
Planned:
- Additional distributions (Poisson, Student-t, Exponential)
- NUTS (No-U-Turn Sampler)
- Variational inference (ADVI)
- Sparse GPs (inducing points for large datasets)
- Model comparison utilities (WAIC, LOO)
- Trace visualization tools
- PyMC model import/export
- Observable Guide - Using JSMC in ObservableHQ notebooks
- Considerations - Best practices, limitations, and design decisions
- Examples - Complete working examples
- PyMC Documentation
- TensorFlow.js
- Bayesian Data Analysis (Gelman et al.)
- MCMC sampling for dummies
- Gaussian Processes for Machine Learning
If you use JSMC in your research, please cite:
```bibtex
@software{jsmc,
  title = {JSMC: JavaScript Markov Chain Monte Carlo},
  author = {},
  year = {2025},
  url = {https://github.com/tangent-to/mc}
}
```