Agent Skills: monad-bayes × ASI Interleave

Bridge layer connecting tweag/monad-bayes to plurigrid/asi. Routes SMC/MCMC/PMMH/RMSMC monad transformer stacks into asi's abductive reasoning, lolita physics emulation, and GF(3)-colored sampling capabilities.

ID: plurigrid/asi/monad-bayes-asi-interleave

Install this agent skill locally:

pnpm dlx add-skill https://github.com/plurigrid/asi/tree/HEAD/skills/monad-bayes-asi-interleave


skills/monad-bayes-asi-interleave/SKILL.md


# monad-bayes × ASI Interleave

Bridge connecting tweag/monad-bayes (probabilistic inference as monad transformer stacks) to the ASI skill graph (GF(3)-colored capability system).

## Monad Transformer Stack Anatomy (from DeepWiki)

```
WeightedT m a           ← innermost: accumulates likelihood (MonadFactor)
  ↑
PopulationT m a         ← = WeightedT (ListT m) a; manages particle populations
  ↑
SequentialT m a         ← sequential time steps; works with PopulationT
  ↑
TracedT m a             ← records execution trace for MH proposals
  ├── Static.TracedT:   model :: WeightedT (DensityT m) a
  ├── Basic.TracedT:    model :: WeightedT (DensityT Identity) a
  └── Dynamic.TracedT:  m (WeightedT (DensityT m) a, Trace a)

DensityT (free monad)   ← categorical structure: computes joint density of traces
                          mhTransFree :: WeightedT (Free.DensityT m) a → Trace a → ...
```

Use this skill when wiring Haskell probabilistic programming into ASI pipelines, running SMC over hypothesis spaces, or performing PMMH parameter inference for physics emulation.
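
The innermost layer can be pictured with a minimal sketch, assuming a simplified representation: real monad-bayes wraps log-domain weights in a newtype, but modeling `WeightedT` as a bare `StateT Double` shows what `factor`-style scoring accumulates (names `factor'`, `runWeightedT`, `demo` are illustrative, not the library's API):

```haskell
import Control.Monad.Trans.State (StateT, modify, runStateT)
import Data.Functor.Identity (Identity, runIdentity)

-- WeightedT modeled as a state transformer accumulating log-likelihood.
-- (Simplified sketch; not monad-bayes's actual definition.)
type WeightedT m a = StateT Double m a

factor' :: Monad m => Double -> WeightedT m ()
factor' logW = modify (+ logW)

runWeightedT :: Monad m => WeightedT m a -> m (a, Double)
runWeightedT m = runStateT m 0

demo :: (Char, Double)
demo = runIdentity . runWeightedT $ do
  factor' 1.5   -- first observation's log-likelihood
  factor' 0.5   -- second observation's log-likelihood
  return 'x'
```

Each layer above wraps this one: `PopulationT` runs many such weighted computations at once, and `SequentialT` interleaves resampling between scoring steps.
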
### Algorithm Compositions

| Algorithm | Stack |
|-----------|-------|
| **SMC** | `SequentialT (PopulationT m)` |
| **MCMC** | `TracedT (WeightedT m)` |
| **PMMH** | `TracedT (WeightedT m)` (params) ⊗ `SequentialT (PopulationT (WeightedT m))` (state) |
| **RMSMC** | `SequentialT (TracedT (PopulationT m))` |
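
The `systematic` resampler these stacks rely on can be sketched in pure Haskell. This is a simplified, dependency-free illustration of the idea, not monad-bayes's actual implementation; `systematicResample` and its `u` parameter (a single uniform draw in [0,1)) are names invented here:

```haskell
import Data.List (scanl')

-- Systematic resampling: one uniform draw u yields n evenly spaced
-- positions; each position selects the particle whose cumulative
-- normalized weight first covers it.
systematicResample :: Double -> [(a, Double)] -> [a]
systematicResample u particles = map pick positions
  where
    n         = length particles
    total     = sum (map snd particles)
    cumWts    = tail (scanl' (+) 0 [ w / total | (_, w) <- particles ])
    positions = [ (u + fromIntegral i) / fromIntegral n | i <- [0 .. n - 1] ]
    pick p    = fst . head $ [ xw | (xw, c) <- zip particles cumWts, c >= p ]
```

Because the positions are evenly spaced, a dominant particle is duplicated roughly in proportion to its weight, which is why SMC's `PopulationT` layer keeps particle counts constant while pruning low-weight hypotheses.
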

## GF(3) Tripartite Tag

`smc-population(-1) ⊗ monad-bayes-asi-interleave(0) ⊗ abductive-monte-carlo(+1) = 0`

Validation (-1) × Bridge (0) × Generation (+1) = balanced inference loop.
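
The balance arithmetic is small enough to sketch in a few lines of plain Haskell (`balanced` and `tagBalanced` are illustrative names, not part of the asi codebase):

```haskell
-- A triad of trits is balanced when its sum vanishes in GF(3).
balanced :: [Int] -> Bool
balanced trits = sum trits `mod` 3 == 0

-- The tag above: smc-population(-1), bridge(0), abductive-monte-carlo(+1).
tagBalanced :: Bool
tagBalanced = balanced [-1, 0, 1]
```

Note that Haskell's `mod` returns a non-negative result for a positive divisor, so all-negative triads like `[-1, -1, -1]` are correctly reported as balanced.
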

---

## ASI Integration Points

### 1. abductive-monte-carlo → monad-bayes SMC Backend

`abductive-monte-carlo` samples hypotheses via MCMC; swap in monad-bayes as the backend:

```haskell
-- Hypothesis prior: GF(3)-colored belief states
data Hypothesis = H { content :: Text, trit :: Int }

-- monad-bayes model: prior over hypotheses
hypothesisPrior :: MonadInfer m => Text -> m Hypothesis
hypothesisPrior query = do
  -- Sample trit from GF(3)
  t <- categorical (V.fromList [1/3, 1/3, 1/3])
  let tritVal = [-1, 0, 1] !! t
  -- Gemini scores likelihood (via vertex-asi-interleave)
  score <- liftIO $ geminiLikelihood query tritVal
  factor (Exp score)
  return $ H { content = query, trit = tritVal }

-- SMC over hypothesis space
sampleHypotheses :: IO [Hypothesis]
sampleHypotheses = do
  let config = SMCConfig { resampler = systematic, numSteps = 10, numParticles = 1000 }
  smc config $ hypothesisPrior "Is this attractor chaotic?"

```

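
What `factor (Exp score)` does to the particle population can be pictured with a tiny pure sketch: scores become self-normalized importance weights (illustrative only; monad-bayes handles this inside `WeightedT`/`PopulationT`, and `normalizeWeights` is a made-up name):

```haskell
-- Self-normalized importance weights over hypothesis trits.
normalizeWeights :: [(Int, Double)] -> [(Int, Double)]
normalizeWeights hws = [ (h, w / z) | (h, w) <- hws ]
  where z = sum (map snd hws)
```

After normalization the weights sum to 1, so the SMC resampling step can treat them directly as a categorical distribution over hypotheses.
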
### 2. gay-monte-carlo → monad-bayes WeightedT

gay-monte-carlo performs GF(3)-colored sampling. Connect to monad-bayes for proper weight accumulation:

```haskell
-- GF(3) conservation-aware sampler
gf3ColoredSample :: MonadSample m => m (Int, Int, Int)
gf3ColoredSample = do
  -- Sample triad with GF(3) conservation constraint
  t1 <- uniformDiscrete [-1, 0, 1]
  t2 <- uniformDiscrete [-1, 0, 1]
  -- Parenthesize negate: unary minus binds looser than `mod`
  let t3 = negate (t1 + t2) `mod` 3  -- conservation: Σ = 0 mod 3
  -- Normalize to {-1, 0, 1}
  let t3' = if t3 > 1 then t3 - 3 else t3
  return (t1, t2, t3')

-- Weighted sampling over skill triads
skillTriadPosterior :: MonadInfer m => m (Skill, Skill, Skill)
skillTriadPosterior = do
  (t1, t2, t3) <- gf3ColoredSample
  s1 <- skillWithTrit t1
  s2 <- skillWithTrit t2
  s3 <- skillWithTrit t3
  -- Score by compositional coherence
  factor $ Exp (coherenceScore s1 s2 s3)
  return (s1, s2, s3)
```

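
Enumerating the constraint shows its effect: for any choice of the first two trits there is exactly one conserving third trit, so a third of all 27 triads survive (pure sketch; `conservedTriads` is an illustrative name):

```haskell
-- All GF(3)-conserved triads over {-1, 0, 1}.
conservedTriads :: [(Int, Int, Int)]
conservedTriads =
  [ (a, b, c) | a <- ts, b <- ts, c <- ts, (a + b + c) `mod` 3 == 0 ]
  where ts = [-1, 0, 1]
```
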
### 3. lolita Physics Emulation → PMMH Parameter Inference

lolita (NeurIPS 2025, arxiv:2507.02608) emulates physics via latent diffusion. Use PMMH to infer latent parameters from observables:

```haskell
-- PMMH: parameters (physical coefficients) × state (latent trajectory)
lolitaPMMH
  :: (MonadSample m, MonadInfer m)
  => [[Double]]  -- observed trajectory (downsampled)
  -> m ([Double], [[Double]])  -- (parameters, inferred latents)
lolitaPMMH obs = do
  -- Parameter model (TracedT layer)
  params <- do
    reynoldsNum <- gamma 2.0 0.5   -- Re ~ 2
    rayleighNum <- gamma 10.0 0.1  -- Ra ~ 10
    return [reynoldsNum, rayleighNum]

  -- State model (SequentialT(PopulationT) layer)
  latents <- forM (zip obs [0..]) $ \(obsT, t) -> do
    latent <- multivariate params t  -- lolita latent dynamics
    -- Score against observation
    mapM_ (\(o, l) -> score $ Normal l 0.1 `logProb` o) (zip obsT latent)
    return latent

  return (params, latents)

-- Run PMMH
runLolitaPMMH :: IO [([Double], [[Double]])]
runLolitaPMMH = do
  let mcmcCfg = MCMCConfig { numSteps = 1000, numBurnIn = 200 }
  let smcCfg  = SMCConfig  { numParticles = 500, numSteps = 50, resampler = systematic }
  pmmh mcmcCfg smcCfg (lolitaParamPrior) (lolitaStatePrior)
```

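
PMMH's outer MH chain accepts a parameter proposal using the ratio of SMC marginal-likelihood estimates. The accept rule in isolation looks like this (a simplified sketch of the idea, not code from `pmmh` itself; `mhAccept` and its arguments are invented names, `u` a uniform(0,1) draw):

```haskell
-- Metropolis accept step on log marginal likelihoods:
-- accept when log u < min 0 (logZ' - logZ).
mhAccept :: Double -> Double -> Double -> Bool
mhAccept u logZcurrent logZproposed =
  log u < min 0 (logZproposed - logZcurrent)
```

The noisy-but-unbiased SMC estimate of the marginal likelihood is what makes this a *pseudo-marginal* method: the chain still targets the exact parameter posterior.
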
### 4. bayesian-breathing → RMSMC Sequential State Estimation

bayesian-breathing performs respiratory-rate Bayesian estimation. RMSMC adds MCMC rejuvenation:

```haskell
-- RMSMC: breathing rate estimation with particle rejuvenation
breathingRMSMC :: MonadInfer m => [Double] -> m Double
breathingRMSMC observations = do
  -- State: breathing rate evolves over time
  let model = foldM (\rate obs -> do
        rate' <- normal rate 0.05  -- random walk
        factor $ Exp (normalLogProb obs (sin (2*pi*rate')) 0.1)
        return rate') 12.0 observations
  -- RMSMC: particle filter with MCMC moves
  rmsmc (RMSMCConfig { numParticles = 200, numMCMCSteps = 5 }) model
```

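
Rejuvenation pays off when the particle population degenerates; the standard degeneracy measure is effective sample size (a pure sketch of the textbook formula, `effectiveSampleSize` is a name chosen here, and the trigger threshold is not part of this skill's code):

```haskell
-- ESS of a weight vector: equals n for uniform weights and
-- approaches 1 as mass concentrates on a single particle.
effectiveSampleSize :: [Double] -> Double
effectiveSampleSize ws = (sum ws ^ 2) / sum (map (^ 2) ws)
```

A common policy is to run the MCMC moves only when ESS falls below half the particle count.
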
### 5. DensityT → Categorical Composition in ASI

DensityT provides the categorical structure that makes traces composable. Map to ASI's compositional framework:

```haskell
-- DensityT trace as ASI skill composition log
traceToSkillPath :: Trace a -> [SkillInvocation]
traceToSkillPath trace = map toSkillInv (randomVariables trace)
  where
    toSkillInv rv = SkillInvocation
      { skillId  = hashToSkill (rvName rv)
      , trit     = sign (probDensity rv)  -- +1 if density high, -1 if low, 0 neutral
      , logProb  = probDensity rv
      }

-- Conservation check on trace: does the skill path form a valid GF(3) triad?
validateTraceGF3 :: Trace a -> Bool
validateTraceGF3 trace =
  let invocations = traceToSkillPath trace
      trits = map trit invocations
      grouped = chunksOf 3 trits
  in all (\[t1,t2,t3] -> (t1+t2+t3) `mod` 3 == 0) grouped
```

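
The check above leans on `chunksOf` from the `split` package; the same conservation test can be written dependency-free, which also makes the incomplete-trailing-group case explicit (pure sketch; `triads` and `tracePathValid` are illustrative names):

```haskell
-- Group trits into runs of three; a path is valid when every group
-- is a complete triad summing to 0 in GF(3).
triads :: [a] -> [[a]]
triads [] = []
triads xs = take 3 xs : triads (drop 3 xs)

tracePathValid :: [Int] -> Bool
tracePathValid = all (\g -> length g == 3 && sum g `mod` 3 == 0) . triads
```
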
## Concrete Wiring: Haskell ↔ ASI (via babashka/JSON-RPC)

Since asi is polyglot, use JSON-RPC to call monad-bayes from Clojure/babashka:

```clojure
;; babashka: launch GHCi server, call monad-bayes SMC from Clojure
(require '[babashka.process :as p])

(defn run-smc [model-name n-particles observations]
  (let [proc (p/process {:in :pipe :out :string}
               "cabal" "run" "asi-inference-server")]
    (spit (:in proc)
          (str (pr-str {:method "smc"
                        :params {:model model-name
                                 :particles n-particles
                                 :observations observations}})
               "\n"))
    (-> proc :out deref (json/parse-string true))))

```

Or use PyMC as a Python-side equivalent:

```python
# Use PyMC as a monad-bayes equivalent from Python
import pymc as pm
import numpy as np

def gf3_skill_model(observations, n_skills=1360):
    with pm.Model() as model:
        # Skill trit prior: symmetric over {-1, 0, +1}
        trit = pm.Categorical("trit", p=[1/3, 1/3, 1/3])
        trit_val = pm.Deterministic("trit_val", trit - 1)  # map to {-1, 0, 1}

        # Likelihood: GF(3) conservation
        conserved = pm.Deterministic("conserved", pm.math.eq((trit_val % 3), 0))
        pm.Potential("gf3_constraint", pm.math.log(conserved + 1e-10))

        # Sample (PyMC assigns a discrete step method to the categorical trit)
        trace = pm.sample(1000, tune=500, cores=4)
    return trace
```

## Notebook Coverage Gaps (from prior analysis)

Gaps identified in tweag/monad-bayes tutorial coverage:

| Topic | Covered | Missing |
|-------|---------|---------|
| SMC basics | ✅ | — |
| MCMC (MH) | ✅ | — |
| PMMH | partial | Full worked example with real data |
| RMSMC | ❌ | Notebook: sequential state estimation |
| DensityT internals | ❌ | Notebook: trace inspection + density computation |
| GNN-based models | ❌ | Integration with torchdrug/deepchem |
| Physics emulation | ❌ | PMMH for lolita latent parameter inference |
| Information geometry | ❌ | Fisher-Rao metric on posterior manifold |

## Related ASI Skills

- abductive-monte-carlo — MCMC hypothesis sampling; monad-bayes SMC backend
- abductive-repl — interactive Clojure REPL for abductive reasoning
- gay-monte-carlo — GF(3)-colored sampling; WeightedT integration
- bayesian-breathing — sequential respiratory state estimation via RMSMC
- lolita / task#23 — physics emulation; PMMH for latent parameter inference
- dysts — attractor dataset; SMC over chaotic system parameters
- catcolab-causal-loop — causal loop diagrams; Bayesian network composition
- monad-bayes (if exists) — direct Haskell skill wrapper
- vertex-ai-protein-interleave — gnomAD variant-phenotype PMMH models
