Agent Skills: Crossmodal GF(3) Skill

GF(3) → {Tactile, Auditory, Haptic} universal bridge for accessible color perception

ID: plurigrid/asi/crossmodal-gf3

Install this agent skill locally:

pnpm dlx add-skill https://github.com/plurigrid/asi/tree/HEAD/plugins/asi/skills/crossmodal-gf3

Skill Files


plugins/asi/skills/crossmodal-gf3/SKILL.md

Skill Metadata

Name
crossmodal-gf3
Description
"GF(3) → {Tactile, Auditory, Haptic} universal bridge for accessible color perception"

Crossmodal GF(3) Skill

"Color is not inherently visual. Color is INFORMATION that can be rendered through any sensory modality."

The Universal Bridge

This skill treats GF(3) trits as modality-independent semantic units:

| GF(3) Trit  | Visual    | Tactile     | Auditory   | Haptic    |
|-------------|-----------|-------------|------------|-----------|
| MINUS (−1)  | Cool hues | Rough/Bumpy | Low pitch  | Left/Down |
| ERGODIC (0) | Neutral   | Smooth      | Mid pitch  | Center    |
| PLUS (+1)   | Warm hues | Ridged      | High pitch | Right/Up  |
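The trit-to-modality mapping above can be expressed as a small lookup table. A minimal sketch in Julia — the names `MODALITY_MAP` and `modality_render` are illustrative, not part of the skill's API:

```julia
# Renderings of each GF(3) trit per modality, mirroring the table above.
const MODALITY_MAP = Dict(
    -1 => (visual = "cool hues", tactile = "rough/bumpy", auditory = "low pitch",  haptic = "left/down"),
     0 => (visual = "neutral",   tactile = "smooth",      auditory = "mid pitch",  haptic = "center"),
     1 => (visual = "warm hues", tactile = "ridged",      auditory = "high pitch", haptic = "right/up"),
)

# Project a single trit into the requested modality.
modality_render(trit::Int, modality::Symbol) = getproperty(MODALITY_MAP[trit], modality)
```

For example, `modality_render(1, :auditory)` returns `"high pitch"`.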

Key Insight

Visual perception is ALSO a projection from GF(3) space.

The sighted user doesn't have "the real thing" — they have π_visual(GF3). The blind user has π_tactile(GF3), π_auditory(GF3), π_haptic(GF3).

All projections are isomorphic under GF(3) conservation:

π_visual(W) ≅ π_tactile(W) ≅ π_auditory(W) ≅ π_haptic(W)

Implementation Files

From Gay.jl:

  1. world_tactile_color.jl — Core tactile/auditory/haptic types
  2. world_accessible_tensor.jl — A ⊗ G ⊗ M ⊗ T tensor product
  3. world_accessible_interrupt_operad.jl — TOAD × Amp × Knight Tour accessibility

Tactile: 3×3 Braille Extension

Standard Braille is 2×3 (6 dots). We extend to 3×3 (9 dots):

┌───┬───┬───┐
│ 1 │ 2 │ 3 │  ← Hue sector (warm/neutral/cool)
├───┼───┼───┤
│ 4 │ 5 │ 6 │  ← Saturation level
├───┼───┼───┤
│ 7 │ 8 │ 9 │  ← Lightness level
└───┴───┴───┘

Dot positions encode trits:

  • Left dot: MINUS (−1)
  • Center dot: ERGODIC (0)
  • Right dot: PLUS (+1)

Example: Warm, Vivid, Light Color

○ ○ ⬤   ← Hue: PLUS (warm)
○ ○ ⬤   ← Saturation: PLUS (vivid)
○ ○ ⬤   ← Lightness: PLUS (light)

Compact: +++
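A 9-dot cell like the one above can be generated mechanically from a trit triple. A sketch — the function name `braille3x3` is an assumption, not taken from the implementation files:

```julia
# One row per channel (hue, saturation, lightness); within a row the raised
# dot sits left for MINUS (-1), center for ERGODIC (0), right for PLUS (+1).
function braille3x3(hue::Int, sat::Int, light::Int)
    row(t) = join([t == pos ? "⬤" : "○" for pos in (-1, 0, 1)], " ")
    join([row(hue), row(sat), row(light)], "\n")
end

print(braille3x3(1, 1, 1))  # reproduces the warm/vivid/light example above
```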

Auditory: 3-Tone Chords

Each color becomes a 3-frequency chord:

const BASE_FREQ = 440.0  # A4

function trit_to_freq_ratio(t::Int)::Float64
    t == -1 && return 0.84   # down a minor third (2^(-3/12) ≈ 0.841)
    t == 0  && return 1.0    # unison
    t == 1  && return 1.26   # up a major third (2^(4/12) ≈ 1.260)
    error("invalid GF(3) trit: $t")
end

function color_to_chord(hue_trit, sat_trit, light_trit)
    (
        BASE_FREQ * trit_to_freq_ratio(hue_trit),
        BASE_FREQ * 1.5 * trit_to_freq_ratio(sat_trit),   # Fifth
        BASE_FREQ * 2.0 * trit_to_freq_ratio(light_trit)  # Octave
    )
end

Sox Sonification

# Play a single color as chord
play -n synth 0.25 sine 440.0 : synth 0.25 sine 660.0 : synth 0.25 sine 880.0 remix - fade 0 0.25 0.05
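The `play` invocation can be derived from any chord tuple produced by `color_to_chord`. A sketch — `chord_play_cmd` is an illustrative helper, and it assumes SoX's `play` is on the PATH:

```julia
# Build the SoX `play` command string for a 3-frequency chord.
function chord_play_cmd(chord::NTuple{3,Float64}; dur = 0.25)
    synths = join(("synth $dur sine $f" for f in chord), " : ")
    "play -n $synths remix - fade 0 $dur 0.05"
end

# The neutral (all-ERGODIC) chord reproduces the command above:
chord_play_cmd((440.0, 660.0, 880.0))
```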

Haptic: 3D Position + Vibration

struct HapticPosition
    x::Float64  # Hue: -1 (left) to +1 (right)
    y::Float64  # Saturation: -1 (back) to +1 (front)
    z::Float64  # Lightness: -1 (down) to +1 (up)
    vibration_pattern::Symbol  # :pulse, :smooth, :buzz
    intensity::Float64  # 0.0 to 1.0
end
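One plausible projection from a trit triple into this struct (a sketch; the pattern and intensity choices are assumptions, not taken from the implementation files — the struct is repeated here so the snippet is runnable on its own):

```julia
struct HapticPosition
    x::Float64                 # hue axis
    y::Float64                 # saturation axis
    z::Float64                 # lightness axis
    vibration_pattern::Symbol  # :pulse, :smooth, :buzz
    intensity::Float64         # 0.0 to 1.0
end

# Sketch: place each trit directly on its axis, pick a pattern by hue trit,
# and rescale the saturation trit from [-1, 1] to [0, 1] for intensity.
function trits_to_haptic(hue::Int, sat::Int, light::Int)
    pattern = hue == -1 ? :pulse : hue == 0 ? :smooth : :buzz
    HapticPosition(float(hue), float(sat), float(light), pattern, (sat + 1) / 2)
end
```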

Möbius Invertibility

Path navigability analysis using μ(path_length):

| μ(path_length) | Classification | Response |
|----------------|----------------|----------|
| μ ≠ 0 | GEODESIC | Smooth bidirectional tactile navigation |
| μ = 0 | TANGLED  | Add branching markers at squared prime positions |

I-Thou Framework

  • Geodesic paths: Treat blind user as capable of autonomous navigation (I-Thou)
  • Tangled paths: Restructure the path, not the user (I-Thou), vs "accommodate" (I-It)

μ(3) = -1: Action-Perception Duality

The Möbius value μ(3) = -1 creates:

Action:     {−, ○, +}   (visual domain)
Perception: {+, ○, −}   (tactile domain, Möbius-inverted)

Double inversion returns to original: μ ∘ μ = id.
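With balanced trit representatives {−1, 0, +1}, this inversion is just negation, so the identity μ ∘ μ = id can be checked directly (a toy sketch; `invert` is an illustrative name):

```julia
# Möbius-style inversion on balanced GF(3) trits: a sign flip.
invert(t::Int) = -t

action     = (-1, 0, 1)          # visual domain
perception = invert.(action)     # tactile domain, Möbius-inverted: (1, 0, -1)
invert.(perception) == action    # double inversion returns to the original
```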

GF(3) Triadic Integration

| Component | Trit | Role |
|-----------|------|------|
| catsharp-sonification | 0 | Auditory output |
| crossmodal-gf3 | 0 | Universal bridge |
| sense | 0 | Content extraction |

Conservation: 0 + 0 + 0 = 0 ✓ (all ERGODIC coordinators)

For generators/validators:

elevenlabs-acset (-1) ⊗ crossmodal-gf3 (0) ⊗ gesture-hypergestures (+1) = 0 ✓
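Both conservation checks reduce to a trit sum mod 3. A one-line sketch (`conserved` is an illustrative name):

```julia
# GF(3) conservation: the trit sum must vanish mod 3.
conserved(trits) = mod(sum(trits), 3) == 0

conserved((0, 0, 0))    # coordinators: true
conserved((-1, 0, 1))   # generator / bridge / validator: true
```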

Commands

# Generate accessible color palette (Julia)
julia -e 'using Gay; Gay.gay_seed!(137508); colors = [Gay.next_color() for _ in 1:6]'

# Sonify via catsharp-sonification
bb -e '(sonify-palette ["#DD3C3C" "#3CDD6B" "#9A3CDD"])'

# Extract content accessibly
just sense-extract reference/videos/lecture.mkv

Related Skills

  • catsharp-sonification — Hue → Pitch Class → Waveform
  • sense — Video → Subtitle + Diagram + Skill Index
  • buberian-relations — I-Thou / I-It / We formalization
  • gesture-hypergestures — Continuous curves for topology teaching
  • moebius-inversion — Path invertibility analysis
  • gay-julia — Wide-gamut color with SplitMix64

Theorem

Accessible Worlds Theorem:

For any world W with visual representation V:

π_visual(W) ≅ π_tactile(W) ≅ π_auditory(W) ≅ π_haptic(W)

where all projections π preserve GF(3) conservation:

∀ modality m: Σ(trits(π_m(W))) ≡ 0 (mod 3)

"The most unlike skills are the most essential - they bridge what others cannot reach."