# Testing

Testing strategy for the project.


## Unit Tests

Scope: Individual functions and classes

Location: */tests/unit/

Tools: Jest/Vitest

Focus:

  • CRDT state manager
  • Permission checking
  • Audit logger

## Integration Tests

Scope: Component interactions

Location: */tests/integration/

Tools: Jest + Docker

Focus:

  • Sync daemon ↔ Yjs server
  • Plugin ↔ state files
  • Orchestrator ↔ IdP

## End-to-End Tests

Scope: Full user flows

Location: tests/e2e/

Tools: Playwright (for web UI)

Focus:

  • Two users editing simultaneously
  • Vault access control
  • Session persistence

| Scenario | Expected |
| --- | --- |
| User A edits, User B opens | B sees A’s changes |
| Both edit same paragraph | Merged content |
| Both edit different sections | Both changes preserved |
| Large file sync | Completes in < 5s |
| Binary attachment | Synced via MinIO |
| Scenario | Expected |
| --- | --- |
| User with access | Vault mounts |
| User without access | Vault not visible |
| User group changes | Access reflects on next session |
| Invalid token | Access denied |
| Scenario | Expected |
| --- | --- |
| User opens file | Others see presence |
| User closes file | Presence clears |
| User moves cursor | Position updates |

Obsidian can’t be automated headlessly, so plugin testing requires a manual loop:

  1. Build the plugin
  2. Install it in a test vault
  3. Open Obsidian
  4. Test manually

To keep that manual surface small, the strategy is:

  1. Unit test core logic - state manager, identity, sync logic
  2. Mock the Obsidian API - for integration tests
  3. Manual testing - for UI and file hooks
  4. Debug log file - for persistent debugging

Location: obsidian-plugin/test-vault/

Pre-configured with:

  • Sample markdown files
  • Plugin enabled
  • Debug logging on

`.github/workflows/test.yml`:

```yaml
on: [push, pull_request]
jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Run sync-daemon tests
        run: cd sync-daemon && npm test
      - name: Run orchestrator tests
        run: cd orchestrator && npm test
```
Additional checks:

  • Lint (ESLint)
  • Type check (tsc)
  • Unit tests

| Vault | Purpose |
| --- | --- |
| demo | Basic testing |
| large-vault | Performance testing (1000+ files) |
| conflict-test | Conflict resolution scenarios |
| User | Groups | Purpose |
| --- | --- | --- |
| alice | engineering | Standard user |
| bob | engineering, devops | Multi-group |
| carol | finance | Different department |
| admin | all | Admin testing |

  • Sync latency (target: < 5s)
  • VNC latency (target: < 200ms)
  • File open time (target: < 1s)
  • Memory usage (target: < 500MB/user)

Tools:

  • k6 for load testing
  • Grafana for monitoring

For testing CRDT sync with concurrent edits, we run multiple Obsidian instances sharing the same filesystem.

Compose file: docker-compose.test.yml

```sh
# Start multi-instance test environment
docker-compose -f docker-compose.test.yml up -d

# Alice: https://localhost:3101
# Bob: https://localhost:3201
```

Three approaches (layered by complexity):

| Approach | What It Tests | Infrastructure |
| --- | --- | --- |
| A: Docker-Only | Our CRDT sync, presence, merge | Two containers + Yjs server |
| B: Docker + Tailscale | Above + Obsidian Sync device identity | Add Tailscale sidecars + unique machine IDs |
| C: Separate Hosts | True multi-device over WireGuard mesh | Multiple machines on tailnet |

What makes instances “different”:

  • Plugin client ID: auto-generated UUID v4 per install (identity.ts)
  • Sync daemon USER_ID: set via environment variable per container
  • WebSocket connections: separate per daemon process
  • For Obsidian Sync: unique /var/lib/dbus/machine-id + Tailscale identity

Shared vault: vaults/shared-test/ — bind-mounted into both containers. Per-instance config: Separate Docker volumes for /config (isolates .obsidian/).
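Putting the pieces above together, a compose fragment might look like the following sketch (image names, container ports, and the `npx y-websocket` server command are assumptions; only the shared bind mount, per-instance `/config` volumes, and per-container `USER_ID` come from the setup described above):

```yaml
# docker-compose.test.yml (sketch, not the project's actual file)
services:
  yjs-server:
    image: node:20-alpine
    command: npx y-websocket        # assumed server entry point
    environment:
      - PORT=1234
  alice:
    image: obsidian-remote          # hypothetical desktop-in-browser image
    environment:
      - USER_ID=alice               # per-container sync-daemon identity
    ports: ["3101:3000"]
    volumes:
      - ./vaults/shared-test:/vault # shared vault, bind-mounted into both
      - alice-config:/config        # per-instance .obsidian/ isolation
  bob:
    image: obsidian-remote
    environment:
      - USER_ID=bob
    ports: ["3201:3000"]
    volumes:
      - ./vaults/shared-test:/vault
      - bob-config:/config
volumes:
  alice-config:
  bob-config:
```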

See 2026-03-28-multi-instance-sync-testing for full analysis and Tailscale integration details.


Known limitations:

  1. No headless Obsidian - can’t automate plugin UI tests
  2. VNC testing - requires visual verification
  3. CRDT determinism - hard to test edge cases