forge-lcdl

Adopting forge-lcdl (consumers index)

Forge LCDL ships as the forge_lcdl package, importable from PYTHONPATH or installed via pip. Integration is caller-driven: your service or script configures an LlmEnvProfile (often via read_certificator_profile) and then chooses run_task, TaskRunner, or…
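The caller-driven shape can be sketched as follows. This is illustrative only: the stub functions below stand in for forge_lcdl's real surfaces (a profile loader like read_certificator_profile and a runner like run_task), and their signatures are assumptions, not the actual API. The pattern is simply: read environment → build profile → invoke a task runner.

```python
import os

# Illustrative stand-ins for forge_lcdl surfaces; signatures are assumptions.
def read_profile_from_env(env: dict) -> dict:
    """Collect gateway settings from the environment (stand-in for a profile loader)."""
    return {
        "base_url": env.get("LCDL_BASE_URL", "http://localhost:8000"),
        "model": env.get("LCDL_MODEL", "default"),
    }

def run_task(task: str, payload: dict, profile: dict) -> dict:
    """Stub runner: a real runner would send the task to the LLM gateway."""
    return {"task": task, "model": profile["model"], "ok": True}

profile = read_profile_from_env(dict(os.environ))
result = run_task("llm_boolean_gate", {"question": "Is x positive?"}, profile)
```

The point is that nothing global is configured: the profile is an explicit value the caller threads into whichever runner it picks.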

Canonical overview: WHAT-IS-LCDL.md.

Dependency

Declared in forge-certificators (private index / path install); the PEP 621 pattern for a sibling checkout under Code/:

```toml
[project]
dependencies = ["forge-lcdl>=0.1.0"]
```

Local editable checkout:

```shell
pip install -e ../forge-lcdl
```

Or a path dependency, declared as an optional extra, for experiments:

```toml
[project.optional-dependencies]
lcdl-local = ["forge-lcdl @ file:///../forge-lcdl"]
```

Use git+ssh for private remotes instead of file: where appropriate.
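For a private remote, the same extra can point at a git+ssh URL instead of file:. The host and organization below are placeholders, not real remotes:

```toml
[project.optional-dependencies]
# Placeholder host/org; substitute your private remote.
lcdl-git = ["forge-lcdl @ git+ssh://git@git.example.com/yourorg/forge-lcdl.git"]
```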

The optional sibling forge-lcdl-runtime is a separate repo/wheel (see README § Sibling package); certificators also depend on it for DecisionPack-backed prose helpers.

As-built consumers (forge-certificators)

Primary workspace integration today is forge-certificators. Typical surfaces:

| Consumer area | Path (under `forge-certificators/`) | LCDL surfaces |
| --- | --- | --- |
| Chunk classify, exemplar extractor synthesis, probes, incremental diagnose | `src/forge_certificators/source_ingest/playwright_llm_page_discovery.py` | `run_task` for `pw_chunk_classify`, `pw_extractor_synthesize_exemplar`, `pw_extractor_synthesize_probe`, `pw_incremental_diagnose`; `format_chat_error_message` |
| Phase A page-kind routing (probe + `run_task_fn`) | `src/forge_certificators/source_ingest/core/phase_a.py` | `run_json_contract_task`, `chat_once`, `parse_json_object_lenient`; env `phasea-json` vs `pw_page_kind_route` via `SOURCE_INGEST_PHASE_A_CONTRACT` |
| Phase A fixture bundle HTTP | `scripts/pipeline/phase_a/run_fixture_bundle_http.py` | Imports `run_task`, `read_certificator_profile`; `run_phase_a_scan_route_sync(..., run_task_fn=run_task)` |
| Phase B fixture bundle HTTP | `scripts/pipeline/phase_b/run_fixture_bundle_http.py` | Same `run_task` injection pattern |
| Incremental extractor + prose MCQ augmentation | `src/forge_certificators/source_ingest/incremental_extractor_mc.py`, `src/forge_certificators/source_ingest/lcdl_prose_mcq_incremental.py` | `run_task` for exemplar synthesis; `forge_lcdl_runtime` `PackExecutor` / prose `extract_mcq_items_from_prose` when `OEP_INCREMENTAL_LCDL_TEXT` is on |
| Monte Carlo Phase A strategy search | `scripts/pipeline/experiments/monte/mc_phase_a_strategy_seek.py` | `PYTHONPATH` often includes `../forge-lcdl/src`; `from forge_lcdl import run_task`; `run_phase_a_scan_route_sync(..., run_task_fn=run_task)` |
| PW LLM incremental CLI profiling | `src/forge_certificators/cli/pw_llm_incremental_mc.py` | `read_certificator_profile` (LCDL env alignment) |
| Phase A/B HTTP fixtures + monolithic discover (MCP browser) | `scripts/pipeline/phase_a/run_fixture_bundle_http.py`, `phase_b/run_fixture_bundle_http.py`, `scripts/pipeline/legacy/playwright_llm_discover_extract.py` | `--browser-backend mcp`, `SOURCE_INGEST_BROWSER_BACKEND`; `pip install 'forge-lcdl[mcp]'` + Node/`npx`; see MCP-CLIENT.md and certificators PLAYWRIGHT_LLM_CHUNKED_DISCOVERY.md |

Operator acknowledgement: discovery flows require the source_ingest CLI flag --allow-lcdl (see source_ingest/cli.py).
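The `run_task_fn` injection used by the fixture-bundle scripts boils down to passing the runner as a parameter. The sketch below uses stand-ins (this `run_phase_a_scan_route_sync` is a simplified stub, not the real signature) to show why the same call site serves both production, which injects the real `run_task`, and tests, which inject a fake:

```python
# Simplified stand-in for the fixture scripts' injection pattern; the real
# run_phase_a_scan_route_sync has a richer signature than this stub.
def run_phase_a_scan_route_sync(pages, run_task_fn):
    # Route every page through whichever runner the caller injected.
    return [run_task_fn("pw_page_kind_route", {"page": p}) for p in pages]

def fake_run_task(task, payload):
    # Test double: echoes inputs instead of calling an LLM gateway.
    return {"task": task, "page": payload["page"], "kind": "article"}

routed = run_phase_a_scan_route_sync(["index.html"], run_task_fn=fake_run_task)
```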

High-level client (RAG-ready)

LcdlClient wraps ExecutionEngine: same tasks as run_task, plus optional retriever, policy-driven RAG, verification (including rag.citations), and graph execution. See docs/CLIENT-API.md.

```python
from forge_lcdl import ExecutionPolicy, LcdlClient

client = LcdlClient.from_env(policy=ExecutionPolicy(rag="auto", verification="schema"))
# Optional: retriever=KeywordContextRetriever(repo_path)
result = client.execute("llm_boolean_gate", {"facts": {"x": 1}, "question": "Is x positive?"})
```

Low-level run_task remains unchanged for existing callers.
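One way to read this guarantee: any callable with `run_task`'s call shape can be handed to an existing call site, so callers migrate to the client without rewrites. Everything below is a stub illustrating that compatibility, not the real forge_lcdl API:

```python
# Stubs only: shows that a client-backed callable can replace run_task at an
# unchanged call site, because both share the (task, payload) call shape.
def run_task(task, payload):
    return {"via": "run_task", "task": task}

class StubClient:
    def execute(self, task, payload):
        return {"via": "client", "task": task}

def classify_chunk(chunk, run_task_fn=run_task):
    # Existing callers keep the default; newer callers inject client.execute.
    return run_task_fn("pw_chunk_classify", {"chunk": chunk})

legacy = classify_chunk("<p>…</p>")
modern = classify_chunk("<p>…</p>", run_task_fn=StubClient().execute)
```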

Older notes on incremental migration (the generic helpers in answer/llm.py, the workbench taxonomy scripts under forge-composer-workbench/…) remain valid where code has not yet been ported. For the mechanics-routing narrative, see PLAYWRIGHT-DISCOVERY.md and PAGE-MECHANICS.md.

Other workspace repos (forge-composer and forge-lenses under Code/) do not import forge_lcdl yet; document their integrations here only once their code actually depends on forge_lcdl.

Private repository

Keep forge-lcdl remotes private; do not commit API keys or live gateway URLs into Markdown or tests.