
# Template gallery

The fastest path from "I want to try `dlm`" to a trained adapter is `dlm init --template <name>`. Each template is a curated `.dlm` file with a sensible base model, body content to train on, and sample prompts to exercise after training.

## Listing the gallery

```bash
$ dlm templates list
changelog                 Changelog entry generator                (smollm2-360m)
coding-tutor              Coding tutor (Python, curated)           (qwen2.5-coder-1.5b)
domain-kb                 Domain knowledge base                    (qwen2.5-3b)
meeting-notes-summarizer  Meeting notes → decision log             (qwen2.5-3b)
personal-assistant        Personal assistant                       (qwen2.5-1.5b)
regex-buddy               Regex explainer                          (qwen2.5-coder-1.5b)
shell-one-liner           Shell one-liner helper                   (qwen2.5-coder-1.5b)
writing-partner           Writing partner (stylistic continuation) (llama-3.2-3b)
```

Pass `--json` for machine-readable output (every field of the template's `meta.yaml`, including `domain_tags`, `expected_steps`, `expected_duration`, and `sample_prompts`).
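For scripting, that JSON can be consumed directly. A minimal sketch — the payload shape below (a list of objects mirroring each template's `meta.yaml`) is an assumption for illustration, not a documented schema:

```python
import json

# Assumed --json payload: a list of objects, one per template, carrying
# the meta.yaml fields. Shape is illustrative, not dlm's documented schema.
raw = """
[
  {
    "name": "coding-tutor",
    "title": "Coding tutor (Python, curated)",
    "domain_tags": ["code", "python", "tutor"],
    "recommended_base": "qwen2.5-coder-1.5b",
    "expected_steps": 800,
    "sample_prompts": ["What are Python decorators?"]
  }
]
"""

templates = json.loads(raw)
# e.g. filter to code-related templates for a picker
code_templates = [t["name"] for t in templates if "code" in t["domain_tags"]]
```

In practice you would pipe `dlm templates list --json` into a script like this instead of hard-coding the payload.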

## Creating a new `.dlm` from a template

```bash
$ dlm init mytutor.dlm --template coding-tutor
init: wrote mytutor.dlm from template coding-tutor (Coding tutor
(Python, curated)) — base qwen2.5-coder-1.5b.

$ dlm train mytutor.dlm
$ dlm prompt mytutor.dlm "What are Python decorators?"
```

The template's `recommended_base` is adopted automatically. If you pass `--base` alongside `--template`, the template wins (and you get a note); the bundled body was authored against its recommended base, so swapping bases mid-body is an advanced move — edit the frontmatter yourself after `init` if that's what you want.

## What's in a template

Each template is a pair of files:

```
coding-tutor.dlm        # the `.dlm` body: frontmatter + sections
coding-tutor.meta.yaml  # metadata — name, title, tags, recommended_base, summary
```

The `meta.yaml` schema:

```yaml
name: coding-tutor               # filename stem, required to match
title: Coding tutor (Python, curated)
domain_tags: [code, python, tutor]
recommended_base: qwen2.5-coder-1.5b
expected_steps: 800              # rough step count at defaults
expected_duration:               # hardware-tier → wall-clock estimate
  cuda-sm80+: "~5 min"
  mps: "~15 min"
  cpu: "~2 hr"
summary: |
  A compact Python-focused Q&A tutor.
sample_prompts:
  - "What are Python decorators?"
```

Invalid metadata (unknown keys, missing required fields, sidecar absent) drops the template from the listing with a logged warning — never silently served.
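The validation described above might look like the sketch below; the required/allowed key sets are inferred from the schema shown earlier, not taken from dlm's source:

```python
# Inferred from the meta.yaml schema above; not dlm's actual key sets.
REQUIRED = {"name", "title", "recommended_base", "summary"}
ALLOWED = REQUIRED | {"domain_tags", "expected_steps",
                      "expected_duration", "sample_prompts"}

def validate_meta(meta: dict, stem: str) -> list[str]:
    """Return reasons to drop the template; an empty list means it's listed."""
    problems = [f"unknown key: {k}" for k in meta if k not in ALLOWED]
    problems += [f"missing required key: {k}" for k in REQUIRED - meta.keys()]
    if meta.get("name") != stem:
        problems.append(f"name {meta.get('name')!r} != filename stem {stem!r}")
    return problems
```

Each non-empty result would be logged as a warning and the template skipped, matching the "never silently served" rule.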

## Where the gallery lives

The bundled gallery ships inside the `dlm` package at `src/dlm/templates/gallery/`. `importlib.resources.files` resolves the path regardless of whether you're running from a source checkout or a `pip install`-ed wheel.
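The resolution step uses the stdlib API named above; in this runnable sketch the stdlib `email` package stands in for `dlm`, since the mechanism is identical either way:

```python
from importlib.resources import files

# dlm itself would do roughly: files("dlm") / "templates" / "gallery".
# Using a stdlib package here so the sketch runs without dlm installed.
pkg_root = files("email")      # Traversable for the package's root
mime_dir = pkg_root / "mime"   # segments join like pathlib paths
assert mime_dir.is_dir()       # resolves for both wheels and checkouts
```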

## Remote gallery — deferred

The sprint spec calls for `dlm templates list --refresh` to pull from `github.com/<org>/dlm-templates` with signed-tag verification against a pinned public key. That upstream repo and signing key are pending; until they land, `--refresh` emits a clear warning and falls back to the bundled gallery. All usage here works offline today.
