forked from mirror/oddmu
Compare commits
62 Commits
| SHA1 |
|---|
| 0696603f69 |
| 987d859431 |
| 36d40dff50 |
| 10d055c117 |
| 01dad0b932 |
| f5be60d026 |
| f0d814b8f3 |
| 41f6d9e48e |
| 7beaf3e375 |
| 8f277714d7 |
| 11eace1b3c |
| ad778a3068 |
| 6ec69d27eb |
| 0a093182e9 |
| 9952f2363b |
| 4d14517668 |
| 0051a5ca66 |
| 2436cc1114 |
| 96993b794a |
| 991260b78c |
| 751b9fe63d |
| 34afc151a4 |
| efbea969fa |
| e12ab8594c |
| 233b9817b5 |
| ecfee31cbd |
| e86de1beb8 |
| e9666a5ec5 |
| f71a0e9780 |
| aca1d82fe0 |
| 290ad16e09 |
| ad1732f57f |
| 67120af7cc |
| c186253a25 |
| d0d4545f74 |
| c32f087af4 |
| f657ac60a3 |
| 16ae6cc143 |
| 19a1ea6efe |
| e041c5ecae |
| 3078d63890 |
| 143ecb8a0a |
| d66aa03a2d |
| 64954ddf5d |
| a1d6ebfdff |
| db3a3f5009 |
| ece9649e3d |
| 23074cdd58 |
| 06c07209a2 |
| 7b2a835729 |
| d0fe534f8e |
| ac7de17a87 |
| 84e6a757b2 |
| 2dfb2afbf5 |
| 2092b5777c |
| f635cb738a |
| da398a3315 |
| 7315abd5bb |
| b39901b244 |
| bb4843c2f4 |
| 816c981200 |
| 89d550a1a4 |
2 LICENSE

@@ -1,4 +1,4 @@
-This software is Copyright (c) 2015–2024 by Alex Schroeder.
+This software is Copyright (c) 2015–2026 by Alex Schroeder.
 
 This is free software, licensed under:
34 Makefile

@@ -1,7 +1,8 @@
-SHELL=/bin/bash
+SHELL=/usr/bin/env bash
 PREFIX=${HOME}/.local
+BINARIES=oddmu-linux-amd64.tar.gz oddmu-linux-arm64.tar.gz oddmu-darwin-amd64.tar.gz oddmu-darwin-arm64.tar.gz oddmu-windows-amd64.tar.gz
 
-.PHONY: help build test run upload docs install priv
+.PHONY: help build test run upload docs install priv clean dist dist-upload
 
 help:
 	@echo Help for Oddmu
@@ -10,6 +11,10 @@ help:
 	@echo " runs program, offline"
 	@echo make test
 	@echo " runs the tests without log output"
+	@echo make check
+	@echo " checks the code with golint and gocritic"
+	@echo make fix
+	@echo " fixes formatting issues with goimports instead of go fmt"
 	@echo make docs
 	@echo " create man pages from text files"
 	@echo make build
@@ -34,6 +39,17 @@ test:
 	rm -rf testdata/*
 	go test -shuffle on .
 
+# go install golang.org/x/lint/golint@latest
+# go install github.com/go-critic/go-critic/cmd/go-critic@latest
+check:
+	golint
+	go-critic check
+
+# go install golang.org/x/tools/cmd/goimports@latest
+fix:
+	goimports -w *.go
+
+
 run:
 	go run .
 
@@ -50,10 +66,13 @@ install:
 	install -D -t ${PREFIX}/bin oddmu
 
 clean:
-	rm --force oddmu oddmu.exe oddmu-{linux,darwin,windows}-{amd64,arm64}{,.tar.gz}
+	rm -f oddmu oddmu.exe oddmu-{linux,darwin,windows}-{amd64,arm64}{,.tar.gz}
 	cd man && make clean
 
-dist: oddmu-linux-amd64.tar.gz oddmu-linux-arm64.tar.gz oddmu-darwin-amd64.tar.gz oddmu-windows-amd64.tar.gz
+dist-upload: $(BINARIES)
 	rsync -ai $^ sibirocobombus:alexschroeder.ch/wiki/oddmu/
 
+dist: $(BINARIES)
+
 oddmu-linux-amd64: *.go
 	GOOS=linux GOARCH=amd64 go build -o $@
@@ -62,6 +81,9 @@ oddmu-linux-arm64: *.go
 	env GOOS=linux GOARCH=arm64 GOARM=5 go build -o $@
 
 oddmu-darwin-amd64: *.go
 	GOOS=darwin GOARCH=amd64 go build -o $@
 
+oddmu-darwin-arm64: *.go
+	GOOS=darwin GOARCH=arm64 go build -o $@
+
 oddmu.exe: *.go
@@ -73,8 +95,8 @@ oddmu-windows-amd64.tar.gz: oddmu.exe
 	$< *.md man/*.[157].{html,md} themes/
 
 %.tar.gz: %
-	tar --create --file $@ --transform='s/^$</oddmu/' --transform='s/^/oddmu\//' --exclude='*~' \
-	$< *.html Makefile *.socket *.service *.md man/Makefile man/*.1 man/*.5 man/*.7 themes/
+	tar --create --gzip --file $@ --transform='s/^$</oddmu/' --transform='s/^/oddmu\//' --exclude='*~' \
+	$< *.html Makefile *.socket *.service *.md man/Makefile man/*.[157] themes/
 
 priv:
 	sudo setcap 'cap_net_bind_service=+ep' oddmu
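The `%.tar.gz` rule above relies on GNU tar's `--transform` option (BSD/macOS tar lacks it). A minimal sketch of what the two substitutions do, using illustrative file names rather than the Makefile's actual prerequisites: the first transform renames the bare binary to `oddmu`, the second prefixes every member with `oddmu/`, so the archive unpacks into a single directory.

```shell
# Sketch only: demonstrates the two chained --transform substitutions.
dir=/tmp/oddmu-tar-demo
mkdir -p "$dir" && cd "$dir"
printf 'hello\n' > README.md
printf 'binary\n' > oddmu-linux-amd64
# First transform: rename the binary member to "oddmu".
# Second transform: prefix every member with "oddmu/".
tar --create --gzip --file demo.tar.gz \
    --transform='s/^oddmu-linux-amd64$/oddmu/' \
    --transform='s/^/oddmu\//' \
    --exclude='*~' \
    oddmu-linux-amd64 README.md
tar --list --file demo.tar.gz
```

Listing the archive shows the members stored as `oddmu/oddmu` and `oddmu/README.md`, which is why extraction yields one `oddmu/` directory.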
33 README.md

@@ -8,20 +8,18 @@ with Markdown files, turning them into HTML files. HTML templates
 allow the customisation of headers, footers and styling. There are no
 plugins.
 
-Oddμ is well suited as a self-hosted, single-user web application,
-when there is no need for collaboration on the site itself. Links and
-email connect you to the rest of the net. The wiki can be public or
-private.
+Oddμ is well suited as a self-hosted, single-user web application.
+Edit the pages from your phone or laptop, while you're on the move.
 
 If the site is public, use a regular web server as a proxy to make
 people log in before making changes. As there is no version history,
-it is not possible to undo vandalism and spam. Only grant write-access
+it is not easy to undo vandalism and spam. Only grant write-access
 to people you trust.
 
 If the site is private, running on a local machine and unreachable
 from the Internet, no such precautions are necessary.
 
-Oddμ is well suited as a secondary medium for a close-knit group:
+This makes Oddμ well suited as a secondary medium for a close-knit group:
 collaboration and conversation happens elsewhere, in chat, on social
 media. The wiki serves as the text repository that results from these
 discussions.
@@ -43,7 +41,7 @@ Other files can be uploaded and images (ending in `.jpg`, `.jpeg`,
 
 ## Documentation
 
-This project uses man(1) pages. They are generated from text files
+This project uses man pages. They are generated from text files
 using [scdoc](https://git.sr.ht/~sircmpwn/scdoc). These are the files
 available:
@@ -67,6 +65,11 @@ changes.
 This man page documents the "version" subcommand which you can use to
 get the installed Oddmu version.
 
+[oddmu-man(1)](https://alexschroeder.ch/view/oddmu/oddmu-man.1): Oddmu
+comes with a "man" subcommand to print the manual pages. This man page
+documents the subcommand, but I guess if you can read the man page,
+you don't need the "man" subcommand.
+
 Working locally:
 
 [oddmu-links(1)](https://alexschroeder.ch/view/oddmu/oddmu-links.1):
@@ -108,6 +111,14 @@ Static site generator:
 This man page documents the "html" subcommand to generate HTML from
 Markdown pages from the command line.
 
+[oddmu-feed(1)](https://alexschroeder.ch/view/oddmu/oddmu-feed.1):
+This man page documents the "feed" subcommand to generate a feed from
+Markdown pages from the command line.
+
+[oddmu-sitemap(1)](https://alexschroeder.ch/view/oddmu/oddmu-sitemap.1):
+This man page documents the "sitemap" subcommand to generate the
+static sitemap from the command line.
+
 [oddmu-static(1)](https://alexschroeder.ch/view/oddmu/oddmu-static.1):
 This man page documents the "static" subcommand to generate an entire
 static website from the command line, avoiding the need to run Oddmu
@@ -214,6 +225,13 @@ into `$HOME/.local/share/man/`.
 make install
 ```
 
+This installs `oddmu` into `/usr/local/bin` and the manual pages into
+`/usr/local/share/man`:
+
+```sh
+sudo make install PREFIX=/usr/local
+```
+
 Here's an example using [GNU Stow](https://www.gnu.org/software/stow/)
 to install it into `/usr/local/stow` in a way that allows you to
 uninstall it later:
@@ -253,6 +271,7 @@ high-level introduction to the various source files.
 - `preview.go` implements the `/preview` handler
 - `score.go` implements the page scoring when showing search results
 - `search.go` implements the `/search` handler
+- `sitemap.go` implements the `/sitemap` handler
 - `snippets.go` implements the page summaries for search results
 - `templates.go` implements template loading and reloading
 - `tokenizer.go` implements the various tokenizers used
16 RELEASE

@@ -1,22 +1,22 @@
 When preparing a new release
 ----------------------------
 
-1. Run tests
+1. run tests
 
-2. Update man/oddmu-releases.7.txt
+2. update man/oddmu-releases.7.txt
    - add missing items
    - change "(unreleased)"
 
-3. make docs
+3. check copyright year in LICENSE
 
-4. Make sure all files are checked in
+4. make docs
 
-5. Tag the release and push the tag to all remotes
+5. make sure all files are checked in
 
-6. cd man && make upload
+6. tag the release and push the tag to the remote
 
 7. make dist
 
-8. create a new release at https://github.com/kensanata/oddmu/releases
+8. make dist-upload
 
-9. upload the four .tar.gz binaries to the GitHub release
+9. cd man && make upload
16 accounts.go

@@ -2,13 +2,14 @@ package main
 
 import (
 	"encoding/json"
-	"github.com/gomarkdown/markdown/ast"
-	"github.com/gomarkdown/markdown/parser"
 	"io"
 	"log"
 	"net/http"
 	"os"
 	"sync"
+
+	"github.com/gomarkdown/markdown/ast"
+	"github.com/gomarkdown/markdown/parser"
 )
 
 // useWebfinger indicates whether Oddmu looks up the profile pages of fediverse accounts. To enable this, set the
@@ -38,7 +39,7 @@ func init() {
 
 // accountLink links a social media accountLink like @accountLink@domain to a profile page like https://domain/user/accountLink. Any
 // accountLink seen for the first time uses a best guess profile URI. It is also looked up using webfinger, in parallel. See
-// lookUpAccountUri. If the lookup succeeds, the best guess is replaced with the new URI so on subsequent requests, the
+// lookUpAccountURI. If the lookup succeeds, the best guess is replaced with the new URI so on subsequent requests, the
 // URI is correct.
 func accountLink(p *parser.Parser, data []byte, offset int) (int, ast.Node) {
 	data = data[offset:]
@@ -56,9 +57,8 @@ func accountLink(p *parser.Parser, data []byte, offset int) (int, ast.Node) {
 			if d != 0 {
 				// more than one @ is invalid
 				return 0, nil
-			} else {
-				d = i + 1 // skip @ of domain
 			}
+			d = i + 1 // skip @ of domain
 		}
 		i++
 	}
@@ -79,7 +79,7 @@ func accountLink(p *parser.Parser, data []byte, offset int) (int, ast.Node) {
 		log.Printf("Looking up %s\n", account)
 		uri = "https://" + string(domain) + "/users/" + string(user[1:])
 		accounts.uris[string(account)] = uri // prevent more lookings
-		go lookUpAccountUri(string(account), string(domain))
+		go lookUpAccountURI(string(account), string(domain))
 	}
 	link := &ast.Link{
 		AdditionalAttributes: []string{`class="account"`},
@@ -90,9 +90,9 @@ func accountLink(p *parser.Parser, data []byte, offset int) (int, ast.Node) {
 	return i, link
 }
 
-// lookUpAccountUri is called for accounts that haven't been seen before. It calls webfinger and parses the JSON. If
+// lookUpAccountURI is called for accounts that haven't been seen before. It calls webfinger and parses the JSON. If
 // possible, it extracts the link to the profile page and replaces the entry in accounts.
-func lookUpAccountUri(account, domain string) {
+func lookUpAccountURI(account, domain string) {
 	uri := "https://" + domain + "/.well-known/webfinger"
 	resp, err := http.Get(uri + "?resource=acct:" + account)
 	if err != nil {
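The lookup above issues a webfinger GET against `/.well-known/webfinger`. A sketch of just the URI construction: `webfingerURL` is a hypothetical helper, not the repo's code, and unlike the concatenation in `lookUpAccountURI` it query-escapes the resource parameter; the HTTP request and JSON parsing are omitted.

```go
package main

import (
	"fmt"
	"net/url"
)

// webfingerURL builds the request URI used to look up a fediverse
// account's profile page. The resource is "acct:user@domain", escaped
// for use in a query string (an extra precaution not taken above).
func webfingerURL(account, domain string) string {
	return "https://" + domain + "/.well-known/webfinger?resource=" +
		url.QueryEscape("acct:"+account)
}

func main() {
	// Illustrative account and domain.
	fmt.Println(webfingerURL("alex@social.example", "social.example"))
}
```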
@@ -1,8 +1,9 @@
 package main
 
 import (
-	"github.com/stretchr/testify/assert"
 	"testing"
+
+	"github.com/stretchr/testify/assert"
 )
 
 func TestWebfingerParsing(t *testing.T) {
@@ -48,15 +48,17 @@ func appendHandler(w http.ResponseWriter, r *http.Request, name string) {
 			return
 		}
 	}
-	http.Redirect(w, r, "/view/" + nameEscape(name), http.StatusFound)
+	http.Redirect(w, r, "/view/"+nameEscape(name), http.StatusFound)
 }
 
 func (p *Page) append(body []byte) {
 	// ensure an empty line at the end
-	if bytes.HasSuffix(p.Body, []byte("\n\n")) {
-	} else if bytes.HasSuffix(p.Body, []byte("\n")) {
+	switch {
+	case bytes.HasSuffix(p.Body, []byte("\n\n")):
+		// two newlines, nothing to add
+	case bytes.HasSuffix(p.Body, []byte("\n")):
 		p.Body = append(p.Body, '\n')
-	} else {
+	default:
 		p.Body = append(p.Body, '\n', '\n')
 	}
 	p.Body = append(p.Body, body...)
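The switch-based refactoring of `append()` above replaces an empty-bodied `if` branch with labelled cases. A stand-alone sketch of the same idiom (`ensureBlankLine` is a hypothetical free function, not the repo's method):

```go
package main

import (
	"bytes"
	"fmt"
)

// ensureBlankLine appends newlines so that body ends with exactly one
// blank line, mirroring the switch in (*Page).append above.
func ensureBlankLine(body []byte) []byte {
	switch {
	case bytes.HasSuffix(body, []byte("\n\n")):
		// two newlines already: nothing to add
	case bytes.HasSuffix(body, []byte("\n")):
		body = append(body, '\n')
	default:
		body = append(body, '\n', '\n')
	}
	return body
}

func main() {
	// All three inputs end up terminated by a blank line.
	fmt.Printf("%q\n", ensureBlankLine([]byte("a")))
	fmt.Printf("%q\n", ensureBlankLine([]byte("a\n")))
	fmt.Printf("%q\n", ensureBlankLine([]byte("a\n\n")))
}
```

Compared with the original `if`/`else if` chain, the switch makes the deliberately empty first branch explicit instead of leaving a suspicious-looking empty block.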
@@ -1,13 +1,14 @@
 package main
 
 import (
-	"github.com/stretchr/testify/assert"
 	"net/http"
 	"net/url"
 	"os"
 	"regexp"
 	"testing"
 	"time"
+
+	"github.com/stretchr/testify/assert"
 )
 
 func TestEmptyLineAdd(t *testing.T) {
@@ -2,11 +2,12 @@ package main
 
 import (
 	"archive/zip"
-	"github.com/stretchr/testify/assert"
 	"net/http"
 	"os"
 	"strings"
 	"testing"
+
+	"github.com/stretchr/testify/assert"
 )
 
 func TestArchive(t *testing.T) {
14 changes.go

@@ -38,7 +38,7 @@ func (p *Page) notify() error {
 			return err
 		}
 	}
-	p.renderHtml() // to set hashtags
+	p.renderHTML() // to set hashtags
 	for _, hashtag := range p.Hashtags {
 		err := addLink(path.Join(dir, hashtag), false, link, re)
 		if err != nil {
@@ -98,12 +98,13 @@ func addLinkWithDate(name, link string, re *regexp.Regexp) error {
 	if loc[0] >= 14 {
 		re := regexp.MustCompile(`(?m)^## (\d\d\d\d-\d\d-\d\d)\n`)
 		m := re.Find(p.Body[loc[0]-14 : loc[0]])
-		if m == nil {
+		switch {
+		case m == nil:
 			// not a date: insert date, don't move insertion point
-		} else if string(p.Body[loc[0]-11:loc[0]-1]) == date {
+		case string(p.Body[loc[0]-11:loc[0]-1]) == date:
 			// if the date is our date, don't add it, don't move insertion point
 			addDate = false
-		} else {
+		default:
 			// if the date is not out date, move the insertion point
 			loc[0] -= 14
 		}
@@ -148,10 +149,9 @@ func addLink(name string, mandatory bool, link string, re *regexp.Regexp) error
 		if mandatory {
 			p = &Page{Name: name, Body: []byte(link)}
 			return p.save()
-		} else {
-			// Skip non-existing files: no error
-			return nil
 		}
+		// Skip non-existing files: no error
+		return nil
 	}
 	org := string(p.Body)
 	addLinkToPage(p, link, re)
@@ -1,11 +1,12 @@
 package main
 
 import (
-	"github.com/stretchr/testify/assert"
 	"os"
 	"regexp"
 	"testing"
 	"time"
+
+	"github.com/stretchr/testify/assert"
 )
 
 // Note TestEditSaveChanges and TestAddAppendChanges.
@@ -17,7 +18,7 @@ func TestAddLinkToPageWithNoList(t *testing.T) {
 	re := regexp.MustCompile(`(?m)^\* \[[^\]]+\]\(2025-08-08\)\n`)
 	link := "* [2025-08-08](2025-08-08)\n"
 	addLinkToPage(p, link, re)
-	assert.Equal(t, title + "\n\n" + link, string(p.Body))
+	assert.Equal(t, title+"\n\n"+link, string(p.Body))
 }
 
 func TestAddLinkToPageWithOlderLink(t *testing.T) {
@@ -28,7 +29,7 @@ func TestAddLinkToPageWithOlderLink(t *testing.T) {
 	re := regexp.MustCompile(`(?m)^\* \[[^\]]+\]\(2025-08-10\)\n`)
 	link := "* [2025-08-10](2025-08-10)\n"
 	addLinkToPage(p, link, re)
-	assert.Equal(t, title + "\n" + link + old, string(p.Body))
+	assert.Equal(t, title+"\n"+link+old, string(p.Body))
 }
 
 func TestAddLinkToPageBetweenToExistingLinks(t *testing.T) {
@@ -39,7 +40,7 @@ func TestAddLinkToPageBetweenToExistingLinks(t *testing.T) {
 	re := regexp.MustCompile(`(?m)^\* \[[^\]]+\]\(2025-08-09\)\n`)
 	link := "* [2025-08-09](2025-08-09)\n"
 	addLinkToPage(p, link, re)
-	assert.Equal(t, title + new + link + old, string(p.Body))
+	assert.Equal(t, title+new+link+old, string(p.Body))
 }
 
 func TestAddLinkToPageBetweenToExistingLinks2(t *testing.T) {
@@ -50,7 +51,7 @@ func TestAddLinkToPageBetweenToExistingLinks2(t *testing.T) {
 	re := regexp.MustCompile(`(?m)^\* \[[^\]]+\]\(2025-08-08\)\n`)
 	link := "* [2025-08-08](2025-08-08)\n"
 	addLinkToPage(p, link, re)
-	assert.Equal(t, title + new + link + old, string(p.Body))
+	assert.Equal(t, title+new+link+old, string(p.Body))
 }
 
 func TestAddLinkToPageAtTheEnd(t *testing.T) {
@@ -61,7 +62,7 @@ func TestAddLinkToPageAtTheEnd(t *testing.T) {
 	re := regexp.MustCompile(`(?m)^\* \[[^\]]+\]\(2025-08-07\)\n`)
 	link := "* [2025-08-07](2025-08-07)\n"
 	addLinkToPage(p, link, re)
-	assert.Equal(t, title + new + old + link, string(p.Body))
+	assert.Equal(t, title+new+old+link, string(p.Body))
 }
 
 func TestChanges(t *testing.T) {
@@ -119,9 +120,9 @@ func TestChangesWithList(t *testing.T) {
 	p.notify()
 	s, err := os.ReadFile("testdata/changes/changes.md")
 	assert.NoError(t, err)
-	new_line := "* [testdata/changes/alex](alex)\n"
+	newLine := "* [testdata/changes/alex](alex)\n"
 	// new line was added at the beginning of the list
-	assert.Equal(t, intro+d+new_line+line, string(s))
+	assert.Equal(t, intro+d+newLine+line, string(s))
 }
 
 func TestChangesWithOldList(t *testing.T) {
@@ -136,9 +137,9 @@ func TestChangesWithOldList(t *testing.T) {
 	p.notify()
 	s, err := os.ReadFile("testdata/changes/changes.md")
 	assert.NoError(t, err)
-	new_line := "* [testdata/changes/alex](alex)\n"
+	newLine := "* [testdata/changes/alex](alex)\n"
 	// new line was added at the beginning of the list
-	assert.Equal(t, intro+d+new_line+"\n"+y+line, string(s))
+	assert.Equal(t, intro+d+newLine+"\n"+y+line, string(s))
 }
 
 func TestChangesWithOldDisappearingListAtTheEnd(t *testing.T) {
@@ -153,9 +154,9 @@ func TestChangesWithOldDisappearingListAtTheEnd(t *testing.T) {
 	p.notify()
 	s, err := os.ReadFile("testdata/changes/changes.md")
 	assert.NoError(t, err)
-	new_line := "* [testdata/changes/alex](alex)\n"
+	newLine := "* [testdata/changes/alex](alex)\n"
 	// new line was added at the beginning of the list, with the new date, and the old date disappeared
-	assert.Equal(t, intro+d+new_line, string(s))
+	assert.Equal(t, intro+d+newLine, string(s))
 }
 
 func TestChangesWithOldDisappearingListInTheMiddle(t *testing.T) {
@@ -172,9 +173,9 @@ func TestChangesWithOldDisappearingListInTheMiddle(t *testing.T) {
 	p.notify()
 	s, err := os.ReadFile("testdata/changes/changes.md")
 	assert.NoError(t, err)
-	new_line := "* [testdata/changes/alex](alex)\n"
+	newLine := "* [testdata/changes/alex](alex)\n"
 	// new line was added at the beginning of the list, with the new date, and the old date disappeared
-	assert.Equal(t, intro+d+new_line+"\n"+yy+other, string(s))
+	assert.Equal(t, intro+d+newLine+"\n"+yy+other, string(s))
 }
 
 func TestChangesWithListAtTheTop(t *testing.T) {
@@ -187,9 +188,9 @@ func TestChangesWithListAtTheTop(t *testing.T) {
 	p.notify()
 	s, err := os.ReadFile("testdata/changes/changes.md")
 	assert.NoError(t, err)
-	new_line := "* [testdata/changes/alex](alex)\n"
+	newLine := "* [testdata/changes/alex](alex)\n"
 	// new line was added at the top, no error due to missing introduction
-	assert.Equal(t, d+new_line+line, string(s))
+	assert.Equal(t, d+newLine+line, string(s))
 }
 
 func TestChangesWithNoList(t *testing.T) {
@@ -202,9 +203,9 @@ func TestChangesWithNoList(t *testing.T) {
 	p.notify()
 	s, err := os.ReadFile("testdata/changes/changes.md")
 	assert.NoError(t, err)
-	new_line := "* [testdata/changes/alex](alex)\n"
+	newLine := "* [testdata/changes/alex](alex)\n"
 	// into is still there and a new list was started
-	assert.Equal(t, intro+"\n\n"+d+new_line, string(s))
+	assert.Equal(t, intro+"\n\n"+d+newLine, string(s))
 }
 
 func TestChangesWithUpdate(t *testing.T) {
@@ -219,9 +220,9 @@ func TestChangesWithUpdate(t *testing.T) {
 	p.notify()
 	s, err := os.ReadFile("testdata/changes/changes.md")
 	assert.NoError(t, err)
-	new_line := "* [testdata/changes/alex](alex)\n"
+	newLine := "* [testdata/changes/alex](alex)\n"
 	// the change was already listed, but now it moved up and has a new title
-	assert.Equal(t, intro+d+new_line+other, string(s))
+	assert.Equal(t, intro+d+newLine+other, string(s))
 }
 
 func TestChangesWithNoChangeToTheOrder(t *testing.T) {
@@ -236,9 +237,9 @@ func TestChangesWithNoChangeToTheOrder(t *testing.T) {
 	p.notify()
 	s, err := os.ReadFile("testdata/changes/changes.md")
 	assert.NoError(t, err)
-	new_line := "* [testdata/changes/alex](alex)\n"
+	newLine := "* [testdata/changes/alex](alex)\n"
 	// the change was already listed at the top, so just use the new title
-	assert.Equal(t, intro+d+new_line+other, string(s))
+	assert.Equal(t, intro+d+newLine+other, string(s))
 	// since the file has changed, a backup was necessary
 	assert.FileExists(t, "testdata/changes/changes.md~")
 }
7 diff.go

@@ -2,13 +2,14 @@ package main
 
 import (
 	"bytes"
-	"github.com/sergi/go-diff/diffmatchpatch"
 	"html"
 	"html/template"
 	"net/http"
 	"os"
 	"path/filepath"
 	"strings"
+
+	"github.com/sergi/go-diff/diffmatchpatch"
 )
 
 func diffHandler(w http.ResponseWriter, r *http.Request, name string) {
@@ -18,11 +19,11 @@ func diffHandler(w http.ResponseWriter, r *http.Request, name string) {
 		return
 	}
 	p.handleTitle(true)
-	p.renderHtml()
+	p.renderHTML()
 	renderTemplate(w, p.Dir(), "diff", p)
 }
 
-// Diff computes the diff for a page. At this point, renderHtml has already been called so the Name is escaped.
+// Diff computes the diff for a page. At this point, renderHTML has already been called so the Name is escaped.
 func (p *Page) Diff() template.HTML {
 	fp := filepath.FromSlash(p.Name)
 	a := fp + ".md~"
@@ -1,11 +1,12 @@
 package main
 
 import (
-	"github.com/stretchr/testify/assert"
 	"net/http"
 	"os"
 	"testing"
 	"time"
+
+	"github.com/stretchr/testify/assert"
 )
 
 func TestDiff(t *testing.T) {
@@ -41,5 +41,5 @@ func saveHandler(w http.ResponseWriter, r *http.Request, name string) {
 			return
 		}
 	}
-	http.Redirect(w, r, "/view/" + nameEscape(name), http.StatusFound)
+	http.Redirect(w, r, "/view/"+nameEscape(name), http.StatusFound)
 }
@@ -1,12 +1,13 @@
 package main
 
 import (
-	"github.com/stretchr/testify/assert"
 	"net/http"
 	"net/url"
 	"os"
 	"testing"
 	"time"
+
+	"github.com/stretchr/testify/assert"
 )
 
 func TestEditSave(t *testing.T) {
@@ -4,13 +4,14 @@ import (
 	"context"
 	"flag"
 	"fmt"
-	"github.com/google/subcommands"
 	htmlTemplate "html/template"
 	"io"
 	"os"
 	"strings"
 	textTemplate "text/template"
 	"time"
+
+	"github.com/google/subcommands"
 )
 
 type exportCmd struct {
@@ -62,7 +63,7 @@ func exportCli(w io.Writer, templateName string, idx *indexStore) subcommands.Ex
 		return subcommands.ExitFailure
 	}
 	p.handleTitle(false)
-	p.renderHtml()
+	p.renderHTML()
 	fi, err := os.Stat(name + ".md")
 	if err != nil {
 		fmt.Fprintf(os.Stderr, "Stat %s: %s\n", name, err)
@@ -72,7 +73,7 @@ func exportCli(w io.Writer, templateName string, idx *indexStore) subcommands.Ex
 	it.Title = p.Title
 	it.Name = p.Name
 	it.Body = p.Body
-	it.Html = htmlTemplate.HTML(htmlTemplate.HTMLEscaper(p.Html))
+	it.HTML = htmlTemplate.HTML(htmlTemplate.HTMLEscaper(p.HTML))
 	it.Hashtags = p.Hashtags
 	items = append(items, it)
 }
@@ -2,11 +2,12 @@ package main
 
 import (
 	"bytes"
-	"github.com/google/subcommands"
-	"github.com/stretchr/testify/assert"
 	"os"
 	"regexp"
 	"testing"
+
+	"github.com/google/subcommands"
+	"github.com/stretchr/testify/assert"
 )
 
 func TestExportCmd(t *testing.T) {
@@ -40,7 +41,7 @@ func TestExportCmdJsonFeed(t *testing.T) {
 	"title": "{{.Title}}",
 	"language": "{{.Language}}"
 	"date_modified": "{{.Date}}",
-	"content_html": "{{.Html}}",
+	"content_html": "{{.HTML}}",
 	"tags": [{{range .Hashtags}}"{{.}}",{{end}}""],
 },{{end}}
 {}
115 feed.go

@@ -2,13 +2,24 @@ package main
 
 import (
 	"bytes"
-	"github.com/gomarkdown/markdown"
-	"github.com/gomarkdown/markdown/ast"
 	"html/template"
 	"os"
 	"path"
 	"path/filepath"
+	"strconv"
 	"time"
+
+	"github.com/gomarkdown/markdown"
+	"github.com/gomarkdown/markdown/ast"
 )
 
+type dateSource int
+
+const (
+	// ModTime means that the feed item date is based on the page file's last modification date.
+	ModTime dateSource = iota
+	// URL means that the feed item date is based on the ISO date contained in the page name.
+	URL
+)
 
 // Item is a Page plus a Date.
@@ -34,19 +45,47 @@ type Feed struct {
 	// Items are based on the pages linked in list items starting with an asterisk ("*"). Links in
 	// list items starting with a minus ("-") are ignored!
 	Items []Item
+
+	// From is the item number where the feed starts. It defaults to 0. Prev and Next are the item numbers of
+	// the previous and the next page of the feed. N is the number of items per page. Next goes further into the
+	// past.
+	Prev, Next, From, N int
+
+	// When paging through the index or year pages, link to the next or previous years. NextYear goes further into
+	// the past (is smaller).
+	PrevYear, NextYear int
+
+	// Complete is set when there is no pagination.
+	Complete bool
 }
 
 // feed returns a RSS 2.0 feed for any page. The feed items it contains are the pages linked from in list items starting
-// with an asterisk ("*").
-func feed(p *Page, ti time.Time) *Feed {
+// with an asterisk ("*"). The feed starts from a certain item and contains n items. If n is 0, the feed is complete
+// (unpaginated).
+func feed(p *Page, ti time.Time, from, n int, source dateSource) *Feed {
 	feed := new(Feed)
 	feed.Name = p.Name
 	feed.Title = p.Title
 	feed.Date = ti.Format(time.RFC1123Z)
+	feed.From = from
+	feed.N = n
+	switch {
+	case n == 0:
+		feed.Complete = true
+	case from > n:
+		feed.Prev = from - n
+	default:
+		year, err := p.BlogYear()
+		if err == nil && p.ArchiveExists(year+1) {
+			feed.PrevYear = year + 1
+		}
+	}
+	to := from + n
 	parser, _ := wikiParser()
 	doc := markdown.Parse(p.Body, parser)
 	items := make([]Item, 0)
 	inListItem := false
+	i := 0
 	ast.WalkFunc(doc, func(node ast.Node, entering bool) ast.WalkStatus {
 		// set the flag if we're in a list item
 		listItem, ok := node.(*ast.ListItem)
@@ -58,33 +97,81 @@ func feed(p *Page, ti time.Time) *Feed {
 		if !inListItem || !entering {
 			return ast.GoToNext
 		}
-		// if we're in a link and it's local
+		// if we're in a link and it's not local
 		link, ok := node.(*ast.Link)
 		if !ok || bytes.Contains(link.Destination, []byte("//")) {
 			return ast.GoToNext
 		}
-		name := path.Join(p.Dir(), string(link.Destination))
-		fi, err := os.Stat(filepath.FromSlash(name) + ".md")
-		if err != nil {
-			return ast.GoToNext
-		}
+		// if we're too early or too late
+		i++
+		if i <= from {
+			return ast.GoToNext
+		}
+		if n > 0 && i > to {
+			// set if it's likely that more items exist
+			feed.Next = to
+			return ast.Terminate
+		}
+		// i counts links, not actual existing pages
+		name := path.Join(p.Dir(), string(link.Destination))
 		p2, err := loadPage(name)
 		if err != nil {
 			return ast.GoToNext
 		}
 		p2.handleTitle(false)
-		p2.renderHtml()
-		it := Item{Date: fi.ModTime().Format(time.RFC1123Z)}
+		p2.renderHTML()
+		date, err := p2.Date(source)
+		if err != nil {
+			return ast.GoToNext
+		}
+		it := Item{Date: date.Format(time.RFC1123Z)}
 		it.Title = p2.Title
 		it.Name = p2.Name
-		it.Html = template.HTML(template.HTMLEscaper(p2.Html))
+		it.HTML = template.HTML(template.HTMLEscaper(p2.HTML))
 		it.Hashtags = p2.Hashtags
 		items = append(items, it)
-		if len(items) >= 10 {
-			return ast.Terminate
-		}
 		return ast.GoToNext
 	})
+	// If there are no more "next" links but there is a next page, add it.
+	if feed.Next == 0 {
+		year, err := p.BlogYear()
+		if err == nil && p.ArchiveExists(year-1) {
+			feed.NextYear = year - 1
+		}
+	}
 	feed.Items = items
 	return feed
 }
+
+// Date returns the page's last modification date if the date source is ModTime. If the date source is URL, then the
+// first 10 characters are parsed as an ISO date string and the time returned is for that date, 0:00, UTC.
+func (p *Page) Date(source dateSource) (time.Time, error) {
+	if source == URL && p.IsBlog() {
+		name := path.Base(p.Name)
+		return time.Parse(time.DateOnly, name[0:10])
+	}
+	return p.ModTime()
+}
+
+// BlogYear returns the current year if the page name is "index". If the page name is a number such as "2026" then
+// this is parsed as an integer and returned.
+func (p *Page) BlogYear() (int, error) {
+	name := path.Base(p.Name)
+	if name == "index" {
+		return time.Now().Year(), nil
+	}
+	ui, err := strconv.ParseUint(name, 10, 16)
+	if err == nil {
+		return int(ui), nil
+	}
+	return 0, err
+}
+
+// ArchiveExists returns true if a page exists in the same directory as the current one with a page name matching
+// the year given.
+func (p *Page) ArchiveExists(year int) bool {
+	name := path.Join(p.Dir(), strconv.Itoa(year))
	fp := filepath.FromSlash(name) + ".md"
	_, err := os.Stat(fp)
	return err == nil
}
13 feed.html
@@ -1,11 +1,18 @@
<rss xmlns:atom="http://www.w3.org/2005/Atom" version="2.0">
<rss xmlns:atom="http://www.w3.org/2005/Atom" version="2.0"
     xmlns:fh="http://purl.org/syndication/history/1.0">
<channel>
<docs>http://blogs.law.harvard.edu/tech/rss</docs>
<title>{{.Title}}</title>
<link>https://example.org/</link>
<managingEditor>you@example.org (Your Name)</managingEditor>
<webMaster>you@example.org (Your Name)</webMaster>
<atom:link href="https://example.org/view/{{.Path}}.rss" rel="self" type="application/rss+xml"/>
<atom:link href="https://example.org/view/{{.Path}}.rss" rel="self" type="application/rss+xml"/>{{if .From}}
<atom:link href="https://example.org/view/{{.Path}}.rss?from={{.Prev}}&n={{.N}}" rel="previous" type="application/rss+xml"/>{{end}}{{if .PrevYear}}
<atom:link href="https://example.org/view/{{.Dir}}{{.PrevYear}}.rss?n={{.N}}" rel="previous" type="application/rss+xml"/>{{end}}{{if .Next}}
<atom:link href="https://example.org/view/{{.Path}}.rss?from={{.Next}}&n={{.N}}" rel="next" type="application/rss+xml"/>{{end}}{{if .NextYear}}
<atom:link href="https://example.org/view/{{.Dir}}{{.NextYear}}.rss?n={{.N}}" rel="next" type="application/rss+xml"/>{{end}}{{if .Complete}}
<generator>Oddμ https://src.alexschroeder.ch/oddmu.git/</generator>
<fh:complete/>{{end}}
<description>This is the digital garden of Your Name.</description>
<image>
<url>https://example.org/view/logo.jpg</url>
@@ -17,7 +24,7 @@
<title>{{.Title}}</title>
<link>https://example.org/view/{{.Path}}</link>
<guid>https://example.org/view/{{.Path}}</guid>
<description>{{.Html}}</description>
<description>{{.HTML}}</description>
<pubDate>{{.Date}}</pubDate>
{{range .Hashtags}}
<category>{{.}}</category>
90 feed_cmd.go Normal file
@@ -0,0 +1,90 @@
package main

import (
	"context"
	"flag"
	"fmt"
	"io"
	"os"
	"strings"
	"time"

	"github.com/google/subcommands"
)

type feedCmd struct {
}

func (*feedCmd) Name() string     { return "feed" }
func (*feedCmd) Synopsis() string { return "render a page as feed" }
func (*feedCmd) Usage() string {
	return `feed <page name> ...:
Render one or more pages as a single feed.
Use a single - to read Markdown from stdin.
`
}

func (cmd *feedCmd) SetFlags(f *flag.FlagSet) {
}

func (cmd *feedCmd) Execute(_ context.Context, f *flag.FlagSet, _ ...interface{}) subcommands.ExitStatus {
	if len(f.Args()) == 0 {
		fmt.Fprint(os.Stderr, cmd.Usage())
		return subcommands.ExitFailure
	}
	return feedCli(os.Stdout, f.Args())
}

func feedCli(w io.Writer, args []string) subcommands.ExitStatus {
	if len(args) == 1 && args[0] == "-" {
		body, err := io.ReadAll(os.Stdin)
		if err != nil {
			fmt.Fprintf(os.Stderr, "Cannot read from stdin: %s\n", err)
			return subcommands.ExitFailure
		}
		p := &Page{Name: "stdin", Body: body}
		return p.printFeed(w, time.Now())
	}
	for _, name := range args {
		if !strings.HasSuffix(name, ".md") {
			fmt.Fprintf(os.Stderr, "%s does not end in '.md'\n", name)
			return subcommands.ExitFailure
		}
		name = name[0 : len(name)-3]
		p, err := loadPage(name)
		if err != nil {
			fmt.Fprintf(os.Stderr, "Cannot load %s: %s\n", name, err)
			return subcommands.ExitFailure
		}
		p.handleTitle(false)
		ti, _ := p.ModTime()
		status := p.printFeed(w, ti)
		if status != subcommands.ExitSuccess {
			return status
		}
	}
	return subcommands.ExitSuccess
}

// printFeed prints the complete feed for a page (unpaginated).
func (p *Page) printFeed(w io.Writer, ti time.Time) subcommands.ExitStatus {
	f := feed(p, ti, 0, 0, URL)
	if len(f.Items) == 0 {
		fmt.Fprintf(os.Stderr, "Empty feed for %s\n", p.Name)
		return subcommands.ExitFailure
	}
	_, err := w.Write([]byte(`<?xml version="1.0" encoding="UTF-8"?>`))
	if err != nil {
		fmt.Fprintf(os.Stderr, "Cannot write prefix: %s\n", err)
		return subcommands.ExitFailure
	}
	initTemplates()
	templates.RLock()
	defer templates.RUnlock()
	err = templates.template["feed.html"].Execute(w, f)
	if err != nil {
		fmt.Fprintf(os.Stderr, "Cannot execute template: %s\n", err)
		return subcommands.ExitFailure
	}
	return subcommands.ExitSuccess
}
26 feed_cmd_test.go Normal file
@@ -0,0 +1,26 @@
package main

import (
	"bytes"
	"testing"

	"github.com/google/subcommands"
	"github.com/stretchr/testify/assert"
)

func TestFeedCmd(t *testing.T) {
	cleanup(t, "testdata/complete")
	p := &Page{Name: "testdata/complete/2025-12-01", Body: []byte("# 2025-12-01\n")}
	p.save()
	p = &Page{Name: "testdata/complete/index", Body: []byte(`# Index
* [2025-12-01](2025-12-01)
`)}
	p.save()

	b := new(bytes.Buffer)
	s := feedCli(b, []string{"testdata/complete/index.md"})
	assert.Equal(t, subcommands.ExitSuccess, s)
	assert.Contains(t, b.String(), "<fh:complete/>")
	assert.Contains(t, b.String(), "<title>2025-12-01</title>")
	assert.Contains(t, b.String(), "<pubDate>Mon, 01 Dec 2025 00:00:00") // ignore timezone
}
151 feed_test.go
@@ -1,9 +1,12 @@
package main

import (
	"github.com/stretchr/testify/assert"
	"fmt"
	"net/http"
	"net/url"
	"testing"

	"github.com/stretchr/testify/assert"
)

func TestFeed(t *testing.T) {
@@ -19,7 +22,6 @@ func TestNoFeed(t *testing.T) {

func TestFeedItems(t *testing.T) {
	cleanup(t, "testdata/feed")
	index.load()

	p1 := &Page{Name: "testdata/feed/cactus", Body: []byte(`# Cactus
Green head and white hair
@@ -53,3 +55,148 @@ Writing poems about plants.
	assert.Contains(t, body, "<category>Succulent</category>")
	assert.Contains(t, body, "<category>Palmtree</category>")
}

func TestFeedPagination(t *testing.T) {
	cleanup(t, "testdata/pagination")

	p := &Page{Name: "testdata/pagination/one", Body: []byte("# One\n")}
	p.save()
	p = &Page{Name: "testdata/pagination/two", Body: []byte("# Two\n")}
	p.save()
	p = &Page{Name: "testdata/pagination/three", Body: []byte("# Three\n")}
	p.save()
	p = &Page{Name: "testdata/pagination/four", Body: []byte("# Four\n")}
	p.save()
	p = &Page{Name: "testdata/pagination/five", Body: []byte("# Five\n")}
	p.save()
	p = &Page{Name: "testdata/pagination/six", Body: []byte("# Six\n")}
	p.save()
	p = &Page{Name: "testdata/pagination/seven", Body: []byte("# Seven\n")}
	p.save()
	p = &Page{Name: "testdata/pagination/eight", Body: []byte("# Eight\n")}
	p.save()
	p = &Page{Name: "testdata/pagination/nine", Body: []byte("# Nine\n")}
	p.save()
	p = &Page{Name: "testdata/pagination/ten", Body: []byte("# Ten\n")}
	p.save()

	p = &Page{Name: "testdata/pagination/index", Body: []byte(`# Index
* [one](one)
* [two](two)
* [three](three)
* [four](four)
* [five](five)
* [six](six)
* [seven](seven)
* [eight](eight)
* [nine](nine)
* [ten](ten)
`)}
	p.save()

	body := assert.HTTPBody(makeHandler(viewHandler, false, http.MethodGet), "GET", "/view/testdata/pagination/index.rss", nil)
	assert.Contains(t, body, "<title>One</title>")
	assert.Contains(t, body, "<title>Ten</title>")
	assert.NotContains(t, body, `<atom:link href="https://example.org/view/testdata/pagination/index.rss?from=10&n=10" rel="next" type="application/rss+xml"/>`)

	p = &Page{Name: "testdata/pagination/eleven", Body: []byte("# Eleven\n")}
	p.save()
	p = &Page{Name: "testdata/pagination/index", Body: []byte(`# Index
* [one](one)
* [two](two)
* [three](three)
* [four](four)
* [five](five)
* [six](six)
* [seven](seven)
* [eight](eight)
* [nine](nine)
* [ten](ten)
* [eleven](eleven)
`)}
	p.save()

	body = assert.HTTPBody(makeHandler(viewHandler, false, http.MethodGet), "GET", "/view/testdata/pagination/index.rss", nil)
	assert.NotContains(t, body, "<title>Eleven</title>")
	assert.Contains(t, body, `<atom:link href="https://example.org/view/testdata/pagination/index.rss?from=10&n=10" rel="next" type="application/rss+xml"/>`)

	params := url.Values{}
	params.Set("n", "0")
	body = assert.HTTPBody(makeHandler(viewHandler, false, http.MethodGet), "GET", "/view/testdata/pagination/index.rss", params)
	assert.Contains(t, body, "<title>Eleven</title>")
	assert.Contains(t, body, `<fh:complete/>`)

	params = url.Values{}
	params.Set("n", "3")
	body = assert.HTTPBody(makeHandler(viewHandler, false, http.MethodGet), "GET", "/view/testdata/pagination/index.rss", params)
	assert.Contains(t, body, "<title>One</title>")
	assert.Contains(t, body, "<title>Three</title>")
	assert.NotContains(t, body, "<title>Four</title>")
	assert.Contains(t, body, `<atom:link href="https://example.org/view/testdata/pagination/index.rss?from=3&n=3" rel="next" type="application/rss+xml"/>`)

	params = url.Values{}
	params.Set("from", "3")
	params.Set("n", "3")
	body = assert.HTTPBody(makeHandler(viewHandler, false, http.MethodGet), "GET", "/view/testdata/pagination/index.rss", params)
	assert.NotContains(t, body, "<title>Three</title>")
	assert.Contains(t, body, "<title>Four</title>")
	assert.Contains(t, body, "<title>Six</title>")
	assert.NotContains(t, body, "<title>Seven</title>")
	assert.Contains(t, body, `<atom:link href="https://example.org/view/testdata/pagination/index.rss?from=0&n=3" rel="previous" type="application/rss+xml"/>`)
	assert.Contains(t, body, `<atom:link href="https://example.org/view/testdata/pagination/index.rss?from=6&n=3" rel="next" type="application/rss+xml"/>`)

	params = url.Values{}
	params.Set("from", "2")
	params.Set("n", "3")
	body = assert.HTTPBody(makeHandler(viewHandler, false, http.MethodGet), "GET", "/view/testdata/pagination/index.rss", params)
	assert.NotContains(t, body, "<title>Two</title>")
	assert.Contains(t, body, "<title>Three</title>")
	assert.Contains(t, body, "<title>Five</title>")
	assert.NotContains(t, body, "<title>Six</title>")
	assert.Contains(t, body, `<atom:link href="https://example.org/view/testdata/pagination/index.rss?from=0&n=3" rel="previous" type="application/rss+xml"/>`)
	assert.Contains(t, body, `<atom:link href="https://example.org/view/testdata/pagination/index.rss?from=5&n=3" rel="next" type="application/rss+xml"/>`)
}

func TestFeedYearArchives(t *testing.T) {
	cleanup(t, "testdata/archives")
	p := &Page{Name: "testdata/archives/index", Body: []byte(`# Archives
my bent fingers hurt
keyboard rattling in the dark
but no child in sight
`)}
	p.save()
	body := assert.HTTPBody(makeHandler(viewHandler, false, http.MethodGet), "GET",
		"/view/testdata/archives/index.rss", nil)
	year, err := p.BlogYear()
	assert.Greater(t, year, 0)
	assert.NoError(t, err)
	prevLink := fmt.Sprintf(`<atom:link href="https://example.org/view/testdata/archives/%d.rss?n=10" rel="previous" type="application/rss+xml"/>`, year+1)
	nextLink := fmt.Sprintf(`<atom:link href="https://example.org/view/testdata/archives/%d.rss?n=10" rel="next" type="application/rss+xml"/>`, year-1)
	assert.NotContains(t, body, prevLink)
	assert.NotContains(t, body, nextLink)

	p = &Page{Name: fmt.Sprintf("testdata/archives/%d", year-1), Body: []byte(`# Previously
I have seen it all
invasion and denial
and cold winter hearts
`)}
	p.save()

	body = assert.HTTPBody(makeHandler(viewHandler, false, http.MethodGet), "GET",
		"/view/testdata/archives/index.rss", nil)
	assert.NotContains(t, body, prevLink)
	assert.Contains(t, body, nextLink)

	p = &Page{Name: fmt.Sprintf("testdata/archives/%d", year+1), Body: []byte(`# Coming
A night of thunder
lightning, children, it's the war
of our New Year's Eve
`)}
	p.save()

	body = assert.HTTPBody(makeHandler(viewHandler, false, http.MethodGet), "GET",
		"/view/testdata/archives/index.rss", nil)
	assert.Contains(t, body, prevLink)
	assert.Contains(t, body, nextLink)
}
43 go.mod
@@ -1,41 +1,40 @@
module alexschroeder.ch/cgit/oddmu
module src.alexschroeder.ch/oddmu

go 1.22

toolchain go1.22.3
go 1.25

require (
	github.com/disintegration/imaging v1.6.2
	github.com/edwvee/exiffix v0.0.0-20210922235313-0f6cbda5e58f
	github.com/fsnotify/fsnotify v1.7.0
	github.com/gabriel-vasile/mimetype v1.4.3
	github.com/gen2brain/heic v0.3.1
	github.com/gen2brain/webp v0.5.2
	github.com/gomarkdown/markdown v0.0.0-20250207164621-7a1f277a159e
	github.com/edwvee/exiffix v0.0.0-20240229113213-0dbb146775be
	github.com/fsnotify/fsnotify v1.9.0
	github.com/gabriel-vasile/mimetype v1.4.13
	github.com/gen2brain/heic v0.4.9
	github.com/gen2brain/webp v0.5.5
	github.com/gomarkdown/markdown v0.0.0-20250810172220-2e2c11897d1a
	github.com/google/subcommands v1.2.0
	github.com/hexops/gotextdiff v1.0.3
	github.com/microcosm-cc/bluemonday v1.0.26
	github.com/microcosm-cc/bluemonday v1.0.27
	github.com/muesli/reflow v0.3.0
	github.com/pemistahl/lingua-go v1.4.0
	github.com/sergi/go-diff v1.3.1
	github.com/sergi/go-diff v1.4.0
	github.com/stretchr/testify v1.8.4
	golang.org/x/exp v0.0.0-20240119083558-1b970713d09a
	golang.org/x/exp v0.0.0-20260112195511-716be5621a96
)

require (
	github.com/aymerick/douceur v0.2.0 // indirect
	github.com/clipperhouse/stringish v0.1.1 // indirect
	github.com/clipperhouse/uax29/v2 v2.5.0 // indirect
	github.com/davecgh/go-spew v1.1.1 // indirect
	github.com/ebitengine/purego v0.8.1 // indirect
	github.com/ebitengine/purego v0.9.1 // indirect
	github.com/gorilla/css v1.0.1 // indirect
	github.com/mattn/go-runewidth v0.0.15 // indirect
	github.com/mattn/go-runewidth v0.0.19 // indirect
	github.com/pmezard/go-difflib v1.0.0 // indirect
	github.com/rivo/uniseg v0.4.6 // indirect
	github.com/rwcarlsen/goexif v0.0.0-20190401172101-9e8deecbddbd // indirect
	github.com/shopspring/decimal v1.3.1 // indirect
	github.com/tetratelabs/wazero v1.8.1 // indirect
	golang.org/x/image v0.15.0 // indirect
	golang.org/x/net v0.20.0 // indirect
	golang.org/x/sys v0.21.0 // indirect
	google.golang.org/protobuf v1.32.0 // indirect
	github.com/shopspring/decimal v1.4.0 // indirect
	github.com/tetratelabs/wazero v1.11.0 // indirect
	golang.org/x/image v0.35.0 // indirect
	golang.org/x/net v0.49.0 // indirect
	golang.org/x/sys v0.40.0 // indirect
	google.golang.org/protobuf v1.36.11 // indirect
	gopkg.in/yaml.v3 v3.0.1 // indirect
)
78 go.sum
@@ -1,26 +1,30 @@
github.com/aymerick/douceur v0.2.0 h1:Mv+mAeH1Q+n9Fr+oyamOlAkUNPWPlA8PPGR0QAaYuPk=
github.com/aymerick/douceur v0.2.0/go.mod h1:wlT5vV2O3h55X9m7iVYN0TBM0NH/MmbLnd30/FjWUq4=
github.com/clipperhouse/stringish v0.1.1 h1:+NSqMOr3GR6k1FdRhhnXrLfztGzuG+VuFDfatpWHKCs=
github.com/clipperhouse/stringish v0.1.1/go.mod h1:v/WhFtE1q0ovMta2+m+UbpZ+2/HEXNWYXQgCt4hdOzA=
github.com/clipperhouse/uax29/v2 v2.5.0 h1:x7T0T4eTHDONxFJsL94uKNKPHrclyFI0lm7+w94cO8U=
github.com/clipperhouse/uax29/v2 v2.5.0/go.mod h1:Wn1g7MK6OoeDT0vL+Q0SQLDz/KpfsVRgg6W7ihQeh4g=
github.com/davecgh/go-spew v1.1.0/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSsI+c5H38=
github.com/davecgh/go-spew v1.1.1 h1:vj9j/u1bqnvCEfJOwUhtlOARqs3+rkHYY13jYWTU97c=
github.com/davecgh/go-spew v1.1.1/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSsI+c5H38=
github.com/disintegration/imaging v1.6.2 h1:w1LecBlG2Lnp8B3jk5zSuNqd7b4DXhcjwek1ei82L+c=
github.com/disintegration/imaging v1.6.2/go.mod h1:44/5580QXChDfwIclfc/PCwrr44amcmDAg8hxG0Ewe4=
github.com/ebitengine/purego v0.8.1 h1:sdRKd6plj7KYW33EH5As6YKfe8m9zbN9JMrOjNVF/BE=
github.com/ebitengine/purego v0.8.1/go.mod h1:iIjxzd6CiRiOG0UyXP+V1+jWqUXVjPKLAI0mRfJZTmQ=
github.com/edwvee/exiffix v0.0.0-20210922235313-0f6cbda5e58f h1:RMnUwTnNR070mFAEIoqMYjNirHj8i0h79VXTYyBCyVA=
github.com/edwvee/exiffix v0.0.0-20210922235313-0f6cbda5e58f/go.mod h1:KoE3Ti1qbQXCb3s/XGj0yApHnbnNnn1bXTtB5Auq/Vc=
github.com/fsnotify/fsnotify v1.7.0 h1:8JEhPFa5W2WU7YfeZzPNqzMP6Lwt7L2715Ggo0nosvA=
github.com/fsnotify/fsnotify v1.7.0/go.mod h1:40Bi/Hjc2AVfZrqy+aj+yEI+/bRxZnMJyTJwOpGvigM=
github.com/gabriel-vasile/mimetype v1.4.3 h1:in2uUcidCuFcDKtdcBxlR0rJ1+fsokWf+uqxgUFjbI0=
github.com/gabriel-vasile/mimetype v1.4.3/go.mod h1:d8uq/6HKRL6CGdk+aubisF/M5GcPfT7nKyLpA0lbSSk=
github.com/gen2brain/heic v0.3.1 h1:ClY5YTdXdIanw7pe9ZVUM9XcsqH6CCCa5CZBlm58qOs=
github.com/gen2brain/heic v0.3.1/go.mod h1:m2sVIf02O7wfO8mJm+PvE91lnq4QYJy2hseUon7So10=
github.com/gen2brain/webp v0.5.2 h1:aYdjbU/2L98m+bqUdkYMOIY93YC+EN3HuZLMaqgMD9U=
github.com/gen2brain/webp v0.5.2/go.mod h1:Nb3xO5sy6MeUAHhru9H3GT7nlOQO5dKRNNlE92CZrJw=
github.com/gomarkdown/markdown v0.0.0-20250207164621-7a1f277a159e h1:ESHlT0RVZphh4JGBz49I5R6nTdC8Qyc08vU25GQHzzQ=
github.com/gomarkdown/markdown v0.0.0-20250207164621-7a1f277a159e/go.mod h1:JDGcbDT52eL4fju3sZ4TeHGsQwhG9nbDV21aMyhwPoA=
github.com/google/go-cmp v0.5.8 h1:e6P7q2lk1O+qJJb4BtCQXlK8vWEO8V1ZeuEdJNOqZyg=
github.com/google/go-cmp v0.5.8/go.mod h1:17dUlkBOakJ0+DkrSSNjCkIjxS6bF9zb3elmeNGIjoY=
github.com/ebitengine/purego v0.9.1 h1:a/k2f2HQU3Pi399RPW1MOaZyhKJL9w/xFpKAg4q1s0A=
github.com/ebitengine/purego v0.9.1/go.mod h1:iIjxzd6CiRiOG0UyXP+V1+jWqUXVjPKLAI0mRfJZTmQ=
github.com/edwvee/exiffix v0.0.0-20240229113213-0dbb146775be h1:FNPYI8/ifKGW7kdBdlogyGGaPXZmOXBbV1uz4Amr3s0=
github.com/edwvee/exiffix v0.0.0-20240229113213-0dbb146775be/go.mod h1:G3dK5MziX9e4jUa8PWjowCOPCcyQwxsZ5a0oYA73280=
github.com/fsnotify/fsnotify v1.9.0 h1:2Ml+OJNzbYCTzsxtv8vKSFD9PbJjmhYF14k/jKC7S9k=
github.com/fsnotify/fsnotify v1.9.0/go.mod h1:8jBTzvmWwFyi3Pb8djgCCO5IBqzKJ/Jwo8TRcHyHii0=
github.com/gabriel-vasile/mimetype v1.4.13 h1:46nXokslUBsAJE/wMsp5gtO500a4F3Nkz9Ufpk2AcUM=
github.com/gabriel-vasile/mimetype v1.4.13/go.mod h1:d+9Oxyo1wTzWdyVUPMmXFvp4F9tea18J8ufA774AB3s=
github.com/gen2brain/heic v0.4.9 h1:sHM7kjMV2+AlrSfYNsJiD0NrqAJqIMWAOqMSZ0HXrU8=
github.com/gen2brain/heic v0.4.9/go.mod h1:0/0SrVQnUhOA3ekFY5/lApZYniF/DsgS3g9COWe83dM=
github.com/gen2brain/webp v0.5.5 h1:MvQR75yIPU/9nSqYT5h13k4URaJK3gf9tgz/ksRbyEg=
github.com/gen2brain/webp v0.5.5/go.mod h1:xOSMzp4aROt2KFW++9qcK/RBTOVC2S9tJG66ip/9Oc0=
github.com/gomarkdown/markdown v0.0.0-20250810172220-2e2c11897d1a h1:l7A0loSszR5zHd/qK53ZIHMO8b3bBSmENnQ6eKnUT0A=
github.com/gomarkdown/markdown v0.0.0-20250810172220-2e2c11897d1a/go.mod h1:JDGcbDT52eL4fju3sZ4TeHGsQwhG9nbDV21aMyhwPoA=
github.com/google/go-cmp v0.7.0 h1:wk8382ETsv4JYUZwIsn6YpYiWiBsYLSJiTsyBybVuN8=
github.com/google/go-cmp v0.7.0/go.mod h1:pXiqmnSA92OHEEa9HXL2W4E7lf9JzCmGVUdgjX3N/iU=
github.com/google/subcommands v1.2.0 h1:vWQspBTo2nEqTUFita5/KeEWlUL8kQObDFbub/EN9oE=
github.com/google/subcommands v1.2.0/go.mod h1:ZjhPrFU+Olkh9WazFPsl27BQ4UPiG37m3yTrtFlrHVk=
github.com/gorilla/css v1.0.1 h1:ntNaBIghp6JmvWnxbZKANoLyuXTPZ4cAMlo6RyhlbO8=
@@ -33,10 +37,10 @@ github.com/kr/pty v1.1.1/go.mod h1:pFQYn66WHrOpPYNljwOMqo10TkYh1fy3cYio2l3bCsQ=
github.com/kr/text v0.1.0 h1:45sCR5RtlFHMR4UwH9sdQ5TC8v0qDQCHnXt+kaKSTVE=
github.com/kr/text v0.1.0/go.mod h1:4Jbv+DJW3UT/LiOwJeYQe1efqtUx/iVham/4vfdArNI=
github.com/mattn/go-runewidth v0.0.12/go.mod h1:RAqKPSqVFrSLVXbA8x7dzmKdmGzieGRCM46jaSJTDAk=
github.com/mattn/go-runewidth v0.0.15 h1:UNAjwbU9l54TA3KzvqLGxwWjHmMgBUVhBiTjelZgg3U=
github.com/mattn/go-runewidth v0.0.15/go.mod h1:Jdepj2loyihRzMpdS35Xk/zdY8IAYHsh153qUoGf23w=
github.com/microcosm-cc/bluemonday v1.0.26 h1:xbqSvqzQMeEHCqMi64VAs4d8uy6Mequs3rQ0k/Khz58=
github.com/microcosm-cc/bluemonday v1.0.26/go.mod h1:JyzOCs9gkyQyjs+6h10UEVSe02CGwkhd72Xdqh78TWs=
github.com/mattn/go-runewidth v0.0.19 h1:v++JhqYnZuu5jSKrk9RbgF5v4CGUjqRfBm05byFGLdw=
github.com/mattn/go-runewidth v0.0.19/go.mod h1:XBkDxAl56ILZc9knddidhrOlY5R/pDhgLpndooCuJAs=
github.com/microcosm-cc/bluemonday v1.0.27 h1:MpEUotklkwCSLeH+Qdx1VJgNqLlpY2KXwXFM08ygZfk=
github.com/microcosm-cc/bluemonday v1.0.27/go.mod h1:jFi9vgW+H7c3V0lb6nR74Ib/DIB5OBs92Dimizgw2cA=
github.com/muesli/reflow v0.3.0 h1:IFsN6K9NfGtjeggFP+68I4chLZV2yIKsXJFNZ+eWh6s=
github.com/muesli/reflow v0.3.0/go.mod h1:pbwTDkVPibjO2kyvBQRBxTWEEGDGq0FlB1BIKtnHY/8=
github.com/pemistahl/lingua-go v1.4.0 h1:ifYhthrlW7iO4icdubwlduYnmwU37V1sbNrwhKBR4rM=
@@ -45,32 +49,30 @@ github.com/pmezard/go-difflib v1.0.0 h1:4DBwDE0NGyQoBHbLQYPwSUPoCMWR5BEzIk/f1lZb
github.com/pmezard/go-difflib v1.0.0/go.mod h1:iKH77koFhYxTK1pcRnkKkqfTogsbg7gZNVY4sRDYZ/4=
github.com/rivo/uniseg v0.1.0/go.mod h1:J6wj4VEh+S6ZtnVlnTBMWIodfgj8LQOQFoIToxlJtxc=
github.com/rivo/uniseg v0.2.0/go.mod h1:J6wj4VEh+S6ZtnVlnTBMWIodfgj8LQOQFoIToxlJtxc=
github.com/rivo/uniseg v0.4.6 h1:Sovz9sDSwbOz9tgUy8JpT+KgCkPYJEN/oYzlJiYTNLg=
github.com/rivo/uniseg v0.4.6/go.mod h1:FN3SvrM+Zdj16jyLfmOkMNblXMcoc8DfTHruCPUcx88=
github.com/rwcarlsen/goexif v0.0.0-20190401172101-9e8deecbddbd h1:CmH9+J6ZSsIjUK3dcGsnCnO41eRBOnY12zwkn5qVwgc=
github.com/rwcarlsen/goexif v0.0.0-20190401172101-9e8deecbddbd/go.mod h1:hPqNNc0+uJM6H+SuU8sEs5K5IQeKccPqeSjfgcKGgPk=
github.com/sergi/go-diff v1.3.1 h1:xkr+Oxo4BOQKmkn/B9eMK0g5Kg/983T9DqqPHwYqD+8=
github.com/sergi/go-diff v1.3.1/go.mod h1:aMJSSKb2lpPvRNec0+w3fl7LP9IOFzdc9Pa4NFbPK1I=
github.com/shopspring/decimal v1.3.1 h1:2Usl1nmF/WZucqkFZhnfFYxxxu8LG21F6nPQBE5gKV8=
github.com/shopspring/decimal v1.3.1/go.mod h1:DKyhrW/HYNuLGql+MJL6WCR6knT2jwCFRcu2hWCYk4o=
github.com/sergi/go-diff v1.4.0 h1:n/SP9D5ad1fORl+llWyN+D6qoUETXNZARKjyY2/KVCw=
github.com/sergi/go-diff v1.4.0/go.mod h1:A0bzQcvG0E7Rwjx0REVgAGH58e96+X0MeOfepqsbeW4=
github.com/shopspring/decimal v1.4.0 h1:bxl37RwXBklmTi0C79JfXCEBD1cqqHt0bbgBAGFp81k=
github.com/shopspring/decimal v1.4.0/go.mod h1:gawqmDU56v4yIKSwfBSFip1HdCCXN8/+DMd9qYNcwME=
github.com/stretchr/objx v0.1.0/go.mod h1:HFkY916IF+rwdDfMAkV7OtwuqBVzrE8GR6GFx+wExME=
github.com/stretchr/testify v1.4.0/go.mod h1:j7eGeouHqKxXV5pUuKE4zz7dFj8WfuZ+81PSLYec5m4=
github.com/stretchr/testify v1.8.4 h1:CcVxjf3Q8PM0mHUKJCdn+eZZtm5yQwehR5yeSVQQcUk=
github.com/stretchr/testify v1.8.4/go.mod h1:sz/lmYIOXD/1dqDmKjjqLyZ2RngseejIcXlSw2iwfAo=
github.com/tetratelabs/wazero v1.8.1 h1:NrcgVbWfkWvVc4UtT4LRLDf91PsOzDzefMdwhLfA550=
github.com/tetratelabs/wazero v1.8.1/go.mod h1:yAI0XTsMBhREkM/YDAK/zNou3GoiAce1P6+rp/wQhjs=
golang.org/x/exp v0.0.0-20240119083558-1b970713d09a h1:Q8/wZp0KX97QFTc2ywcOE0YRjZPVIx+MXInMzdvQqcA=
golang.org/x/exp v0.0.0-20240119083558-1b970713d09a/go.mod h1:idGWGoKP1toJGkd5/ig9ZLuPcZBC3ewk7SzmH0uou08=
github.com/tetratelabs/wazero v1.11.0 h1:+gKemEuKCTevU4d7ZTzlsvgd1uaToIDtlQlmNbwqYhA=
github.com/tetratelabs/wazero v1.11.0/go.mod h1:eV28rsN8Q+xwjogd7f4/Pp4xFxO7uOGbLcD/LzB1wiU=
golang.org/x/exp v0.0.0-20260112195511-716be5621a96 h1:Z/6YuSHTLOHfNFdb8zVZomZr7cqNgTJvA8+Qz75D8gU=
golang.org/x/exp v0.0.0-20260112195511-716be5621a96/go.mod h1:nzimsREAkjBCIEFtHiYkrJyT+2uy9YZJB7H1k68CXZU=
golang.org/x/image v0.0.0-20191009234506-e7c1f5e7dbb8/go.mod h1:FeLwcggjj3mMvU+oOTbSwawSJRM1uh48EjtB4UJZlP0=
golang.org/x/image v0.15.0 h1:kOELfmgrmJlw4Cdb7g/QGuB3CvDrXbqEIww/pNtNBm8=
golang.org/x/image v0.15.0/go.mod h1:HUYqC05R2ZcZ3ejNQsIHQDQiwWM4JBqmm6MKANTp4LE=
golang.org/x/net v0.20.0 h1:aCL9BSgETF1k+blQaYUBx9hJ9LOGP3gAVemcZlf1Kpo=
golang.org/x/net v0.20.0/go.mod h1:z8BVo6PvndSri0LbOE3hAn0apkU+1YvI6E70E9jsnvY=
golang.org/x/sys v0.21.0 h1:rF+pYz3DAGSQAxAu1CbC7catZg4ebC4UIeIhKxBZvws=
golang.org/x/sys v0.21.0/go.mod h1:/VUhepiaJMQUp4+oa/7Zr1D23ma6VTLIYjOOTFZPUcA=
golang.org/x/image v0.35.0 h1:LKjiHdgMtO8z7Fh18nGY6KDcoEtVfsgLDPeLyguqb7I=
golang.org/x/image v0.35.0/go.mod h1:MwPLTVgvxSASsxdLzKrl8BRFuyqMyGhLwmC+TO1Sybk=
golang.org/x/net v0.49.0 h1:eeHFmOGUTtaaPSGNmjBKpbng9MulQsJURQUAfUwY++o=
golang.org/x/net v0.49.0/go.mod h1:/ysNB2EvaqvesRkuLAyjI1ycPZlQHM3q01F02UY/MV8=
golang.org/x/sys v0.40.0 h1:DBZZqJ2Rkml6QMQsZywtnjnnGvHza6BTfYFWY9kjEWQ=
golang.org/x/sys v0.40.0/go.mod h1:OgkHotnGiDImocRcuBABYBEXf8A9a87e/uXjp9XT3ks=
golang.org/x/text v0.3.0/go.mod h1:NqM8EUOU14njkJ3fqMW+pc6Ldnwhi/IjpwHt7yyuwOQ=
google.golang.org/protobuf v1.32.0 h1:pPC6BG5ex8PDFnkbrGU3EixyhKcQ2aDuBS36lqK/C7I=
google.golang.org/protobuf v1.32.0/go.mod h1:c6P6GXX6sHbq/GpV6MGZEdwhWPcYBgnhAHhKbcUYpos=
google.golang.org/protobuf v1.36.11 h1:fV6ZwhNocDyBLK0dj+fg8ektcVegBBuEolpbTQyBNVE=
google.golang.org/protobuf v1.36.11/go.mod h1:HTf+CrKn2C3g5S8VImy6tdcUvCska2kB7j23XfzDpco=
gopkg.in/check.v1 v0.0.0-20161208181325-20d25e280405/go.mod h1:Co6ibVJAznAaIkqp8huTwlJQCZ016jof/cbN4VW5Yz0=
gopkg.in/check.v1 v1.0.0-20190902080502-41f04d3bba15 h1:YR8cESwS4TdDjEe65xsg0ogRM/Nc3DYOhEAlW+xobZo=
gopkg.in/check.v1 v1.0.0-20190902080502-41f04d3bba15/go.mod h1:Co6ibVJAznAaIkqp8huTwlJQCZ016jof/cbN4VW5Yz0=
@@ -4,17 +4,18 @@ import (
	"context"
	"flag"
	"fmt"
	"github.com/google/subcommands"
	"github.com/gomarkdown/markdown"
	"github.com/gomarkdown/markdown/ast"
	"github.com/hexops/gotextdiff"
	"github.com/hexops/gotextdiff/myers"
	"github.com/hexops/gotextdiff/span"
	"io"
	"os"
	"regexp"
	"sort"
	"strings"

	"github.com/gomarkdown/markdown"
	"github.com/gomarkdown/markdown/ast"
	"github.com/google/subcommands"
	"github.com/hexops/gotextdiff"
	"github.com/hexops/gotextdiff/myers"
	"github.com/hexops/gotextdiff/span"
)

type hashtagsCmd struct {
@@ -86,7 +87,7 @@ func hashtagsUpdateCli(w io.Writer, dryRun bool) subcommands.ExitStatus {
			continue
		}
		title, ok := namesMap[hashtag]
		if (!ok) {
		if !ok {
			title = hashtagName(namesMap, hashtag, docids)
			namesMap[hashtag] = title
		}
@@ -133,7 +134,7 @@ func hashtagsUpdateCli(w io.Writer, dryRun bool) subcommands.ExitStatus {
			}
			fn := h.Name + ".md"
			edits := myers.ComputeEdits(span.URIFromPath(fn), original, string(h.Body))
			diff := fmt.Sprint(gotextdiff.ToUnified(fn + "~", fn, original, edits))
			diff := fmt.Sprint(gotextdiff.ToUnified(fn+"~", fn, original, edits))
			fmt.Fprint(w, diff)
		} else {
			err = h.save()
@@ -149,7 +150,7 @@ func hashtagsUpdateCli(w io.Writer, dryRun bool) subcommands.ExitStatus {

// Go through all the documents in the same directory and look for hashtag matches in the rendered HTML in order to
// determine the most likely capitalization.
func hashtagName (namesMap map[string]string, hashtag string, docids []docid) string {
func hashtagName(namesMap map[string]string, hashtag string, docids []docid) string {
	candidate := make(map[string]int)
	var mostPopular string
	for _, docid := range docids {
@@ -166,8 +167,7 @@ func hashtagName (namesMap map[string]string, hashtag string, docids []docid) st
		doc := markdown.Parse(p.Body, parser)
		ast.WalkFunc(doc, func(node ast.Node, entering bool) ast.WalkStatus {
			if entering {
				switch v := node.(type) {
				case *ast.Link:
				if v, ok := node.(*ast.Link); ok {
					for _, attr := range v.AdditionalAttributes {
						if attr == `class="tag"` {
							tagName := []byte("")
@@ -181,7 +181,7 @@ func hashtagName (namesMap map[string]string, hashtag string, docids []docid) st
							if strings.EqualFold(hashtag, strings.ReplaceAll(tag, " ", "_")) {
								_, ok := candidate[tag]
								if ok {
									candidate[tag] += 1
									candidate[tag]++
								} else {
									candidate[tag] = 1
								}
@@ -2,9 +2,10 @@ package main

import (
	"bytes"
	"testing"

	"github.com/google/subcommands"
	"github.com/stretchr/testify/assert"
	"testing"
)

func TestHashtagsCmd(t *testing.T) {
41 html_cmd.go
@@ -4,10 +4,12 @@ import (
	"context"
	"flag"
	"fmt"
	"github.com/google/subcommands"
	"html/template"
	"io"
	"os"
	"strings"

	"github.com/google/subcommands"
)

type htmlCmd struct {
@@ -40,20 +42,20 @@ func htmlCli(w io.Writer, template string, args []string) subcommands.ExitStatus
			return subcommands.ExitFailure
		}
		p := &Page{Name: "stdin", Body: body}
		return p.printHtml(w, template)
		return p.printHTML(w, template)
	}
	for _, name := range args {
		if !strings.HasSuffix(name, ".md") {
			fmt.Fprintf(os.Stderr, "%s does not end in '.md'\n", name)
			return subcommands.ExitFailure
		}
		name = name[0:len(name)-3]
		name = name[0 : len(name)-3]
		p, err := loadPage(name)
		if err != nil {
			fmt.Fprintf(os.Stderr, "Cannot load %s: %s\n", name, err)
			return subcommands.ExitFailure
		}
		status := p.printHtml(w, template)
		status := p.printHTML(w, template)
		if status != subcommands.ExitSuccess {
			return status
		}
@@ -61,21 +63,28 @@ func htmlCli(w io.Writer, template string, args []string) subcommands.ExitStatus
	return subcommands.ExitSuccess
}

func (p *Page) printHtml(w io.Writer, template string) subcommands.ExitStatus {
	if len(template) > 0 {
		t := template
		loadTemplates()
		p.handleTitle(true)
		p.renderHtml()
		err := templates.template[t].Execute(w, p)
func (p *Page) printHTML(w io.Writer, fn string) subcommands.ExitStatus {
	if fn == "" {
		// do not handle title
		p.renderHTML()
		_, err := fmt.Fprintln(w, p.HTML)
		if err != nil {
			fmt.Fprintf(os.Stderr, "Cannot execute %s template for %s: %s\n", t, p.Name, err)
			fmt.Fprintf(os.Stderr, "Cannot write to stdout: %s\n", err)
			return subcommands.ExitFailure
		}
	} else {
		// do not handle title
		p.renderHtml()
		fmt.Fprintln(w, p.Html)
		return subcommands.ExitSuccess
	}
	p.handleTitle(true)
	p.renderHTML()
	t, err := template.ParseFiles(fn)
	if err != nil {
		fmt.Fprintf(os.Stderr, "Cannot parse template %s for %s: %s\n", fn, p.Name, err)
		return subcommands.ExitFailure
	}
	err = t.Execute(w, p)
	if err != nil {
		fmt.Fprintf(os.Stderr, "Cannot execute template %s for %s: %s\n", fn, p.Name, err)
		return subcommands.ExitFailure
	}
	return subcommands.ExitSuccess
}
@@ -2,9 +2,10 @@ package main

import (
"bytes"
"testing"

"github.com/google/subcommands"
"github.com/stretchr/testify/assert"
"testing"
)

func TestHtmlCmd(t *testing.T) {

25 index.go
@@ -5,7 +5,6 @@
package main

import (
"golang.org/x/exp/constraints"
"html/template"
"io/fs"
"log"
@@ -13,6 +12,8 @@ import (
"sort"
"strings"
"sync"

"golang.org/x/exp/constraints"
)

type docid uint
@@ -23,15 +24,15 @@ type docid uint
// It depends on the fact that Title is always plain text.
type ImageData struct {
Title, Name string
Html template.HTML
HTML template.HTML
}

// indexStore controls access to the maps used for search. Make sure to lock and unlock as appropriate.
type indexStore struct {
sync.RWMutex

// next_id is the number of the next document added to the index
next_id docid
// nextID is the number of the next document added to the index
nextID docid

// index is an inverted index mapping tokens to document ids.
token map[string][]docid
@@ -54,7 +55,7 @@ func init() {

// reset the index. This assumes that the index is locked. It's useful for tests.
func (idx *indexStore) reset() {
idx.next_id = 0
idx.nextID = 0
idx.token = make(map[string][]docid)
idx.documents = make(map[docid]string)
idx.titles = make(map[string]string)
@@ -64,8 +65,8 @@ func (idx *indexStore) reset() {
// addDocument adds the text as a new document. This assumes that the index is locked!
// The hashtags (only!) are used as tokens. They are stored in lower case.
func (idx *indexStore) addDocument(text []byte) docid {
id := idx.next_id
idx.next_id++
id := idx.nextID
idx.nextID++
for _, token := range hashtags(text) {
token = strings.ToLower(token)
ids := idx.token[token]
@@ -147,9 +148,8 @@ func (idx *indexStore) walk(fp string, info fs.FileInfo, err error) error {
if fp != "." && strings.HasPrefix(filepath.Base(fp), ".") {
if info.IsDir() {
return filepath.SkipDir
} else {
return nil
}
return nil
}
// skipp all but page files
if !strings.HasSuffix(fp, ".md") {
@@ -238,11 +238,12 @@ func intersection[T constraints.Ordered](a []T, b []T) []T {
r := make([]T, 0, maxLen)
var i, j int
for i < len(a) && j < len(b) {
if a[i] < b[j] {
switch {
case a[i] < b[j]:
i++
} else if a[i] > b[j] {
case a[i] > b[j]:
j++
} else {
default:
r = append(r, a[i])
i++
j++

@@ -1,9 +1,10 @@
package main

import (
"github.com/stretchr/testify/assert"
"strings"
"testing"

"github.com/stretchr/testify/assert"
)

func TestIndexAdd(t *testing.T) {

@@ -2,9 +2,10 @@ package main

import (
"errors"
"github.com/pemistahl/lingua-go"
"os"
"strings"

"github.com/pemistahl/lingua-go"
)

// getLanguages returns the environment variable ODDMU_LANGUAGES or all languages.

@@ -1,9 +1,10 @@
package main

import (
"github.com/stretchr/testify/assert"
"os"
"testing"

"github.com/stretchr/testify/assert"
)

func TestAllLanguage(t *testing.T) {

@@ -4,10 +4,11 @@ import (
"context"
"flag"
"fmt"
"github.com/google/subcommands"
"io"
"os"
"strings"

"github.com/google/subcommands"
)

type linksCmd struct {
@@ -48,7 +49,7 @@ func linksCli(w io.Writer, args []string) subcommands.ExitStatus {
fmt.Fprintf(os.Stderr, "%s does not end in '.md'\n", name)
return subcommands.ExitFailure
}
name = name[0:len(name)-3]
name = name[0 : len(name)-3]
p, err := loadPage(name)
if err != nil {
fmt.Fprintf(w, "Loading %s: %s\n", name, err)

@@ -2,9 +2,10 @@ package main

import (
"bytes"
"testing"

"github.com/google/subcommands"
"github.com/stretchr/testify/assert"
"testing"
)

func TestLinksCmd(t *testing.T) {

@@ -4,11 +4,12 @@ import (
"context"
"flag"
"fmt"
"github.com/google/subcommands"
"io"
"os"
"path/filepath"
"strings"

"github.com/google/subcommands"
)

type listCmd struct {

@@ -2,9 +2,10 @@ package main

import (
"bytes"
"testing"

"github.com/google/subcommands"
"github.com/stretchr/testify/assert"
"testing"
)

func TestListCmd(t *testing.T) {

@@ -43,13 +43,13 @@ README.md: ../README.md
< $< > $@

upload: ${MD} README.md
rsync --itemize-changes --archive *.md sibirocobombus:alexschroeder.ch/wiki/oddmu/
rsync --itemize-changes --archive *.md ../README.md sibirocobombus:alexschroeder.ch/wiki/oddmu/
make clean

clean:
@echo Removing HTML and Markdown files
@rm --force ${HTML} ${MD} README.md
@rm -f ${HTML} ${MD} README.md

realclean: clean
@echo Removing man pages
@rm --force ${MAN}
@rm -f ${MAN}

@@ -5,7 +5,7 @@
.nh
.ad l
.\" Begin generated content:
.TH "ODDMU-APACHE" "5" "2025-07-16"
.TH "ODDMU-APACHE" "5" "2026-01-29"
.PP
.SH NAME
.PP
@@ -48,7 +48,7 @@ ServerAdmin alex@alexschroeder\&.ch
<VirtualHost *:443>
ServerName transjovian\&.org
SSLEngine on
ProxyPassMatch "^/((view|preview|diff|edit|save|add|append|upload|drop|search|archive)/(\&.*))?$"
ProxyPassMatch "^/((view|preview|diff|edit|save|add|append|upload|drop|search|archive)/(\&.*)|sitemap.xml)?$"
"http://localhost:8080/$1"
</VirtualHost>
.fi
@@ -126,13 +126,13 @@ ServerAdmin alex@alexschroeder\&.ch
ServerName transjovian\&.org
ProxyPassMatch "^/((view|diff|search|archive)/(\&.*))?$"
"http://localhost:8080/$1"
RedirectMatch "^/((edit|save|add|append|upload|drop)/(\&.*))?$"
RedirectMatch "^/((edit|save|add|append|upload|drop)/(\&.*)|sitemap.xml)?$"
"https://transjovian\&.org/$1"
</VirtualHost>
<VirtualHost *:443>
ServerName transjovian\&.org
SSLEngine on
ProxyPassMatch "^/((view|preview|diff|edit|save|add|append|upload|drop|search|archive)/(\&.*))?$"
ProxyPassMatch "^/((view|preview|diff|edit|save|add|append|upload|drop|search|archive)/(\&.*)|sitemap.xml)?$"
"http://localhost:8080/$1"
</VirtualHost>
.fi
@@ -170,7 +170,7 @@ In that case, you need to use the ProxyPassMatch directive.\&
.PP
.nf
.RS 4
ProxyPassMatch "^/((view|preview|diff|edit|save|add|append|upload|drop|search|archive)/(\&.*))?$"
ProxyPassMatch "^/((view|preview|diff|edit|save|add|append|upload|drop|search|archive)/(\&.*)|sitemap.xml)?$"
"unix:/run/oddmu/oddmu\&.sock|http://localhost/$1"
.fi
.RE
@@ -189,7 +189,7 @@ A workaround is to add the redirect manually and drop the question-mark:
.nf
.RS 4
RedirectMatch "^/$" "/view/index"
ProxyPassMatch "^/((view|preview|diff|edit|save|add|append|upload|drop|search|archive)/(\&.*))$"
ProxyPassMatch "^/((view|preview|diff|edit|save|add|append|upload|drop|search|archive)/(\&.*)|sitemap.xml)$"
"unix:/run/oddmu/oddmu\&.sock|http://localhost/$1"
.fi
.RE
@@ -248,12 +248,74 @@ to your "<VirtualHost *:443>" section:
.fi
.RE
.PP
.SS Actual usernames and passwords for authentication
.PP
On a community server where the users have accounts, wiki editing can be limited
to the system'\&s users.\&
.PP
In order to do this, install the \fBmod-authnz-external\fR module for Apache and the
\fBpwauth\fR binary.\& The module allows the password checking normally done inside
Apache to be done by an separate external program running outside of Apache.\&
.PP
Here'\&s an example configuration:
.PP
.nf
.RS 4
AddExternalAuth pwauth /usr/sbin/pwauth
SetExternalAuthMethod pwauth pipe

<LocationMatch "^/(edit|save|add|append|upload|drop)/">
AuthType Basic
AuthName "Password Required"
AuthBasicProvider external
AuthExternal pwauth
Require valid-user
</LocationMatch>
.fi
.RE
.PP
.SS Different logins for different access rights
.PP
What if you have a site with various subdirectories and each subdirectory is for
a different group of friends?\& You can set this up using your webserver.\& One way
to do this is to require specific usernames (which must have a password in the
password file mentioned above.\&
.PP
This requires a valid login by the user "alex" or "berta":
.PP
.nf
.RS 4
<LocationMatch "^/(edit|save|add|append|upload|drop)/intetebi/">
Require user alex berta
</LocationMatch>
.fi
.RE
.PP
.SS Private wikis
.PP
Based on the above, you can prevent people from \fIreading\fR the wiki.\& The location
must cover all the URLs in order to protect everything.\&
.PP
.nf
.RS 4
<Location />
AuthType Basic
AuthName "Password Required"
AuthUserFile /home/oddmu/\&.htpasswd
Require valid-user
</Location>
.fi
.RE
.PP
.SS Subdirectories as separate sites
.PP
The way Oddmu handles subdirectories is that all files and directories are
visible, except for "hidden" files and directories (whose name starts with a
period).\& Specifically, do not rely on Apache to hide locations in subdirectories
from public view.\& Search reveals the existence of these pages and produces an
extract, even if users cannot follow the links.\& Archive links pack all the
subdirectories, including locations you may have hidden from view using Apache.\&
extract, even if users cannot follow the links.\& The Sitemap lists all pages,
including subdirectories.\& Archive links pack all the subdirectories, including
locations you may have hidden from view using Apache.\&
.PP
If you to treat subdirectories as separate sites, you need to set the
environment variable ODDMU_FILTER to a regular expression matching the those
@@ -338,46 +400,13 @@ In this case, "/css/oddmu-2023.\&css" would be the name of your stylesheet.\& If
your document root is "/home/oddmu", then the filename of your stylesheet would
have to be "/home/oddmu/css/oddmu-2023.\&css" for this to work.\&
.PP
.SS Different logins for different access rights
.PP
What if you have a site with various subdirectories and each subdirectory is for
a different group of friends?\& You can set this up using your webserver.\& One way
to do this is to require specific usernames (which must have a password in the
password file mentioned above.\&
.PP
This requires a valid login by the user "alex" or "berta":
.PP
.nf
.RS 4
<LocationMatch "^/(edit|save|add|append|upload|drop)/intetebi/">
Require user alex berta
</LocationMatch>
.fi
.RE
.PP
.SS Private wikis
.PP
Based on the above, you can prevent people from \fIreading\fR the wiki.\& The location
must cover all the URLs in order to protect everything.\&
.PP
.nf
.RS 4
<Location />
AuthType Basic
AuthName "Password Required"
AuthUserFile /home/oddmu/\&.htpasswd
Require valid-user
</Location>
.fi
.RE
.PP
.SS Virtual hosting
.PP
Virtual hosting in this context means that the program serves two different
sites for two different domains from the same machine.\& Oddmu doesn'\&t support
that, but your webserver does.\& Therefore, start an Oddmu instance for every
domain name, each listening on a different port.\& Then set up your web server
such that ever domain acts as a reverse proxy to a different Oddmu instance.\&
such that every domain proxies for a different Oddmu instance.\&
.PP
.SH SEE ALSO
.PP

@@ -40,7 +40,7 @@ ServerAdmin alex@alexschroeder.ch
<VirtualHost *:443>
ServerName transjovian.org
SSLEngine on
ProxyPassMatch "^/((view|preview|diff|edit|save|add|append|upload|drop|search|archive)/(.*))?$" \
ProxyPassMatch "^/((view|preview|diff|edit|save|add|append|upload|drop|search|archive)/(.*)|sitemap\.xml)?$" \
"http://localhost:8080/$1"
</VirtualHost>
```
@@ -106,13 +106,13 @@ ServerAdmin alex@alexschroeder.ch
ServerName transjovian.org
ProxyPassMatch "^/((view|diff|search|archive)/(.*))?$" \
"http://localhost:8080/$1"
RedirectMatch "^/((edit|save|add|append|upload|drop)/(.*))?$" \
RedirectMatch "^/((edit|save|add|append|upload|drop)/(.*)|sitemap\.xml)?$" \
"https://transjovian.org/$1"
</VirtualHost>
<VirtualHost *:443>
ServerName transjovian.org
SSLEngine on
ProxyPassMatch "^/((view|preview|diff|edit|save|add|append|upload|drop|search|archive)/(.*))?$" \
ProxyPassMatch "^/((view|preview|diff|edit|save|add|append|upload|drop|search|archive)/(.*)|sitemap\.xml)?$" \
"http://localhost:8080/$1"
</VirtualHost>
```
@@ -144,7 +144,7 @@ You probably want to serve some static files as well (see *Serve static files*).
In that case, you need to use the ProxyPassMatch directive.

```
ProxyPassMatch "^/((view|preview|diff|edit|save|add|append|upload|drop|search|archive)/(.*))?$" \
ProxyPassMatch "^/((view|preview|diff|edit|save|add|append|upload|drop|search|archive)/(.*)|sitemap\.xml)?$" \
"unix:/run/oddmu/oddmu.sock|http://localhost/$1"
```

@@ -159,7 +159,7 @@ A workaround is to add the redirect manually and drop the question-mark:

```
RedirectMatch "^/$" "/view/index"
ProxyPassMatch "^/((view|preview|diff|edit|save|add|append|upload|drop|search|archive)/(.*))$" \
ProxyPassMatch "^/((view|preview|diff|edit|save|add|append|upload|drop|search|archive)/(.*)|sitemap\.xml)$" \
"unix:/run/oddmu/oddmu.sock|http://localhost/$1"
```

@@ -209,12 +209,68 @@ to your "<VirtualHost \*:443>" section:
</LocationMatch>
```

## Actual usernames and passwords for authentication

On a community server where the users have accounts, wiki editing can be limited
to the system's users.

In order to do this, install the *mod-authnz-external* module for Apache and the
*pwauth* binary. The module allows the password checking normally done inside
Apache to be done by an separate external program running outside of Apache.

Here's an example configuration:

```
AddExternalAuth pwauth /usr/sbin/pwauth
SetExternalAuthMethod pwauth pipe

<LocationMatch "^/(edit|save|add|append|upload|drop)/">
AuthType Basic
AuthName "Password Required"
AuthBasicProvider external
AuthExternal pwauth
Require valid-user
</LocationMatch>
```

## Different logins for different access rights

What if you have a site with various subdirectories and each subdirectory is for
a different group of friends? You can set this up using your webserver. One way
to do this is to require specific usernames (which must have a password in the
password file mentioned above.

This requires a valid login by the user "alex" or "berta":

```
<LocationMatch "^/(edit|save|add|append|upload|drop)/intetebi/">
Require user alex berta
</LocationMatch>
```

## Private wikis

Based on the above, you can prevent people from _reading_ the wiki. The location
must cover all the URLs in order to protect everything.

```
<Location />
AuthType Basic
AuthName "Password Required"
AuthUserFile /home/oddmu/.htpasswd
Require valid-user
</Location>
```

## Subdirectories as separate sites

The way Oddmu handles subdirectories is that all files and directories are
visible, except for "hidden" files and directories (whose name starts with a
period). Specifically, do not rely on Apache to hide locations in subdirectories
from public view. Search reveals the existence of these pages and produces an
extract, even if users cannot follow the links. Archive links pack all the
subdirectories, including locations you may have hidden from view using Apache.
extract, even if users cannot follow the links. The Sitemap lists all pages,
including subdirectories. Archive links pack all the subdirectories, including
locations you may have hidden from view using Apache.

If you to treat subdirectories as separate sites, you need to set the
environment variable ODDMU_FILTER to a regular expression matching the those
@@ -291,42 +347,13 @@ In this case, "/css/oddmu-2023.css" would be the name of your stylesheet. If
your document root is "/home/oddmu", then the filename of your stylesheet would
have to be "/home/oddmu/css/oddmu-2023.css" for this to work.

## Different logins for different access rights

What if you have a site with various subdirectories and each subdirectory is for
a different group of friends? You can set this up using your webserver. One way
to do this is to require specific usernames (which must have a password in the
password file mentioned above.

This requires a valid login by the user "alex" or "berta":

```
<LocationMatch "^/(edit|save|add|append|upload|drop)/intetebi/">
Require user alex berta
</LocationMatch>
```

## Private wikis

Based on the above, you can prevent people from _reading_ the wiki. The location
must cover all the URLs in order to protect everything.

```
<Location />
AuthType Basic
AuthName "Password Required"
AuthUserFile /home/oddmu/.htpasswd
Require valid-user
</Location>
```

## Virtual hosting

Virtual hosting in this context means that the program serves two different
sites for two different domains from the same machine. Oddmu doesn't support
that, but your webserver does. Therefore, start an Oddmu instance for every
domain name, each listening on a different port. Then set up your web server
such that ever domain acts as a reverse proxy to a different Oddmu instance.
such that every domain proxies for a different Oddmu instance.

# SEE ALSO

@@ -5,7 +5,7 @@
.nh
.ad l
.\" Begin generated content:
.TH "ODDMU-EXPORT" "1" "2024-08-29"
.TH "ODDMU-EXPORT" "1" "2026-01-03"
.PP
.SH NAME
.PP
@@ -22,8 +22,8 @@ You probably want to redirect this into a file so that you can upload and import
it somewhere.\&
.PP
Note that this only handles pages (Markdown files).\& All other files (images,
PDFs, whatever else you uploaded) are not part of the feed and has to be
uploaded to the new platform in some other way.\&
PDFs, whatever else you uploaded) are not part of the feed and have to be
uploaded to the new platform using some other way.\&
.PP
The \fB-template\fR option specifies the template to use.\& If the template filename
ends in \fI.\&xml\fR, \fI.\&html\fR or \fI.\&rss\fR, it is assumed to contain XML and the optional
@@ -31,7 +31,8 @@ XML preamble is printed and appropriate escaping rules are used.\&
.PP
.SH FILES
.PP
By default, the export uses the \fB\fRfeed.\&html\fB\fR template in the current directory.\&
By default, the export uses the feed template ("feed.\&html") in the current
directory.\&
.PP
.SH EXAMPLES
.PP

@@ -15,8 +15,8 @@ You probably want to redirect this into a file so that you can upload and import
it somewhere.

Note that this only handles pages (Markdown files). All other files (images,
PDFs, whatever else you uploaded) are not part of the feed and has to be
uploaded to the new platform in some other way.
PDFs, whatever else you uploaded) are not part of the feed and have to be
uploaded to the new platform using some other way.

The *-template* option specifies the template to use. If the template filename
ends in _.xml_, _.html_ or _.rss_, it is assumed to contain XML and the optional
@@ -24,7 +24,8 @@ XML preamble is printed and appropriate escaping rules are used.

# FILES

By default, the export uses the **feed.html** template in the current directory.
By default, the export uses the feed template ("feed.html") in the current
directory.

# EXAMPLES

59 man/oddmu-feed.1 Normal file
@@ -0,0 +1,59 @@
.\" Generated by scdoc 1.11.3
.\" Complete documentation for this program is not available as a GNU info page
.ie \n(.g .ds Aq \(aq
.el .ds Aq '
.nh
.ad l
.\" Begin generated content:
.TH "ODDMU-FEED" "1" "2025-12-31"
.PP
.SH NAME
.PP
oddmu-feed - render Oddmu page feed
.PP
.SH SYNOPSIS
.PP
\fBoddmu feed\fR \fIpage-name\fR .\&.\&.\&
.PP
.SH DESCRIPTION
.PP
The "feed" subcommand opens the given Markdown files and writes the resulting
RSS files without item limit (ordinarily, this default is 10 items per feed).\&
This uses the "feed.\&html" template.\& Use "-" as the page name if you want to read
Markdown from \fBstdin\fR.\&
.PP
Unlike the feeds generated by the \fBstatic\fR subcommand, the \fBfeed\fR command does
not limit the feed to the ten most recent items.\& Instead, all items on the list
are turned into feed items.\&
.PP
Furthermore, if the items on the list are blog posts (their page name starts
with an ISO date), then this ISO date is used for the last update date to the
page instead of the last modification time of the file.\& The idea, more or less,
is that this feed is an archive feed and that in this context the creation date
is more important than the last modification date.\&
.PP
.SH EXAMPLES
.PP
Generate "emacs.\&rss" from "emacs.\&md":
.PP
.nf
.RS 4
oddmu feed emacs\&.md
.fi
.RE
.PP
Alternatively:
.PP
.nf
.RS 4
oddmu feed - < emacs\&.md > emacs\&.rss
.fi
.RE
.PP
.SH SEE ALSO
.PP
\fIoddmu\fR(1), \fIoddmu-export\fR(1), \fIoddmu-static\fR(1)
.PP
.SH AUTHORS
.PP
Maintained by Alex Schroeder <alex@gnu.\&org>.\&
48 man/oddmu-feed.1.txt Normal file
@@ -0,0 +1,48 @@
ODDMU-FEED(1)

# NAME

oddmu-feed - render Oddmu page feed

# SYNOPSIS

*oddmu feed* _page-name_ ...

# DESCRIPTION

The "feed" subcommand opens the given Markdown files and writes the resulting
RSS files without item limit (ordinarily, this default is 10 items per feed).
This uses the "feed.html" template. Use "-" as the page name if you want to read
Markdown from *stdin*.

Unlike the feeds generated by the *static* subcommand, the *feed* command does
not limit the feed to the ten most recent items. Instead, all items on the list
are turned into feed items.

Furthermore, if the items on the list are blog posts (their page name starts
with an ISO date), then this ISO date is used for the last update date to the
page instead of the last modification time of the file. The idea, more or less,
is that this feed is an archive feed and that in this context the creation date
is more important than the last modification date.

# EXAMPLES

Generate "emacs.rss" from "emacs.md":

```
oddmu feed emacs.md
```

Alternatively:

```
oddmu feed - < emacs.md > emacs.rss
```

# SEE ALSO

_oddmu_(1), _oddmu-export_(1), _oddmu-static_(1)

# AUTHORS

Maintained by Alex Schroeder <alex@gnu.org>.
@@ -5,7 +5,7 @@
.nh
.ad l
.\" Begin generated content:
.TH "ODDMU-FILTER" "7" "2024-09-30"
.TH "ODDMU-FILTER" "7" "2026-01-03"
.PP
.SH NAME
.PP
@@ -13,13 +13,13 @@ oddmu-filter - keeping subdirectories separate
.PP
.SH DESCRIPTION
.PP
There are actions such as searching and archiving that act on multiple pages,
not just a single page.\& These actions walk the directory tree, including all
subdirectories.\& In some cases, this is not desirable.\&
There are actions such as producing the sitemap, searching and archiving that
act on multiple pages, not just a single page.\& These actions walk the directory
tree, including all subdirectories.\& In some cases, this is not desirable.\&
.PP
Sometimes, subdirectories are separate sites, like the sites of other projects
or different people.\& Depending on how you think about it, you might not want to
include those "sites" in searches or archives of the whole site.\&
include those "sites" in searches, sitemaps or archives of the whole site.\&
.PP
Since directory tree actions always start in the directory the visitor is
currently looking at, directory tree actions starting in a "separate site"

@@ -6,13 +6,13 @@ oddmu-filter - keeping subdirectories separate

# DESCRIPTION

There are actions such as searching and archiving that act on multiple pages,
not just a single page. These actions walk the directory tree, including all
subdirectories. In some cases, this is not desirable.
There are actions such as producing the sitemap, searching and archiving that
act on multiple pages, not just a single page. These actions walk the directory
tree, including all subdirectories. In some cases, this is not desirable.

Sometimes, subdirectories are separate sites, like the sites of other projects
or different people. Depending on how you think about it, you might not want to
include those "sites" in searches or archives of the whole site.
include those "sites" in searches, sitemaps or archives of the whole site.

Since directory tree actions always start in the directory the visitor is
currently looking at, directory tree actions starting in a "separate site"

@@ -5,7 +5,7 @@
.nh
.ad l
.\" Begin generated content:
.TH "ODDMU-HTML" "1" "2025-04-05"
.TH "ODDMU-HTML" "1" "2026-01-03"
.PP
.SH NAME
.PP
@@ -13,7 +13,7 @@ oddmu-html - render Oddmu page HTML
.PP
.SH SYNOPSIS
.PP
\fBoddmu html\fR [\fB\fR-template\fB\fR \fItemplate-name\fR] \fIpage-name\fR
\fBoddmu html\fR [\fB-template\fR \fItemplate-name\fR] \fIpage-name\fR
.PP
.SH DESCRIPTION
.PP
@@ -23,7 +23,7 @@ name if you want to read Markdown from \fBstdin\fR.\&
.PP
.SH OPTIONS
.PP
\fB\fR-template\fB\fR \fItemplate-name\fR
\fB-template\fR \fItemplate-name\fR
.RS 4
Use the given template to render the page.\& Without this, the HTML lacks
html and body tags.\& The only two options that make sense are "view.\&html"

@@ -6,7 +6,7 @@ oddmu-html - render Oddmu page HTML

# SYNOPSIS

*oddmu html* [**-template** _template-name_] _page-name_
*oddmu html* [*-template* _template-name_] _page-name_

# DESCRIPTION

@@ -16,7 +16,7 @@ name if you want to read Markdown from *stdin*.

# OPTIONS

**-template** _template-name_
*-template* _template-name_
Use the given template to render the page. Without this, the HTML lacks
html and body tags. The only two options that make sense are "view.html"
and "static.html".

@@ -5,7 +5,7 @@
.nh
.ad l
.\" Begin generated content:
.TH "ODDMU-LIST" "1" "2024-08-29"
.TH "ODDMU-LIST" "1" "2025-08-31"
.PP
.SH NAME
.PP

39 man/oddmu-man.1 Normal file
@@ -0,0 +1,39 @@
.\" Generated by scdoc 1.11.3
.\" Complete documentation for this program is not available as a GNU info page
.ie \n(.g .ds Aq \(aq
.el .ds Aq '
.nh
.ad l
.\" Begin generated content:
.TH "ODDMU-MAN" "1" "2026-02-11"
.PP
.SH NAME
.PP
oddmu-man - print the manual pages
.PP
.SH SYNOPSIS
.PP
\fBoddmu man\fR
.PP
\fBoddmu man\fR \fItopic\fR
.PP
.SH DESCRIPTION
.PP
The "man" subcommand lists the topics available or prints the manual page for
the given topic.\&
.PP
Example:
.PP
.nf
.RS 4
oddmu man apache
.fi
.RE
.PP
.SH SEE ALSO
.PP
\fIoddmu\fR(1)
.PP
.SH AUTHORS
.PP
Maintained by Alex Schroeder <alex@gnu.\&org>.\&
30 man/oddmu-man.1.txt Normal file
@@ -0,0 +1,30 @@
ODDMU-MAN(1)

# NAME

oddmu-man - print the manual pages

# SYNOPSIS

*oddmu man*

*oddmu man* _topic_

# DESCRIPTION

The "man" subcommand lists the topics available or prints the manual page for
the given topic.

Example:

```
oddmu man apache
```

# SEE ALSO

_oddmu_(1)

# AUTHORS

Maintained by Alex Schroeder <alex@gnu.org>.
@@ -5,7 +5,7 @@
 .nh
 .ad l
 .\" Begin generated content:
-.TH "ODDMU-NGINX" "5" "2025-07-16"
+.TH "ODDMU-NGINX" "5" "2026-01-03"
 .PP
 .SH NAME
 .PP
@@ -27,7 +27,7 @@ section.\& Add a new \fIlocation\fR section after the existing \fIlocation\fR se
 .PP
 .nf
 .RS 4
-location ~ ^/(view|preview|diff|edit|save|add|append|upload|drop|search|archive)/ {
+location ~ ^/(view|preview|diff|edit|save|add|append|upload|drop|search|sitemap|archive)/ {
 proxy_pass http://localhost:8080;
 }
 .fi
@@ -97,7 +97,7 @@ server configuration.\& On a Debian system, that'\&d be in
 .PP
 .nf
 .RS 4
-location ~ ^/(view|preview|diff|edit|save|add|append|upload|drop|search|archive)/ {
+location ~ ^/(view|preview|diff|edit|save|add|append|upload|drop|search|sitemap|archive)/ {
 proxy_pass http://unix:/run/oddmu/oddmu\&.sock:;
 }
 .fi
@@ -19,7 +19,7 @@ The site is defined in "/etc/nginx/sites-available/default", in the _server_
 section. Add a new _location_ section after the existing _location_ section:
 
 ```
-location ~ ^/(view|preview|diff|edit|save|add|append|upload|drop|search|archive)/ {
+location ~ ^/(view|preview|diff|edit|save|add|append|upload|drop|search|sitemap|archive)/ {
 proxy_pass http://localhost:8080;
 }
 ```
@@ -81,7 +81,7 @@ server configuration. On a Debian system, that'd be in
 "/etc/nginx/sites-available/default".
 
 ```
-location ~ ^/(view|preview|diff|edit|save|add|append|upload|drop|search|archive)/ {
+location ~ ^/(view|preview|diff|edit|save|add|append|upload|drop|search|sitemap|archive)/ {
 proxy_pass http://unix:/run/oddmu/oddmu.sock:;
 }
 ```
@@ -5,7 +5,7 @@
 .nh
 .ad l
 .\" Begin generated content:
-.TH "ODDMU-RELEASES" "7" "2025-08-10"
+.TH "ODDMU-RELEASES" "7" "2026-02-11"
 .PP
 .SH NAME
 .PP
@@ -15,6 +15,99 @@ oddmu-releases - what'\&s new?\&
 .PP
 This page lists user-visible features and template changes to consider.\&
 .PP
+.SS 1.21 (unreleased)
+.PP
+Write any missing templates when Oddmu starts up.\&
+.PP
+Add man subcommand to print manual pages.\&
+.PP
+Both of these features make it possible to distribute just the binary.\&
+.PP
+.SS 1.20 (2026)
+.PP
+Add -shrink and -glob options to the \fIstatic\fR subcommand.\& See \fIoddmu-static\fR(1)
+for more.\&
+.PP
+Some tools were used to check the code (goimports, golint, gocritic).\&
+Unfortunately, the resulting changes necessitate a change in the templates
+("feed.\&html", "preview.\&html", "search.\&html", "static.\&html", "view.\&html"):
+"{{.\&Html}}" must be changed to "{{.\&HTML}}".\& One way to do this:
+.PP
+.nf
+.RS 4
+find \&. -regex \&'\&.*/(feed|preview|search|static|view).html\&'
+-exec sed -i~ \&'s/{{\&.Html}}/{{\&.HTML}}/g\&' \&'{}\&' \&'+\&'
+.fi
+.RE
+.PP
+The \fIfeed\fR subcommand uses the page URL to extract a pubDate instead of relying
+on the file'\&s last modified time.\& For a complete feed (an archive), the last
+modified time is less important.\&
+.PP
+The feed for the index page is paginated, like other feeds.\& But since it grows
+faster than any of the feeds for hashtag pages, presumably, an extra feature
+was added: on the first and on the last page of the feed, a link to the next or
+the previous year is added, if such a page exists.\& This works if at the beginning
+of every year, you move all the entries on to a dedicated year page.\& You need to
+add the necessary links to the "feed.\&html" template.\& See \fIoddmu-templates\fR(5)
+for more.\&
+.PP
+Example:
+.PP
+.nf
+.RS 4
+<rss xmlns:atom="http://www\&.w3\&.org/2005/Atom" version="2\&.0"
+xmlns:fh="http://purl\&.org/syndication/history/1\&.0">
+…
+{{if \&.PrevYear}}
+<atom:link href="https://example\&.org/view/{{\&.Dir}}{{\&.PrevYear}}\&.rss?n={{\&.N}}"
+rel="previous" type="application/rss+xml"/>
+{{end}}
+{{if \&.NextYear}}
+<atom:link href="https://example\&.org/view/{{\&.Dir}}{{\&.NextYear}}\&.rss?n={{\&.N}}"
+rel="next" type="application/rss+xml"/>
+{{end}}
+…
+.fi
+.RE
+.PP
+Add \fIsitemap\fR subcommand and handler.\& See \fIoddmu-sitemap\fR(1) for more.\& If you
+want to make it available for search engines and the like, you most likely have
+to add it to your proxy configuration.\& See \fIoddmu-apache\fR(5) or \fIoddmu-nginx\fR(5)
+for more.\&
+.PP
+.SS 1.19 (2025)
+.PP
+Add \fIfeed\fR subcommand.\& This produces a "complete" feed.\&
+.PP
+Add feed pagination for the \fIfeed\fR action.\& This produces a "paginated" feed.\&
+.PP
+See RFC 5005 for more information.\&
+.PP
+If you like the idea of feed pagination (not a given since that also helps bots
+scrape your site!\&) you need to add the necessary links to the "feed.\&html"
+template.\& See \fIoddmu-templates\fR(5) for more.\&
+.PP
+Example, adding the feed history namespace:
+.PP
+.nf
+.RS 4
+<rss xmlns:atom="http://www\&.w3\&.org/2005/Atom" version="2\&.0"
+xmlns:fh="http://purl\&.org/syndication/history/1\&.0">
+…
+{{if \&.From}}
+<atom:link rel="previous" type="application/rss+xml"
+href="https://example\&.org/view/{{\&.Path}}\&.rss?from={{\&.Prev}}&n={{\&.N}}"/>
+{{end}}
+{{if \&.Next}}
+<atom:link rel="next" type="application/rss+xml"
+href="https://example\&.org/view/{{\&.Path}}\&.rss?from={{\&.Next}}&n={{\&.N}}"/>
+{{end}}
+{{if \&.Complete}}<fh:complete/>{{end}}
+…
+.fi
+.RE
+.PP
 .SS 1.18 (2025)
 .PP
 The \fIhashtags\fR gained the option of checking and fixing the hashtag pages by
@@ -25,16 +118,15 @@ In an effort to remove features that can be handled by the web server, the
 for a better solution.\&
 .PP
 You probably need to remove a sentence linking to the list action from the
-upload template ("upload.\&html").\&
+"upload.\&html" template.\&
 .PP
 .SS 1.17 (2025)
 .PP
-You need to update the upload template ("upload.\&html").\& Many things have
-changed!\& See \fIoddmu-templates\fR(5) for more.\&
+You need to update the "upload.\&html" template.\& Many things have changed!\& See
+\fIoddmu-templates\fR(5) for more.\&
 .PP
-You probably want to ensure that the upload link on the view template
-("view.\&html") and others, if you added it, has a \fIfilename\fR and \fIpagename\fR
-parameters.\&
+You probably want to ensure that the upload link on the "view.\&html" template and
+others, if you added it, has \fIfilename\fR and \fIpagename\fR parameters.\&
 .PP
 Example:
 .PP
@@ -44,8 +136,8 @@ Example:
 .fi
 .RE
 .PP
-You need to change {{.\&Name}} to {{.\&Path}} when it is used in URLs, in the list
-template ("list.\&html").\& If you don'\&t do this, file deleting and rename may not
+You need to change {{.\&Name}} to {{.\&Path}} when it is used in URLs, in the
+"list.\&html" template.\& If you don'\&t do this, file deleting and rename may not
 work on files containing a comma, a semicolon, a question mark or a hash
 character.\& This fix was necessary because URLs for files containing a
 question mark or a hash character would end the path at this character and treat
@@ -100,7 +192,7 @@ together with appropriate permission checks.\&
 See \fIoddmu-apache\fR(5) or \fIoddmu-nginx\fR(5) for example.\&
 .PP
 In addition to that, you might want a link to the \fIlist\fR action from one of the
-existing templates.\& For example, from upload.\&html:
+existing templates.\& For example, from the "upload.\&html" template:
 .PP
 .nf
 .RS 4
@@ -137,10 +229,9 @@ These are the quotation marks currently supported: '\&foo'\& "foo" ‘foo’ ‚
 “foo” „foo“ ”foo” «foo» »foo« ‹foo› ›foo‹ 「foo」 「foo」 『foo』 – any such
 quoted text is searched as-is, including whitespace.\&
 .PP
-Add loading="lazy" for images in search.\&html
-.PP
-If you want to take advantage of this, you'\&ll need to adapt your "search.\&html"
-template accordingly.\& Use like this, for example:
+Add loading="lazy" for images in the search template.\& If you want to take
+advantage of this, you'\&ll need to adapt your "search.\&html" template accordingly.\&
+Use like this, for example:
 .PP
 .nf
 .RS 4
@@ -200,10 +291,10 @@ If you want to take advantage of this, you'\&ll need to adapt your templates
 accordingly.\& The "preview.\&html" template is a mix of "view.\&html" and
 "edit.\&html".\&
 .PP
-There is an optional change to make to copies of \fIupload.\&html\fR if you upload
-multiple images at a time.\& Instead of showing just the link to the last upload,
-you can now show the link (and the images or links, if you want to) to all the
-files uploaded.\& Use like this, for example:
+There is an optional change to make to copies of the "upload.\&html" template if
+you upload multiple images at a time.\& Instead of showing just the link to the
+last upload, you can now show the link (and the images or links, if you want to)
+to all the files uploaded.\& Use like this, for example:
 .PP
 .nf
 .RS 4
@@ -213,9 +304,9 @@ Links:<tt>{{range \&.Actual}}<br>{{end}}</tt>
 .PP
 .SS 1.9 (2024)
 .PP
-There is a change to make to copies of \fIupload.\&html\fR if subdirectories are being
-used.\& The \fILast\fR property no longer contains the directory.\& It has to be added
-to the template as follows:
+There is a change to make to copies of the "upload.\&html" template if
+subdirectories are being used.\& The \fILast\fR property no longer contains the
+directory.\& It has to be added to the template as follows:
 .PP
 .nf
 .RS 4
@@ -243,7 +334,7 @@ The upload template can use the \fIToday\fR property.\&
 The upload template comes with JavaScript that allows users to paste images or
 drag and drop files.\&
 .PP
-The upload template changed the id for the filename field from `text` to `name`.\&
+The upload template changed the id for the filename field from "text" to "name".\&
 .PP
 The source repository now comes with example templates.\&
 .PP
@@ -253,7 +344,7 @@ No user-visible changes.\& Documentation and code comments got better.\&
 .PP
 .SS 1.7 (2024)
 .PP
-Allow upload of multiple files.\& This requires an update to the \fIupload.\&html\fR
+Allow upload of multiple files.\& This requires an update to the "upload.\&html"
 template: Add the \fImultiple\fR attribute to the file input element and change the
 label from "file" to "files".\&
 .PP
@@ -8,6 +8,93 @@ oddmu-releases - what's new?
 
 This page lists user-visible features and template changes to consider.
 
+## 1.21 (unreleased)
+
+Write any missing templates when Oddmu starts up.
+
+Add man subcommand to print manual pages.
+
+Both of these features make it possible to distribute just the binary.
+
+## 1.20 (2026)
+
+Add -shrink and -glob options to the _static_ subcommand. See _oddmu-static_(1)
+for more.
+
+Some tools were used to check the code (goimports, golint, gocritic).
+Unfortunately, the resulting changes necessitate a change in the templates
+("feed.html", "preview.html", "search.html", "static.html", "view.html"):
+"{{.Html}}" must be changed to "{{.HTML}}". One way to do this:
+
+```
+find . -regex '.*/\(feed\|preview\|search\|static\|view\)\.html' \
+-exec sed -i~ 's/{{.Html}}/{{.HTML}}/g' '{}' '+'
+```
+
+The _feed_ subcommand uses the page URL to extract a pubDate instead of relying
+on the file's last modified time. For a complete feed (an archive), the last
+modified time is less important.
+
+The feed for the index page is paginated, like other feeds. But since it grows
+faster than any of the feeds for hashtag pages, presumably, an extra feature
+was added: on the first and on the last page of the feed, a link to the next or
+the previous year is added, if such a page exists. This works if at the beginning
+of every year, you move all the entries on to a dedicated year page. You need to
+add the necessary links to the "feed.html" template. See _oddmu-templates_(5)
+for more.
+
+Example:
+
+```
+<rss xmlns:atom="http://www.w3.org/2005/Atom" version="2.0"
+xmlns:fh="http://purl.org/syndication/history/1.0">
+…
+{{if .PrevYear}}
+<atom:link href="https://example.org/view/{{.Dir}}{{.PrevYear}}.rss?n={{.N}}"
+rel="previous" type="application/rss+xml"/>
+{{end}}
+{{if .NextYear}}
+<atom:link href="https://example.org/view/{{.Dir}}{{.NextYear}}.rss?n={{.N}}"
+rel="next" type="application/rss+xml"/>
+{{end}}
+…
+```
+
+Add _sitemap_ subcommand and handler. See _oddmu-sitemap_(1) for more. If you
+want to make it available for search engines and the like, you most likely have
+to add it to your proxy configuration. See _oddmu-apache_(5) or _oddmu-nginx_(5)
+for more.
+
+## 1.19 (2025)
+
+Add _feed_ subcommand. This produces a "complete" feed.
+
+Add feed pagination for the _feed_ action. This produces a "paginated" feed.
+
+See RFC 5005 for more information.
+
+If you like the idea of feed pagination (not a given since that also helps bots
+scrape your site!) you need to add the necessary links to the "feed.html"
+template. See _oddmu-templates_(5) for more.
+
+Example, adding the feed history namespace:
+
+```
+<rss xmlns:atom="http://www.w3.org/2005/Atom" version="2.0"
+xmlns:fh="http://purl.org/syndication/history/1.0">
+…
+{{if .From}}
+<atom:link rel="previous" type="application/rss+xml"
+href="https://example.org/view/{{.Path}}.rss?from={{.Prev}}&n={{.N}}"/>
+{{end}}
+{{if .Next}}
+<atom:link rel="next" type="application/rss+xml"
+href="https://example.org/view/{{.Path}}.rss?from={{.Next}}&n={{.N}}"/>
+{{end}}
+{{if .Complete}}<fh:complete/>{{end}}
+…
+```
+
 ## 1.18 (2025)
 
 The _hashtags_ gained the option of checking and fixing the hashtag pages by
@@ -18,16 +105,15 @@ _list_, _delete_ and _rename_ actions were removed again. See _oddmu-webdav_(5)
 for a better solution.
 
 You probably need to remove a sentence linking to the list action from the
-upload template ("upload.html").
+"upload.html" template.
 
 ## 1.17 (2025)
 
-You need to update the upload template ("upload.html"). Many things have
-changed! See _oddmu-templates_(5) for more.
+You need to update the "upload.html" template. Many things have changed! See
+_oddmu-templates_(5) for more.
 
-You probably want to ensure that the upload link on the view template
-("view.html") and others, if you added it, has a _filename_ and _pagename_
-parameters.
+You probably want to ensure that the upload link on the "view.html" template and
+others, if you added it, has _filename_ and _pagename_ parameters.
 
 Example:
 
@@ -35,8 +121,8 @@ Example:
 <a href="/upload/{{.Dir}}?filename={{.Base}}-1.jpg&pagename={{.Base}}">Upload</a>
 ```
 
-You need to change {{.Name}} to {{.Path}} when it is used in URLs, in the list
-template ("list.html"). If you don't do this, file deleting and rename may not
+You need to change {{.Name}} to {{.Path}} when it is used in URLs, in the
+"list.html" template. If you don't do this, file deleting and rename may not
 work on files containing a comma, a semicolon, a question mark or a hash
 character. This fix was necessary because URLs for files containing a
 question mark or a hash character would end the path at this character and treat
@@ -91,7 +177,7 @@ together with appropriate permission checks.
 See _oddmu-apache_(5) or _oddmu-nginx_(5) for example.
 
 In addition to that, you might want a link to the _list_ action from one of the
-existing templates. For example, from upload.html:
+existing templates. For example, from the "upload.html" template:
 
 ```
 <p>You can rename and delete files <a href="/list/{{.Dir}}">from the file list</a>.
@@ -124,10 +210,9 @@ These are the quotation marks currently supported: 'foo' "foo" ‘foo’ ‚foo
 “foo” „foo“ ”foo” «foo» »foo« ‹foo› ›foo‹ 「foo」 「foo」 『foo』 – any such
 quoted text is searched as-is, including whitespace.
 
-Add loading="lazy" for images in search.html
-
-If you want to take advantage of this, you'll need to adapt your "search.html"
-template accordingly. Use like this, for example:
+Add loading="lazy" for images in the search template. If you want to take
+advantage of this, you'll need to adapt your "search.html" template accordingly.
+Use like this, for example:
 
 ```
 {{range .Items}}
@@ -179,10 +264,10 @@ If you want to take advantage of this, you'll need to adapt your templates
 accordingly. The "preview.html" template is a mix of "view.html" and
 "edit.html".
 
-There is an optional change to make to copies of _upload.html_ if you upload
-multiple images at a time. Instead of showing just the link to the last upload,
-you can now show the link (and the images or links, if you want to) to all the
-files uploaded. Use like this, for example:
+There is an optional change to make to copies of the "upload.html" template if
+you upload multiple images at a time. Instead of showing just the link to the
+last upload, you can now show the link (and the images or links, if you want to)
+to all the files uploaded. Use like this, for example:
 
 ```
 Links:<tt>{{range .Actual}}<br>{{end}}</tt>
@@ -190,9 +275,9 @@ Links:<tt>{{range .Actual}}<br>{{end}}</tt>
 
 ## 1.9 (2024)
 
-There is a change to make to copies of _upload.html_ if subdirectories are being
-used. The _Last_ property no longer contains the directory. It has to be added
-to the template as follows:
+There is a change to make to copies of the "upload.html" template if
+subdirectories are being used. The _Last_ property no longer contains the
+directory. It has to be added to the template as follows:
 
 ```
 {{if ne .Last ""}}
@@ -216,7 +301,7 @@ The upload template can use the _Today_ property.
 The upload template comes with JavaScript that allows users to paste images or
 drag and drop files.
 
-The upload template changed the id for the filename field from `text` to `name`.
+The upload template changed the id for the filename field from "text" to "name".
 
 The source repository now comes with example templates.
 
@@ -226,7 +311,7 @@ No user-visible changes. Documentation and code comments got better.
 
 ## 1.7 (2024)
 
-Allow upload of multiple files. This requires an update to the _upload.html_
+Allow upload of multiple files. This requires an update to the "upload.html"
 template: Add the _multiple_ attribute to the file input element and change the
 label from "file" to "files".
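The `{{.Html}}` to `{{.HTML}}` one-liner from the 1.20 release notes can be rehearsed in a scratch directory before touching real templates. This sketch assumes GNU find (whose default regex dialect uses `\(`, `\|` for grouping and alternation) and GNU sed; the file names and contents are made up for the demonstration:

```shell
# Dry run of the template rename in a throwaway directory.
demo=$(mktemp -d)
cd "$demo"
printf '<body>{{.Html}}</body>\n' > view.html
printf '<body>{{.Html}}</body>\n' > edit.html   # not in the list: must stay untouched

find . -regex '.*/\(feed\|preview\|search\|static\|view\)\.html' \
    -exec sed -i~ 's/{{.Html}}/{{.HTML}}/g' '{}' '+'

cat view.html    # now contains {{.HTML}}; view.html~ keeps the backup
cat edit.html    # still contains {{.Html}}
```

Because `sed -i~` writes a `~`-suffixed backup, a botched run can be undone by moving the backups back, which matches the "~ means backup file" convention Oddmu itself uses.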
man/oddmu-sitemap.1 (new file, 49 lines)
@@ -0,0 +1,49 @@
+.\" Generated by scdoc 1.11.3
+.\" Complete documentation for this program is not available as a GNU info page
+.ie \n(.g .ds Aq \(aq
+.el .ds Aq '
+.nh
+.ad l
+.\" Begin generated content:
+.TH "ODDMU-SITEMAP" "1" "2026-01-03"
+.PP
+.SH NAME
+.PP
+oddmu-sitemap - print static sitemap.\&xml
+.PP
+.SH SYNOPSIS
+.PP
+\fBoddmu sitemap\fR [\fB-base\fR \fIURL\fR] [\fB-filter\fR \fIregexp\fR]
+.PP
+.SH DESCRIPTION
+.PP
+The "sitemap" subcommand prints the list of all pages in Sitemap format.\& Oddmu
+already serves the sitemap at the URL "/sitemap.\&xml" but if you'\&d prefer to
+provide a static file, use this command and redirect the output to a file called
+"sitemap.\&xml" in your document root at regular intervals.\&
+.PP
+If you do this, don'\&t proxy the "/sitemap" URL in the web server configuration.\&
+.PP
+Your "robots.\&txt" file, if you have one, should point at the sitemap you
+provide.\&
+.PP
+.SH OPTIONS
+.PP
+\fB-base\fR \fIURL\fR
+.RS 4
+The base URL is something like "https://example.\&org/view/".\&
+.RE
+\fB-filter\fR \fIregexp\fR
+.RS 4
+A regular expression matching the pages to exclude from the sitemap.\&
+This emulates the effect of the ODDMU_FILTER environment variable.\&
+.PP
+.RE
+.SH SEE ALSO
+.PP
+\fIoddmu\fR(1), \fIoddmu-filter\fR(7), \fIoddmu-apache\fR(1), \fIoddmu-nginx\fR(1),
+https://www.\&sitemaps.\&org/
+.PP
+.SH AUTHORS
+.PP
+Maintained by Alex Schroeder <alex@gnu.\&org>.\&
man/oddmu-sitemap.1.txt (new file, 38 lines)
@@ -0,0 +1,38 @@
+ODDMU-SITEMAP(1)
+
+# NAME
+
+oddmu-sitemap - print static sitemap.xml
+
+# SYNOPSIS
+
+*oddmu sitemap* [*-base* _URL_] [*-filter* _regexp_]
+
+# DESCRIPTION
+
+The "sitemap" subcommand prints the list of all pages in Sitemap format. Oddmu
+already serves the sitemap at the URL "/sitemap.xml" but if you'd prefer to
+provide a static file, use this command and redirect the output to a file called
+"sitemap.xml" in your document root at regular intervals.
+
+If you do this, don't proxy the "/sitemap" URL in the web server configuration.
+
+Your "robots.txt" file, if you have one, should point at the sitemap you
+provide.
+
+# OPTIONS
+
+*-base* _URL_
+	The base URL is something like "https://example.org/view/".
+*-filter* _regexp_
+	A regular expression matching the pages to exclude from the sitemap.
+	This emulates the effect of the ODDMU_FILTER environment variable.
+
+# SEE ALSO
+
+_oddmu_(1), _oddmu-filter_(7), _oddmu-apache_(1), _oddmu-nginx_(1),
+https://www.sitemaps.org/
+
+# AUTHORS
+
+Maintained by Alex Schroeder <alex@gnu.org>.
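A deployment sketch tying the manual page's advice together: regenerate the static sitemap at regular intervals and point "robots.txt" at it. The data directory, domain, and the idea of running this from cron are assumptions, not from the manual; `oddmu` must be on the PATH, and the whole block is a no-op when the assumed directory does not exist:

```shell
# Hypothetical periodic job; the wiki directory and domain are made up.
wiki=/var/www/wiki                 # assumption: Oddmu's working directory
if [ -d "$wiki" ]; then
    cd "$wiki"
    # Static sitemap, so /sitemap need not be proxied to Oddmu:
    oddmu sitemap -base 'https://example.org/view/' > sitemap.xml
    # Point crawlers at it, once:
    grep -q '^Sitemap:' robots.txt 2>/dev/null ||
        echo 'Sitemap: https://example.org/sitemap.xml' >> robots.txt
fi
```

Remember the manual's caveat: once the file is served statically, do not also proxy the "/sitemap" URL to Oddmu.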
@@ -5,7 +5,7 @@
 .nh
 .ad l
 .\" Begin generated content:
-.TH "ODDMU-STATIC" "1" "2024-08-29"
+.TH "ODDMU-STATIC" "1" "2026-02-06"
 .PP
 .SH NAME
 .PP
@@ -13,7 +13,7 @@ oddmu-static - create a static copy of the site
 .PP
 .SH SYNOPSIS
 .PP
-\fBoddmu static\fR \fIdir-name\fR
+\fBoddmu static\fR [\fB-jobs\fR \fIn\fR] [\fB-glob\fR \fIpattern\fR] [\fB-shrink\fR] \fIdir-name\fR
 .PP
 .SH DESCRIPTION
 .PP
@@ -28,7 +28,8 @@ pages get ".\&html" appended.\&
 If a page has a name case-insensitively matching a hashtag, a feed file is
 generated (ending with ".\&rss") if any suitable links are found.\& A suitable link
 for a feed item must appear in a bullet list item using an asterisk ("*").\& If
-no feed items are found, no feed is written.\&
+no feed items are found, no feed is written.\& The feed is limited to the ten most
+recent items.\&
 .PP
 Hidden files and directories (starting with a ".\&") and backup files (ending with
 a "~") are skipped.\&
@@ -38,12 +39,11 @@ the images take a lot more space than the text.\& On my blog in 2023 I had 2.\&6
 GiB of JPG files and 0.\&02 GiB of Markdown files.\& There is no point in copying
 all those images, most of the time.\&
 .PP
-Note, however: Hard links cannot span filesystems.\& A hard link is just an extra
-name for the same file.\& This is why the destination directory for the static
-site has to be on same filesystem as the current directory, if it contains any
-other files besides Markdown files.\&
+As hard links cannot span filesystems, all other files are \fIcopied\fR if the
+destination directory for the static site is not on the same filesystem as the
+current directory.\&
 .PP
-Furthermore, in-place editing changes the file for all names.\& Avoid editing the
+Note that in-place editing changes the file for all names.\& Avoid editing the
 hard-linked files (anything that'\&s not an HTML file) in the destination
 directory, just to be on the safe side.\& Usually you should be fine, as an editor
 moves the file that'\&s being edited to a backup file and creates a new file.\& But
@@ -51,6 +51,35 @@ then again, who knows.\& A SQLite file, for example, would change in-place, and
 therefore making changes to it in the destination directory would change the
 original, too.\&
 .PP
+.SH OPTIONS
+.PP
+\fB-jobs\fR \fIn\fR
+.RS 4
+By default, two jobs are used to process the files.\& If your machine has
+more cores, you can increase the number of jobs.\&
+.PP
+.RE
+\fB-glob\fR \fIpattern\fR
+.RS 4
+By default, all files are used for the static export.\& You can limit the
+files used by providing a shell file name pattern.\& A "*" matches any
+number of characters; a "?\&" matches exactly one character; "[a-z]"
+matches a character listed, including ranges; "[^a-z]" matches a
+character not listed, including ranges; "\e" a backslash escapes the
+following character.\& You must use quotes around the pattern if you are
+using a shell as the shell would otherwise expand the pattern, resulting
+in the error "Exactly one target directory is required".\&
+.PP
+.PP
+.RE
+\fB-shrink\fR
+.RS 4
+By default, images are linked or copied.\& With this option, JPEG, PNG and
+WebP files are scaled down if more than 800 pixels wide and the quality
+is set to 30% for JPEG and WebP files.\& This is \fIbad quality\fR but the
+result is that these image files are very small.\&
+.PP
+.RE
 .SH EXAMPLES
 .PP
 Generate a static copy of the site, but only loading language detection for
@@ -89,7 +118,11 @@ speed language determination up.\&
 .PP
 .SH SEE ALSO
 .PP
-\fIoddmu\fR(1), \fIoddmu-templates\fR(5)
+See \fIoddmu\fR(1) and \fIoddmu-templates\fR(5) for general information.\&
+.PP
+See \fIoddmu-html\fR(1) for a subcommand that converts individual page files to HTML
+and see \fIoddmu-feed\fR(1) for a subcommand that generates feeds for individual
+files.\&
 .PP
 .SH AUTHORS
 .PP
@@ -6,7 +6,7 @@ oddmu-static - create a static copy of the site
 
 # SYNOPSIS
 
-*oddmu static* _dir-name_
+*oddmu static* [*-jobs* _n_] [*-glob* _pattern_] [*-shrink*] _dir-name_
 
 # DESCRIPTION
 
@@ -21,7 +21,8 @@ pages get ".html" appended.
 If a page has a name case-insensitively matching a hashtag, a feed file is
 generated (ending with ".rss") if any suitable links are found. A suitable link
 for a feed item must appear in a bullet list item using an asterisk ("\*"). If
-no feed items are found, no feed is written.
+no feed items are found, no feed is written. The feed is limited to the ten most
+recent items.
 
 Hidden files and directories (starting with a ".") and backup files (ending with
 a "~") are skipped.
@@ -31,12 +32,11 @@ the images take a lot more space than the text. On my blog in 2023 I had 2.62
 GiB of JPG files and 0.02 GiB of Markdown files. There is no point in copying
 all those images, most of the time.
 
-Note, however: Hard links cannot span filesystems. A hard link is just an extra
-name for the same file. This is why the destination directory for the static
-site has to be on same filesystem as the current directory, if it contains any
-other files besides Markdown files.
+As hard links cannot span filesystems, all other files are _copied_ if the
+destination directory for the static site is not on the same filesystem as the
+current directory.
 
-Furthermore, in-place editing changes the file for all names. Avoid editing the
+Note that in-place editing changes the file for all names. Avoid editing the
 hard-linked files (anything that's not an HTML file) in the destination
 directory, just to be on the safe side. Usually you should be fine, as an editor
 moves the file that's being edited to a backup file and creates a new file. But
@@ -44,6 +44,29 @@ then again, who knows. A SQLite file, for example, would change in-place, and
 therefore making changes to it in the destination directory would change the
 original, too.
 
+# OPTIONS
+
+*-jobs* _n_
+	By default, two jobs are used to process the files. If your machine has
+	more cores, you can increase the number of jobs.
+
+*-glob* _pattern_
+	By default, all files are used for the static export. You can limit the
+	files used by providing a shell file name pattern. A "\*" matches any
+	number of characters; a "?" matches exactly one character; "[a-z]"
+	matches a character listed, including ranges; "[^a-z]" matches a
+	character not listed, including ranges; "\\" a backslash escapes the
+	following character. You must use quotes around the pattern if you are
+	using a shell as the shell would otherwise expand the pattern, resulting
+	in the error "Exactly one target directory is required".
+
+*-shrink*
+	By default, images are linked or copied. With this option, JPEG, PNG and
+	WebP files are scaled down if more than 800 pixels wide and the quality
+	is set to 30% for JPEG and WebP files. This is _bad quality_ but the
+	result is that these image files are very small.
+
 # EXAMPLES
 
 Generate a static copy of the site, but only loading language detection for
@@ -80,7 +103,11 @@ speed language determination up.
 
 # SEE ALSO
 
-_oddmu_(1), _oddmu-templates_(5)
+See _oddmu_(1) and _oddmu-templates_(5) for general information.
+
+See _oddmu-html_(1) for a subcommand that converts individual page files to HTML
+and see _oddmu-feed_(1) for a subcommand that generates feeds for individual
+files.
 
 # AUTHORS
 
@@ -5,7 +5,7 @@
.nh
.ad l
.\" Begin generated content:
.TH "ODDMU-TEMPLATES" "5" "2025-04-26" "File Formats Manual"
.TH "ODDMU-TEMPLATES" "5" "2026-01-03" "File Formats Manual"
.PP
.SH NAME
.PP
@@ -31,12 +31,12 @@ placeholders.\&
.IP \(bu 4
\fIfeed.\&html\fR uses a \fIfeed\fR
.IP \(bu 4
\fIlist.\&html\fR uses a \fIlist\fR
.IP \(bu 4
\fIpreview.\&html\fR uses a \fIpage\fR
.IP \(bu 4
\fIsearch.\&html\fR uses a \fIsearch\fR
.IP \(bu 4
\fIsitemap.\&html\fR uses a \fIsitemap\fR
.IP \(bu 4
\fIstatic.\&html\fR uses a \fIpage\fR
.IP \(bu 4
\fIupload.\&html\fR uses an \fIupload\fR
@@ -132,31 +132,45 @@ An item is a page plus a date.\& All the properties of a page can be used (see
.PP
\fI{{.\&Date}}\fR is the date of the last update to the page, in RFC 822 format.\&
.PP
.SS List
In order to paginate feeds, the following attributes are also available in the
feed:
.PP
The list contains a directory name and an array of files.\&
\fI{{.\&From}}\fR is the item number where the feed starts.\& The first page starts at
0.\& This can be passed to Oddmu via the query parameter \fIfrom\fR.\&
.PP
\fI{{.\&Dir}}\fR is the directory name that is being listed, percent-encoded.\&
\fI{{.\&N}}\fR is the number of items per page.\& The default is 10.\& This can be passed to
Oddmu via the query parameter \fIn\fR.\& If this is set to 0, the feed is not
paginated.\&
.PP
\fI{{.\&Files}}\fR is the array of files.\& To refer to them, you need to use a \fI{{range
.\&Files}}\fR … \fI{{end}}\fR construct.\&
\fI{{.\&Complete}}\fR is a boolean that is true if the feed is not paginated.\& Such a
feed cannot have a previous or next page.\&
.PP
Each file has the following attributes:
\fI{{.\&Prev}}\fR is the item number where the previous page of the feed starts.\& On
the first page, its value is 0 instead of -10.\& You need to test if \fI{{.\&From}}\fR
is non-zero (in which case this is not the first page) before using \fI{{.\&Prev}}\fR.\&
.PP
\fI{{.\&Name}}\fR is the filename.\& The ".\&md" suffix for Markdown files is part of the
name (unlike page names).\&
\fI{{.\&Next}}\fR is the item number where the next feed starts, if there are any
items left.\& If there are none, its value is 0.\&
.PP
\fI{{.\&Path}}\fR is the page name, percent-encoded.\&
\fI{{.\&PrevYear}}\fR is the year for the previous yearly archive.\& This is added on
the index page or on year pages.\& Year pages are pages whose name is just a
number (presumably a year).\& The property is only set on the first page of the
feed, if the previous year page exists.\& The previous year is one higher than the
year currently shown (if on a year page) or the current year (if looking at the
index), since the feed goes backwards in time as new entries appear at the top.\&
When looking at the page "2024" the previous page is "2025".\& Strangely enough,
if the current year is 2026 but a page "2027" already exists, and the feed for
the index page is generated, then "2027" (in the future) is the previous page.\&
If the current year is 2026, the feed of the index page points to "2025" as the
next year, if it exists.\& When the feed for "2025" is generated, however, the
previous year is not set, assuming that the "2026" page does not yet exist and
it is strange to consider the index page "the previous year" of "2025" in 2026.\&
This might change in the future.\& If it isn'\&t set, its value is 0.\&
.PP
\fI{{.\&Title}}\fR is the page title, if the file in question is a Markdown file.\&
.PP
\fI{{.\&IsDir}}\fR is a boolean used to indicate that this file is a directory.\&
.PP
\fI{{.\&IsUp}}\fR is a boolean used to indicate the entry for the parent directory
(the first file in the array, unless the directory being listed is the top
directory).\& The filename of this file is ".\&.\&".\&
.PP
\fI{{.\&Date}}\fR is the last modification date of the file.\&
\fI{{.\&NextYear}}\fR is the year for the next yearly archive.\& See above for an
explanation.\& The next year is one lower than the year currently shown (if on a
year page) or the current year (if looking at the index).\& If it isn'\&t set, its
value is 0.\&
.PP
.SS Search
.PP
@@ -196,6 +210,16 @@ are only listed if a search term matches.\&
\fI{{.\&Html}}\fR is the image alt-text with a bold tag used to highlight the first
search term that matched.\&
.PP
.SS Sitemap
.PP
The sitemap contains a list of URLs, each with its location:
.PP
\fI{{.\&URL}}\fR is the list of URLs.\&
.PP
Each URL has the following attributes:
.PP
\fI{{.\&Loc}}\fR is the actual page URL.\&
.PP
.SS Upload
.PP
\fI{{.\&Dir}}\fR is the directory where the uploaded file ends up, based on the URL

@@ -18,9 +18,9 @@ placeholders.
- _diff.html_ uses a _page_
- _edit.html_ uses a _page_
- _feed.html_ uses a _feed_
- _list.html_ uses a _list_
- _preview.html_ uses a _page_
- _search.html_ uses a _search_
- _sitemap.html_ uses a _sitemap_
- _static.html_ uses a _page_
- _upload.html_ uses an _upload_
- _view.html_ uses a _page_
@@ -106,31 +106,45 @@ An item is a page plus a date. All the properties of a page can be used (see

_{{.Date}}_ is the date of the last update to the page, in RFC 822 format.

## List
In order to paginate feeds, the following attributes are also available in the
feed:

The list contains a directory name and an array of files.
_{{.From}}_ is the item number where the feed starts. The first page starts at
0. This can be passed to Oddmu via the query parameter _from_.

_{{.Dir}}_ is the directory name that is being listed, percent-encoded.
_{{.N}}_ is the number of items per page. The default is 10. This can be passed to
Oddmu via the query parameter _n_. If this is set to 0, the feed is not
paginated.

_{{.Files}}_ is the array of files. To refer to them, you need to use a _{{range
.Files}}_ … _{{end}}_ construct.
_{{.Complete}}_ is a boolean that is true if the feed is not paginated. Such a
feed cannot have a previous or next page.

Each file has the following attributes:
_{{.Prev}}_ is the item number where the previous page of the feed starts. On
the first page, its value is 0 instead of -10. You need to test if _{{.From}}_
is non-zero (in which case this is not the first page) before using _{{.Prev}}_.

_{{.Name}}_ is the filename. The ".md" suffix for Markdown files is part of the
name (unlike page names).
_{{.Next}}_ is the item number where the next feed starts, if there are any
items left. If there are none, its value is 0.
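
The pagination arithmetic behind these attributes can be sketched as follows (prevNext is a hypothetical helper, not Oddmu's actual code):

```go
package main

import "fmt"

// prevNext derives the Prev and Next offsets from From (the current
// offset), N (items per page) and the total item count. Prev is
// clamped to 0 on the first page; Next stays 0 when no items are left.
func prevNext(from, n, total int) (prev, next int) {
	prev = from - n
	if prev < 0 {
		prev = 0
	}
	if from+n < total {
		next = from + n
	}
	return prev, next
}

func main() {
	fmt.Println(prevNext(0, 10, 25))  // 0 10
	fmt.Println(prevNext(20, 10, 25)) // 10 0
}
```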

_{{.Path}}_ is the page name, percent-encoded.
_{{.PrevYear}}_ is the year for the previous yearly archive. This is added on
the index page or on year pages. Year pages are pages whose name is just a
number (presumably a year). The property is only set on the first page of the
feed, if the previous year page exists. The previous year is one higher than the
year currently shown (if on a year page) or the current year (if looking at the
index), since the feed goes backwards in time as new entries appear at the top.
When looking at the page "2024" the previous page is "2025". Strangely enough,
if the current year is 2026 but a page "2027" already exists, and the feed for
the index page is generated, then "2027" (in the future) is the previous page.
If the current year is 2026, the feed of the index page points to "2025" as the
next year, if it exists. When the feed for "2025" is generated, however, the
previous year is not set, assuming that the "2026" page does not yet exist and
it is strange to consider the index page "the previous year" of "2025" in 2026.
This might change in the future. If it isn't set, its value is 0.

_{{.Title}}_ is the page title, if the file in question is a Markdown file.

_{{.IsDir}}_ is a boolean used to indicate that this file is a directory.

_{{.IsUp}}_ is a boolean used to indicate the entry for the parent directory
(the first file in the array, unless the directory being listed is the top
directory). The filename of this file is "..".

_{{.Date}}_ is the last modification date of the file.
_{{.NextYear}}_ is the year for the next yearly archive. See above for an
explanation. The next year is one lower than the year currently shown (if on a
year page) or the current year (if looking at the index). If it isn't set, its
value is 0.
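
Because the feed runs backwards in time, the year navigation is inverted; a minimal sketch (yearLinks is a hypothetical helper, and as described above the real behaviour additionally checks that the target year pages exist, leaving the value at 0 otherwise):

```go
package main

import "fmt"

// yearLinks: the previous archive page is one year later, the next
// archive page one year earlier than the year currently shown.
func yearLinks(shown int) (prevYear, nextYear int) {
	return shown + 1, shown - 1
}

func main() {
	fmt.Println(yearLinks(2024)) // 2025 2023
}
```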

## Search

@@ -170,6 +184,16 @@ _{{.Name}}_ is the file name for use in URLs.
_{{.Html}}_ is the image alt-text with a bold tag used to highlight the first
search term that matched.

## Sitemap

The sitemap contains a list of URLs, each with its location:

_{{.URL}}_ is the list of URLs.

Each URL has the following attributes:

_{{.Loc}}_ is the actual page URL.

## Upload

_{{.Dir}}_ is the directory where the uploaded file ends up, based on the URL

96
man/oddmu.1
@@ -5,7 +5,7 @@
.nh
.ad l
.\" Begin generated content:
.TH "ODDMU" "1" "2025-08-09"
.TH "ODDMU" "1" "2026-02-11"
.PP
.SH NAME
.PP
@@ -32,10 +32,11 @@ create, most likely.\&
.PP
See \fIoddmu\fR(5) for details about the page formatting.\&
.PP
If you request a page that doesn'\&t exist, Oddmu tries to find a matching
Markdown file by appending the extension ".\&md" to the page name.\& In the example
above, the page name requested is "index" and the file name Oddmu tries to read
is "index.\&md".\& If no such file exists, Oddmu offers to create the page.\&
If you request a file that exists, like "index.\&md", Oddmu serves it as-is.\& If
you request a file that doesn'\&t exist, like "index", Oddmu checks if a matching
Markdown file exists by appending the extension ".\&md".\& If such a file is found,
it is turned into HTML and shown.\& If no such file exists, Oddmu offers to
create the page.\&
.PP
If your files don'\&t provide their own title ("# title"), the file name (without
".\&md") is used for the page title.\&
@@ -79,6 +80,8 @@ directory:
.IP \(bu 4
\fI/search/dir/?\&q=term\fR to search for a term
.IP \(bu 4
\fI/sitemap.\&xml\fR to list the links to all the pages
.IP \(bu 4
\fI/archive/dir/name.\&zip\fR to download a zip file of a directory
.PD
.PP
@@ -96,9 +99,9 @@ curl --form body="Did you bring a towel?"
When calling the \fIdrop\fR action, the query parameters used are \fIname\fR for the
target filename and \fIfile\fR for the file to upload.\& If the query parameter
\fImaxwidth\fR is set, an attempt is made to decode and resize the image.\& JPG, PNG,
WEBP and HEIC files can be decoded.\& Only JPG and PNG files can be encoded,
however.\& If the target name ends in \fI.\&jpg\fR, the \fIquality\fR query parameter is
also taken into account.\& To upload some thumbnails:
WEBP and HEIC files can be decoded.\& Only JPG, PNG and WEBP files can be encoded,
however.\& If the target name ends in \fI.\&jpg\fR or \fI.\&png\fR, the \fIquality\fR query
parameter is also taken into account.\& To upload some thumbnails:
.PP
.nf
.RS 4
@@ -119,12 +122,13 @@ curl \&'http://localhost:8080/search/?q=towel\&'
.RE
.PP
The page name to act upon is optionally taken from the query parameter \fIid\fR.\& In
this case, the directory must also be part of the query parameter and not of the
URL path.\&
this case, the directory must still be part of the path and may not be part of
the \fIid\fR.\& This is enforced so that the path can be used by a webserver for
access control.\&
.PP
.nf
.RS 4
curl \&'http://localhost:8080/view/?id=man/oddmu\&.1\&.txt\&'
curl \&'http://localhost:8080/view/man/?id=oddmu\&.1\&.txt\&'
.fi
.RE
.PP
@@ -140,8 +144,9 @@ curl --remote-name \&'http://localhost:8080/archive/man/man\&.zip
.PP
.SH CONFIGURATION
.PP
The template files are the HTML files in the working directory.\& Please change
these templates!\&
The template files are the HTML files in the working directory.\& If they are
missing, the default files are written to disk as soon as they are required.\&
Please change these templates!\&
.PP
The first change you should make is to replace the name and email address in the
footer of \fIview.\&html\fR.\& Look for "Your Name" and "example.\&org".\&
@@ -300,55 +305,62 @@ current date of the machine Oddmu is running on is used.\& If a link already
exists on the changes page, it is moved up to the current date.\& If that leaves
an old date without any links, that date heading is removed.\&
.PP
If you want to link to the changes page, you need to do this yourself.\& Add a
link from the index, for example.\& The "view.\&html" template currently doesn'\&t do
it.\& See \fIoddmu-templates\fR(5) if you want to add the link to the template.\&
.PP
A page whose name starts with an ISO date (YYYY-MM-DD, e.\&g.\& "2023-10-28") is
called a \fBblog\fR page.\& When creating or editing blog pages, links to them are added
from other pages.\&
from other pages as follows:
.PP
.PD 0
.IP \(bu 4
If the blog page name starts with the current year, a link is created from the
index page back to the blog page being created or edited.\& Again, you can prevent
this from happening by deselecting the checkbox "Add link to the list of
changes.\&" The index page can be edited like every other page, so it'\&s easy to
undo mistakes.\&
index page back to the blog page being created or edited.\& Again, you can
prevent this from happening by deselecting the checkbox "Add link to the list
of changes.\&" The index page can be edited like every other page, so it'\&s easy
to undo mistakes.\&
.PD
.PP
.PD 0
.IP \(bu 4
For every \fBhashtag\fR used, another link might be created.\& If a page named like
the hashtag exists, a backlink is added to it, linking to the new or edited blog
page.\&
the hashtag exists, a backlink is added to it, linking to the new or edited
blog page.\&
.PD
.PP
.PD 0
.IP \(bu 4
If a link to the new or edited blog page already exists but its title is no
longer correct, it is updated.\&
.PD
.PP
New links added for blog pages are added at the top of the first unnumbered list
using the asterisk ('\&*'\&).\& If no such list exists, a new one is started at the
bottom of the page.\& This allows you to have a different unnumbered list further
up on the page, as long as it uses the minus for items ('\&-'\&).\&
.PP
Changes made locally do not create any links on the changes page, the index page
or on any hashtag pages.\& See \fIoddmu-notify\fR(1) for a way to add the necessary
links to the changes page and possibly to the index and hashtag pages.\&
Changes made locally to the source files (using an editor) do not create any
links on the changes page, the index page or on any hashtag pages.\& See
\fIoddmu-notify\fR(1) for a way to add the necessary links to the changes page and
possibly to the index and hashtag pages.\&
.PP
A hashtag consists of a number sign ('\&#'\&) followed by Unicode letters, numbers
or the underscore ('\&_'\&).\& Thus, a hashtag ends with punctuation or whitespace.\&
.PP
The page names, titles and hashtags are loaded into memory when the server
starts.\& If you have a lot of pages, this takes a lot of memory.\&
starts.\& If you have a lot of pages, this takes a lot of memory.\& Oddmu watches
the working directory and any subdirectories for changes made to page files and
updates this cache when necessary.\&
.PP
Oddmu watches the working directory and any subdirectories for changes made
directly.\& Thus, in theory, it'\&s not necessary to restart it after making such
changes.\&
.PP
You cannot edit uploaded files.\& If you upload a file called "hello.\&txt" and
attempt to edit it by using "/edit/hello.\&txt" you create a page with the name
"hello.\&txt.\&md" instead.\&
Uploaded files cannot be edited unless they end with ".\&md".\& If you upload a file
called "hello.\&txt" and attempt to edit it by using "/edit/hello.\&txt" you create
a page with the name "hello.\&txt.\&md" instead.\&
.PP
In order to delete uploaded files via the web, create an empty file and upload
it.\& In order to delete a wiki page, save an empty page.\&
.PP
Note that some HTML file names are special: they act as templates.\& See
\fIoddmu-templates\fR(5) for their names and their use.\&
\fIoddmu-templates\fR(5) for their names and their use.\& Oddmu watches the working
directory and any subdirectories for changes made to template files and reloads
them.\& There is no need to restart the server after making changes to the
templates.\&
.PP
.SH SEE ALSO
.PP
@@ -388,6 +400,8 @@ Oddmu running as a webserver:
.IP \(bu 4
\fIoddmu-html\fR(1), on how to render a page
.IP \(bu 4
\fIoddmu-feed\fR(1), on how to render a feed
.IP \(bu 4
\fIoddmu-list\fR(1), on how to list pages and titles
.IP \(bu 4
\fIoddmu-links\fR(1), on how to list the outgoing links for a page
@@ -400,6 +414,8 @@ Oddmu running as a webserver:
.IP \(bu 4
\fIoddmu-search\fR(1), on how to run a search
.IP \(bu 4
\fIoddmu-sitemap\fR(1), on generating a static sitemap.\&xml
.IP \(bu 4
\fIoddmu-static\fR(1), on generating a static site
.IP \(bu 4
\fIoddmu-toc\fR(1), on how to list the table of contents (toc) of a page
@@ -414,6 +430,14 @@ If you want to stop using Oddmu:
\fIoddmu-export\fR(1), on how to export all the files as one big RSS file
.PD
.PP
And finally, if you don'\&t have the man pages, you can still read the original
documents:
.PP
.PD 0
.IP \(bu 4
\fIoddmu-man\fR(1), to get help even if you don'\&t have the man pages installed
.PD
.PP
.SH AUTHORS
.PP
Maintained by Alex Schroeder <alex@gnu.\&org>.\&

@@ -25,10 +25,11 @@ create, most likely.

See _oddmu_(5) for details about the page formatting.

If you request a page that doesn't exist, Oddmu tries to find a matching
Markdown file by appending the extension ".md" to the page name. In the example
above, the page name requested is "index" and the file name Oddmu tries to read
is "index.md". If no such file exists, Oddmu offers to create the page.
If you request a file that exists, like "index.md", Oddmu serves it as-is. If
you request a file that doesn't exist, like "index", Oddmu checks if a matching
Markdown file exists by appending the extension ".md". If such a file is found,
it is turned into HTML and shown. If no such file exists, Oddmu offers to
create the page.

If your files don't provide their own title ("# title"), the file name (without
".md") is used for the page title.
@@ -56,6 +57,7 @@ directory:
- _/upload/dir/name_ shows a form to upload a file
- _/drop/dir/name_ saves an upload
- _/search/dir/?q=term_ to search for a term
- _/sitemap.xml_ to list the links to all the pages
- _/archive/dir/name.zip_ to download a zip file of a directory

When calling the _save_ and _append_ action, the page name is taken from the URL
@@ -70,9 +72,9 @@ curl --form body="Did you bring a towel?" \
When calling the _drop_ action, the query parameters used are _name_ for the
target filename and _file_ for the file to upload. If the query parameter
_maxwidth_ is set, an attempt is made to decode and resize the image. JPG, PNG,
WEBP and HEIC files can be decoded. Only JPG and PNG files can be encoded,
however. If the target name ends in _.jpg_, the _quality_ query parameter is
also taken into account. To upload some thumbnails:
WEBP and HEIC files can be decoded. Only JPG, PNG and WEBP files can be encoded,
however. If the target name ends in _.jpg_ or _.png_, the _quality_ query
parameter is also taken into account. To upload some thumbnails:

```
for f in *.jpg; do
@@ -89,11 +91,12 @@ curl 'http://localhost:8080/search/?q=towel'
```

The page name to act upon is optionally taken from the query parameter _id_. In
this case, the directory must also be part of the query parameter and not of the
URL path.
this case, the directory must still be part of the path and may not be part of
the _id_. This is enforced so that the path can be used by a webserver for
access control.

```
curl 'http://localhost:8080/view/?id=man/oddmu.1.txt'
curl 'http://localhost:8080/view/man/?id=oddmu.1.txt'
```

The base name for the _archive_ action is used by the browser to save the
@@ -106,8 +109,9 @@ curl --remote-name 'http://localhost:8080/archive/man/man.zip

# CONFIGURATION

The template files are the HTML files in the working directory. Please change
these templates!
The template files are the HTML files in the working directory. If they are
missing, the default files are written to disk as soon as they are required.
Please change these templates!

The first change you should make is to replace the name and email address in the
footer of _view.html_. Look for "Your Name" and "example.org".
@@ -246,55 +250,53 @@ current date of the machine Oddmu is running on is used. If a link already
exists on the changes page, it is moved up to the current date. If that leaves
an old date without any links, that date heading is removed.

If you want to link to the changes page, you need to do this yourself. Add a
link from the index, for example. The "view.html" template currently doesn't do
it. See _oddmu-templates_(5) if you want to add the link to the template.

A page whose name starts with an ISO date (YYYY-MM-DD, e.g. "2023-10-28") is
called a *blog* page. When creating or editing blog pages, links to them are added
from other pages.
from other pages as follows:

If the blog page name starts with the current year, a link is created from the
index page back to the blog page being created or edited. Again, you can prevent
this from happening by deselecting the checkbox "Add link to the list of
changes." The index page can be edited like every other page, so it's easy to
undo mistakes.
- If the blog page name starts with the current year, a link is created from the
  index page back to the blog page being created or edited. Again, you can
  prevent this from happening by deselecting the checkbox "Add link to the list
  of changes." The index page can be edited like every other page, so it's easy
  to undo mistakes.

For every *hashtag* used, another link might be created. If a page named like
the hashtag exists, a backlink is added to it, linking to the new or edited blog
page.
- For every *hashtag* used, another link might be created. If a page named like
  the hashtag exists, a backlink is added to it, linking to the new or edited
  blog page.

If a link to the new or edited blog page already exists but its title is no
longer correct, it is updated.
- If a link to the new or edited blog page already exists but its title is no
  longer correct, it is updated.

New links added for blog pages are added at the top of the first unnumbered list
using the asterisk ('\*'). If no such list exists, a new one is started at the
bottom of the page. This allows you to have a different unnumbered list further
up on the page, as long as it uses the minus for items ('-').

Changes made locally do not create any links on the changes page, the index page
or on any hashtag pages. See _oddmu-notify_(1) for a way to add the necessary
links to the changes page and possibly to the index and hashtag pages.
Changes made locally to the source files (using an editor) do not create any
links on the changes page, the index page or on any hashtag pages. See
_oddmu-notify_(1) for a way to add the necessary links to the changes page and
possibly to the index and hashtag pages.

A hashtag consists of a number sign ('#') followed by Unicode letters, numbers
or the underscore ('\_'). Thus, a hashtag ends with punctuation or whitespace.
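
That character class translates directly into a Go regular expression; a sketch, assuming the Unicode classes \p{L} and \p{N} cover "letters and numbers" as meant above:

```go
package main

import (
	"fmt"
	"regexp"
)

// A number sign followed by Unicode letters, numbers or underscores;
// any punctuation or whitespace ends the hashtag.
var hashtag = regexp.MustCompile(`#[\p{L}\p{N}_]+`)

func main() {
	fmt.Println(hashtag.FindAllString("Fixed the #Säntis page, see #wiki_2024.", -1))
	// [#Säntis #wiki_2024]
}
```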

The page names, titles and hashtags are loaded into memory when the server
starts. If you have a lot of pages, this takes a lot of memory.
starts. If you have a lot of pages, this takes a lot of memory. Oddmu watches
the working directory and any subdirectories for changes made to page files and
updates this cache when necessary.

Oddmu watches the working directory and any subdirectories for changes made
directly. Thus, in theory, it's not necessary to restart it after making such
changes.

You cannot edit uploaded files. If you upload a file called "hello.txt" and
attempt to edit it by using "/edit/hello.txt" you create a page with the name
"hello.txt.md" instead.
Uploaded files cannot be edited unless they end with ".md". If you upload a file
called "hello.txt" and attempt to edit it by using "/edit/hello.txt" you create
a page with the name "hello.txt.md" instead.

In order to delete uploaded files via the web, create an empty file and upload
it. In order to delete a wiki page, save an empty page.

Note that some HTML file names are special: they act as templates. See
_oddmu-templates_(5) for their names and their use.
_oddmu-templates_(5) for their names and their use. Oddmu watches the working
directory and any subdirectories for changes made to template files and reloads
them. There is no need to restart the server after making changes to the
templates.

# SEE ALSO

@@ -317,12 +319,14 @@ Oddmu running as a webserver:

- _oddmu-hashtags_(1), on working with hashtags
- _oddmu-html_(1), on how to render a page
- _oddmu-feed_(1), on how to render a feed
- _oddmu-list_(1), on how to list pages and titles
- _oddmu-links_(1), on how to list the outgoing links for a page
- _oddmu-missing_(1), on how to find broken local links
- _oddmu-notify_(1), on updating index, changes and hashtag pages
- _oddmu-replace_(1), on how to search and replace text
- _oddmu-search_(1), on how to run a search
- _oddmu-sitemap_(1), on generating a static sitemap.xml
- _oddmu-static_(1), on generating a static site
- _oddmu-toc_(1), on how to list the table of contents (toc) of a page
- _oddmu-version_(1), on how to get all the build information from the binary
@@ -331,6 +335,11 @@ If you want to stop using Oddmu:

- _oddmu-export_(1), on how to export all the files as one big RSS file

And finally, if you don't have the man pages, you can still read the original
documents:

- _oddmu-man_(1), to get help even if you don't have the man pages installed

# AUTHORS

Maintained by Alex Schroeder <alex@gnu.org>.

81
man_cmd.go
Normal file
@@ -0,0 +1,81 @@
package main

import (
	"context"
	"embed"
	"flag"
	"fmt"
	"io"
	"os"
	"slices"
	"strings"

	"github.com/google/subcommands"
)

type manCmd struct {
}

func (cmd *manCmd) SetFlags(f *flag.FlagSet) {
}

func (*manCmd) Name() string { return "man" }
func (*manCmd) Synopsis() string { return "show a manual page" }
func (*manCmd) Usage() string {
	return `man <topic>:
  Print a manual page on a topic. If no topic is given, the
  available topics are listed. Substrings are possible.
`
}

// A filesystem with a read-only copy of the man pages at build time.
//
//go:embed man/*.txt
var manFiles embed.FS

func (cmd *manCmd) Execute(_ context.Context, f *flag.FlagSet, _ ...interface{}) subcommands.ExitStatus {
	topic := strings.Join(f.Args(), " ")
	return manCli(os.Stdout, topic)
}

func manCli(w io.Writer, topic string) subcommands.ExitStatus {
	entries, err := manFiles.ReadDir("man")
	if err != nil {
		fmt.Fprintln(w, "An error in the build resulted in unreadable manual pages:", err)
		return subcommands.ExitFailure
	}
	if topic == "" {
		fmt.Fprintln(w, "Topics:")
		names := []string{}
		for _, entry := range entries {
			names = append(names, entry.Name())
		}
		slices.Sort(names)
		for _, name := range names {
			fmt.Fprintln(w, name)
		}
		return subcommands.ExitSuccess
	}
	var candidate string
	for _, entry := range entries {
		name := entry.Name()
		if strings.Contains(name, topic) {
			if candidate != "" {
				fmt.Fprintf(w, "The topic '%s' matches both %s and %s, maybe more.\n", topic, candidate, name)
				fmt.Fprintln(w, "Please be more specific.")
				return subcommands.ExitFailure
			}
			candidate = name
		}
	}
	if candidate == "" {
		fmt.Fprintf(w, "No manual page matching topic '%s' found\n", topic)
		return subcommands.ExitFailure
	}
	b, err := manFiles.ReadFile("man/" + candidate)
	if err != nil {
		return subcommands.ExitFailure
	}
	w.Write(b)
	return subcommands.ExitSuccess
}
64	man_test.go
@@ -1,7 +1,6 @@
package main

import (
	"github.com/stretchr/testify/assert"
	"go/parser"
	"go/token"
	"io/fs"
@@ -12,6 +11,8 @@ import (
	"sort"
	"strings"
	"testing"

	"github.com/stretchr/testify/assert"
)

// Does oddmu(1) link to all the other man pages?
@@ -60,6 +61,40 @@ func TestManTemplates(t *testing.T) {
	assert.Greater(t, count, 0, "no templates were found")
}

// Does oddmu-templates(5) mention all the template attributes?
func TestManTemplateAttributes(t *testing.T) {
	mfp := "man/oddmu-templates.5.txt"
	b, err := os.ReadFile(mfp)
	man := string(b)
	assert.NoError(t, err)
	re := regexp.MustCompile(`{{(?:(?:if|range) )?(\.[A-Z][a-z]*)}}`)
	filepath.Walk(".", func(fp string, info fs.FileInfo, err error) error {
		if err != nil {
			return err
		}
		if fp != "." && info.IsDir() {
			return filepath.SkipDir
		}
		if !strings.HasSuffix(fp, ".html") {
			return nil
		}
		h, err := os.ReadFile(fp)
		matches := re.FindAllSubmatch(h, -1)
		assert.Greater(t, len(matches), 0, "%s contains no attributes", fp)
		seen := make(map[string]bool)
		for _, m := range matches {
			attr := string(m[1])
			if seen[attr] {
				continue
			}
			seen[attr] = true
			assert.Contains(t, man, "_{{"+attr+"}}_", "%s does not mention _{{%s}}_", mfp, attr)
		}
		assert.NoError(t, err)
		return nil
	})
}

// Does oddmu(1) mention all the actions? We're not going to parse the go file and make sure to catch them all. I tried
// it, and it's convoluted.
func TestManActions(t *testing.T) {
@@ -71,7 +106,7 @@ func TestManActions(t *testing.T) {
	wiki := string(b)
	count := 0
	// this doesn't match the root handler
	re := regexp.MustCompile(`\.HandleFunc\("(/[a-z]+/)", makeHandler\([a-z]+Handler, (true|false)(, http\.Method(Get|Post))+\)\)`)
	re := regexp.MustCompile(`mux\.HandleFunc\("(/[a-z]+/)", makeHandler\([a-z]+Handler, (true|false)(, http\.Method(Get|Post))+\)\)`)
	for _, match := range re.FindAllStringSubmatch(wiki, -1) {
		count++
		var path string
@@ -87,6 +122,27 @@ func TestManActions(t *testing.T) {
	assert.Contains(t, main, "\n- _/_", "root")
}

// Does oddmu(1) mention all the commands?
func TestManCommands(t *testing.T) {
	b, err := os.ReadFile("man/oddmu.1.txt")
	assert.NoError(t, err)
	main := string(b)
	b, err = os.ReadFile("wiki.go")
	assert.NoError(t, err)
	wiki := string(b)
	count := 0
	re := regexp.MustCompile(`subcommands\.Register\(&([a-z]+)Cmd`)
	for _, match := range re.FindAllStringSubmatch(wiki, -1) {
		count++
		command := match[1]
		ref := "_oddmu-" + command + "_(1)"
		assert.Contains(t, main, ref, "link to the '%s' command", command)
	}
	assert.Greater(t, count, 0, "no commands were found")
	// root handler is manual
	assert.Contains(t, main, "\n- _/_", "root")
}

// Does the README link to all the man pages and all the Go source files,
// excluding the command and test files?
func TestReadme(t *testing.T) {
@@ -116,7 +172,9 @@ func TestReadme(t *testing.T) {
	}
	if strings.HasSuffix(fp, ".go") &&
		!strings.HasSuffix(fp, "_test.go") &&
		!strings.HasSuffix(fp, "_cmd.go") {
		!strings.HasSuffix(fp, "_cmd.go") &&
		!strings.HasSuffix(fp, "_common.go") &&
		!strings.HasSuffix(fp, "_nowebp.go") {
		count++
		s := strings.TrimPrefix(fp, "./")
		ref := "`" + s + "`"
@@ -4,14 +4,15 @@ import (
	"context"
	"flag"
	"fmt"
	"github.com/gomarkdown/markdown"
	"github.com/gomarkdown/markdown/ast"
	"github.com/google/subcommands"
	"io"
	"net/url"
	"os"
	"path"
	"strings"

	"github.com/gomarkdown/markdown"
	"github.com/gomarkdown/markdown/ast"
	"github.com/google/subcommands"
)

type missingCmd struct {
@@ -32,6 +33,12 @@ func (cmd *missingCmd) SetFlags(f *flag.FlagSet) {
}

func (cmd *missingCmd) Execute(_ context.Context, f *flag.FlagSet, _ ...interface{}) subcommands.ExitStatus {
	n, err := index.load()
	if err != nil {
		fmt.Fprintf(os.Stderr, "Index load: %s\n", err)
		return subcommands.ExitFailure
	}
	fmt.Fprintf(os.Stderr, "Indexed %d pages\n", n)
	return missingCli(os.Stdout, &index)
}

@@ -94,8 +101,7 @@ func (p *Page) links() []string {
	doc := markdown.Parse(p.Body, parser)
	ast.WalkFunc(doc, func(node ast.Node, entering bool) ast.WalkStatus {
		if entering {
			switch v := node.(type) {
			case *ast.Link:
			if v, ok := node.(*ast.Link); ok {
				link := string(v.Destination)
				url, err := url.Parse(link)
				if err != nil {
@@ -2,9 +2,10 @@ package main

import (
	"bytes"
	"testing"

	"github.com/google/subcommands"
	"github.com/stretchr/testify/assert"
	"testing"
)

func TestMissingCmd(t *testing.T) {
@@ -4,10 +4,11 @@ import (
	"context"
	"flag"
	"fmt"
	"github.com/google/subcommands"
	"io"
	"os"
	"strings"

	"github.com/google/subcommands"
)

type notifyCmd struct {
@@ -31,13 +32,12 @@ func (cmd *notifyCmd) Execute(_ context.Context, f *flag.FlagSet, _ ...interface
}

func notifyCli(w io.Writer, args []string) subcommands.ExitStatus {
	index.load()
	for _, name := range args {
		if !strings.HasSuffix(name, ".md") {
			fmt.Fprintf(os.Stderr, "%s does not end in '.md'\n", name)
			return subcommands.ExitFailure
		}
		name = name[0:len(name)-3]
		name = name[0 : len(name)-3]
		p, err := loadPage(name)
		if err != nil {
			fmt.Fprintf(w, "Loading %s: %s\n", name, err)
30	page.go
@@ -2,7 +2,6 @@ package main

import (
	"bytes"
	"github.com/microcosm-cc/bluemonday"
	"html/template"
	"log"
	"net/url"
@@ -12,6 +11,8 @@ import (
	"regexp"
	"strings"
	"time"

	"github.com/microcosm-cc/bluemonday"
)

// Page is a struct containing information about a single page. Title is the title extracted from the page content using
@@ -21,7 +22,7 @@ type Page struct {
	Title    string
	Name     string
	Body     []byte
	Html     template.HTML
	HTML     template.HTML
	Hashtags []string
}

@@ -29,7 +30,7 @@ type Page struct {
// the Name "foo").
type Link struct {
	Title string
	Url   string
	URL   string
}

// blogRe is a regular expression that matches blog pages. If the filename of a blog page starts with an ISO date
@@ -61,7 +62,7 @@ func nameEscape(s string) string {
}

// save saves a Page. The path is based on the Page.Name and gets the ".md" extension. Page.Body is saved, without any
// carriage return characters ("\r"). Page.Title and Page.Html are not saved. There is no caching. Before removing or
// carriage return characters ("\r"). Page.Title and Page.HTML are not saved. There is no caching. Before removing or
// writing a file, the old copy is renamed to a backup, appending "~". Errors are not logged but returned.
func (p *Page) save() error {
	fp := filepath.FromSlash(p.Name) + ".md"
@@ -88,6 +89,17 @@ func (p *Page) save() error {
	return os.WriteFile(fp, s, 0644)
}

// ModTime returns the last modification time of the page file. If the page does not exist, the current time is
// returned.
func (p *Page) ModTime() (time.Time, error) {
	fp := filepath.FromSlash(p.Name) + ".md"
	fi, err := os.Stat(fp)
	if err != nil {
		return time.Now(), err
	}
	return fi.ModTime(), nil
}

// backup a file by renaming it unless the existing backup is less than an hour old. A backup gets a tilde appended to
// it ("~"). This is true even if the file refers to a binary file like "image.png" and most applications don't know
// what to do with a file called "image.png~". This expects a filepath. The backup file gets its modification time set
@@ -111,7 +123,7 @@ func backup(fp string) error {
}

// loadPage loads a Page given a name. The path loaded is that Page.Name with the ".md" extension. The Page.Title is set
// to the Page.Name (and possibly changed, later). The Page.Body is set to the file content. The Page.Html remains
// to the Page.Name (and possibly changed, later). The Page.Body is set to the file content. The Page.HTML remains
// undefined (there is no caching).
func loadPage(name string) (*Page, error) {
	name = strings.TrimPrefix(name, "./") // result of a path.TreeWalk starting with "."
@@ -136,10 +148,10 @@ func (p *Page) handleTitle(replace bool) {
	}
}

// summarize sets Page.Html to an extract.
// summarize sets Page.HTML to an extract.
func (p *Page) summarize(q string) {
	t := p.plainText()
	p.Html = sanitizeStrict(snippets(q, t))
	p.HTML = sanitizeStrict(snippets(q, t))
}

// IsBlog returns true if the page name starts with an ISO date
@@ -163,7 +175,7 @@ func pathEncode(s string) string {
	if n == 0 {
		return s
	}
	t := make([]byte, len(s) + 2*n)
	t := make([]byte, len(s)+2*n)
	j := 0
	for i := 0; i < len(s); i++ {
		switch s[i] {
@@ -223,7 +235,7 @@ func (p *Page) Parents() []*Link {
	if !ok {
		title = "…"
	}
	link := &Link{Title: title, Url: strings.Repeat("../", len(elems)-i-1) + "index"}
	link := &Link{Title: title, URL: strings.Repeat("../", len(elems)-i-1) + "index"}
	links = append(links, link)
	s += elems[i] + "/"
}
11	page_test.go
@@ -1,9 +1,10 @@
package main

import (
	"github.com/stretchr/testify/assert"
	"regexp"
	"testing"

	"github.com/stretchr/testify/assert"
)

func TestPageTitle(t *testing.T) {
@@ -58,12 +59,12 @@ And untouchable`)}
	// "testdata/parents/children/something/index" is a sibling and doesn't count!
	parents := p.Parents()
	assert.Equal(t, "Welcome to Oddμ", parents[0].Title)
	assert.Equal(t, "../../../../index", parents[0].Url)
	assert.Equal(t, "../../../../index", parents[0].URL)
	assert.Equal(t, "…", parents[1].Title)
	assert.Equal(t, "../../../index", parents[1].Url)
	assert.Equal(t, "../../../index", parents[1].URL)
	assert.Equal(t, "Solar", parents[2].Title)
	assert.Equal(t, "../../index", parents[2].Url)
	assert.Equal(t, "../../index", parents[2].URL)
	assert.Equal(t, "Lunar", parents[3].Title)
	assert.Equal(t, "../index", parents[3].Url)
	assert.Equal(t, "../index", parents[3].URL)
	assert.Equal(t, 4, len(parents))
}
17	parser.go
@@ -2,12 +2,13 @@ package main

import (
	"bytes"
	"net/url"
	"path"

	"github.com/gomarkdown/markdown"
	"github.com/gomarkdown/markdown/ast"
	"github.com/gomarkdown/markdown/html"
	"github.com/gomarkdown/markdown/parser"
	"net/url"
	"path"
)

// wikiLink returns an inline parser function. This indirection is
@@ -89,12 +90,12 @@ func wikiRenderer() *html.Renderer {
	return renderer
}

// renderHtml renders the Page.Body to HTML and sets Page.Html, Page.Hashtags, and escapes Page.Name.
func (p *Page) renderHtml() {
// renderHTML renders the Page.Body to HTML and sets Page.HTML, Page.Hashtags, and escapes Page.Name.
func (p *Page) renderHTML() {
	parser, hashtags := wikiParser()
	renderer := wikiRenderer()
	maybeUnsafeHTML := markdown.ToHTML(p.Body, parser, renderer)
	p.Html = unsafeBytes(maybeUnsafeHTML)
	p.HTML = unsafeBytes(maybeUnsafeHTML)
	p.Hashtags = *hashtags
}

@@ -133,8 +134,7 @@ func (p *Page) images() []ImageData {
	doc := markdown.Parse(p.Body, parser)
	ast.WalkFunc(doc, func(node ast.Node, entering bool) ast.WalkStatus {
		if entering {
			switch v := node.(type) {
			case *ast.Image:
			if v, ok := node.(*ast.Image); ok {
				// not an absolute URL, not a full URL, not a mailto: URI
				text := toString(v)
				if len(text) > 0 {
@@ -164,8 +164,7 @@ func toString(node ast.Node) string {
	b := new(bytes.Buffer)
	ast.WalkFunc(node, func(node ast.Node, entering bool) ast.WalkStatus {
		if entering {
			switch v := node.(type) {
			case *ast.Text:
			if v, ok := node.(*ast.Text); ok {
				b.Write(v.Literal)
			}
		}
@@ -1,8 +1,9 @@
package main

import (
	"github.com/stretchr/testify/assert"
	"testing"

	"github.com/stretchr/testify/assert"
)

func TestPagePlainText(t *testing.T) {
@@ -19,14 +20,14 @@ func TestPageHtml(t *testing.T) {
Silver leaves shine bright
They droop, boneless, weak and sad
A cruel sun stares down`)}
	p.renderHtml()
	p.renderHTML()
	r := `<h1 id="sun">Sun</h1>

<p>Silver leaves shine bright
They droop, boneless, weak and sad
A cruel sun stares down</p>
`
	assert.Equal(t, r, string(p.Html))
	assert.Equal(t, r, string(p.HTML))
}

func TestPageHtmlHashtag(t *testing.T) {
@@ -36,7 +37,7 @@ Too faint to focus, so far
I am cold, alone

#Haiku #Cold_Poets`)}
	p.renderHtml()
	p.renderHTML()
	r := `<h1 id="comet">Comet</h1>

<p>Stars flicker above
@@ -45,7 +46,7 @@ I am cold, alone</p>

<p><a class="tag" href="/search/?q=%23Haiku">#Haiku</a> <a class="tag" href="/search/?q=%23Cold_Poets">#Cold Poets</a></p>
`
	assert.Equal(t, r, string(p.Html))
	assert.Equal(t, r, string(p.HTML))
}

func TestPageHtmlHashtagCornerCases(t *testing.T) {
@@ -53,13 +54,13 @@ func TestPageHtmlHashtagCornerCases(t *testing.T) {

ok # #o #ok
[oh #ok \#nok](ok)`)}
	p.renderHtml()
	p.renderHTML()
	r := `<p>#</p>

<p>ok # <a class="tag" href="/search/?q=%23o">#o</a> <a class="tag" href="/search/?q=%23ok">#ok</a>
<a href="ok">oh #ok #nok</a></p>
`
	assert.Equal(t, r, string(p.Html))
	assert.Equal(t, r, string(p.HTML))
}

func TestPageHtmlWikiLink(t *testing.T) {
@@ -67,14 +68,14 @@ func TestPageHtmlWikiLink(t *testing.T) {
Blue and green and black
Sky and grass and [ragged cliffs](cliffs)
Our [[time together]]`)}
	p.renderHtml()
	p.renderHTML()
	r := `<h1 id="photos-and-books">Photos and Books</h1>

<p>Blue and green and black
Sky and grass and <a href="cliffs">ragged cliffs</a>
Our <a href="time%20together">time together</a></p>
`
	assert.Equal(t, r, string(p.Html))
	assert.Equal(t, r, string(p.HTML))
}

func TestPageHtmlDollar(t *testing.T) {
@@ -82,34 +83,34 @@ func TestPageHtmlDollar(t *testing.T) {
Dragonfly hovers
darts chases turns lands and rests
A mighty jewel`)}
	p.renderHtml()
	p.renderHTML()
	r := `<h1 id="no-dollar-can-buy-this">No $dollar$ can buy this</h1>

<p>Dragonfly hovers
darts chases turns lands and rests
A mighty jewel</p>
`
	assert.Equal(t, r, string(p.Html))
	assert.Equal(t, r, string(p.HTML))
}

func TestLazyLoadImages(t *testing.T) {
	p := &Page{Body: []byte(``)}
	p.renderHtml()
	assert.Contains(t, string(p.Html), "lazy")
	p.renderHTML()
	assert.Contains(t, string(p.HTML), "lazy")
}

// The fractions available in Latin 1 (?) are rendered.
func TestFractions(t *testing.T) {
	p := &Page{Body: []byte(`1/4`)}
	p.renderHtml()
	assert.Contains(t, string(p.Html), "¼")
	p.renderHTML()
	assert.Contains(t, string(p.HTML), "¼")
}

// Other fractions are not rendered.
func TestNoFractions(t *testing.T) {
	p := &Page{Body: []byte(`1/6`)}
	p.renderHtml()
	assert.Contains(t, string(p.Html), "1/6")
	p.renderHTML()
	assert.Contains(t, string(p.HTML), "1/6")
}

// webfinger
@@ -119,17 +120,17 @@ func TestAt(t *testing.T) {
	// prevent lookups
	accounts.Lock()
	accounts.uris = make(map[string]string)
	accounts.uris["alex@alexschroeder.ch"] = "https://social.alexschroeder.ch/@alex";
	accounts.uris["alex@alexschroeder.ch"] = "https://social.alexschroeder.ch/@alex"
	accounts.Unlock()
	// test account
	p := &Page{Body: []byte(`My fedi handle is @alex@alexschroeder.ch.`)}
	p.renderHtml()
	assert.Contains(t,string(p.Html),
	p.renderHTML()
	assert.Contains(t, string(p.HTML),
		`My fedi handle is <a class="account" href="https://social.alexschroeder.ch/@alex" title="@alex@alexschroeder.ch">@alex</a>.`)
	// test escaped account
	p = &Page{Body: []byte(`My fedi handle is \@alex@alexschroeder.ch. \`)}
	p.renderHtml()
	assert.Contains(t,string(p.Html),
	p.renderHTML()
	assert.Contains(t, string(p.HTML),
		`My fedi handle is @alex@alexschroeder.ch.`)
	// disable webfinger
	useWebfinger = false
@@ -14,12 +14,12 @@ import (
// are passed on to the {viewHandler}.
func previewHandler(w http.ResponseWriter, r *http.Request, path string) {
	if r.Method != http.MethodPost {
		http.Redirect(w, r, "/view/" + strings.TrimPrefix(path, "/preview/"), http.StatusFound)
		http.Redirect(w, r, "/view/"+strings.TrimPrefix(path, "/preview/"), http.StatusFound)
		return
	}
	body := strings.ReplaceAll(r.FormValue("body"), "\r", "")
	p := &Page{Name: path, Body: []byte(body)}
	p.handleTitle(true)
	p.renderHtml()
	p.renderHTML()
	renderTemplate(w, p.Dir(), "preview", p)
}
@@ -23,7 +23,7 @@ img { max-width: 100% }
</header>
<main>
<h1>Previewing {{.Title}}</h1>
{{.Html}}
{{.HTML}}
</main>
<hr>
<section id="edit">
@@ -1,10 +1,11 @@
package main

import (
	"github.com/stretchr/testify/assert"
	"net/url"
	"net/http"
	"net/url"
	"testing"

	"github.com/stretchr/testify/assert"
)

func TestPreview(t *testing.T) {
@@ -4,10 +4,6 @@ import (
	"context"
	"flag"
	"fmt"
	"github.com/google/subcommands"
	"github.com/hexops/gotextdiff"
	"github.com/hexops/gotextdiff/myers"
	"github.com/hexops/gotextdiff/span"
	"io"
	"io/fs"
	"os"
@@ -15,6 +11,11 @@ import (
	"regexp"
	"slices"
	"strings"

	"github.com/google/subcommands"
	"github.com/hexops/gotextdiff"
	"github.com/hexops/gotextdiff/myers"
	"github.com/hexops/gotextdiff/span"
)

type replaceCmd struct {
@@ -63,9 +64,8 @@ func replaceCli(w io.Writer, isConfirmed bool, isRegexp bool, args []string) sub
	if fp != "." && strings.HasPrefix(filepath.Base(fp), ".") {
		if info.IsDir() {
			return filepath.SkipDir
		} else {
			return nil
		}
		return nil
	}
	// skip all but page files
	if !strings.HasSuffix(fp, ".md") {
@@ -80,14 +80,14 @@ func replaceCli(w io.Writer, isConfirmed bool, isRegexp bool, args []string) sub
	changes++
	if isConfirmed {
		fmt.Fprintln(w, fp)
		_ = os.Rename(fp, fp + "~")
		_ = os.Rename(fp, fp+"~")
		err = os.WriteFile(fp, result, 0644)
		if err != nil {
			return err
		}
	} else {
		edits := myers.ComputeEdits(span.URIFromPath(fp + "~"), string(body), string(result))
		diff := fmt.Sprint(gotextdiff.ToUnified(fp + "~", fp, string(body), edits))
		edits := myers.ComputeEdits(span.URIFromPath(fp+"~"), string(body), string(result))
		diff := fmt.Sprint(gotextdiff.ToUnified(fp+"~", fp, string(body), edits))
		fmt.Fprintln(w, diff)
	}
}
@@ -2,9 +2,10 @@ package main

import (
	"bytes"
	"testing"

	"github.com/google/subcommands"
	"github.com/stretchr/testify/assert"
	"testing"
)

func TestReplaceCmd(t *testing.T) {
18	search.go
@@ -67,11 +67,12 @@ func sortNames(tokens []string) func(a, b string) int {
	na := unicode.IsNumber(ra)
	rb, _ := utf8.DecodeRuneInString(b)
	nb := unicode.IsNumber(rb)
	if na && !nb {
	switch {
	case na && !nb:
		return -1
	} else if !na && nb {
	case !na && nb:
		return 1
	} else if na && nb {
	case na && nb:
		if a < b {
			return 1
		} else if a > b {
@@ -99,7 +100,7 @@ func sortNames(tokens []string) func(a, b string) int {
// results.
const itemsPerPage = 20

// search returns a sorted []Page where each page contains an extract of the actual Page.Body in its Page.Html. Page
// search returns a sorted []Page where each page contains an extract of the actual Page.Body in its Page.HTML. Page
// size is 20. Specify either the page number to return, or that all the results should be returned. Only ask for all
// results if runtime is not an issue, like on the command line. The boolean return value indicates whether there are
// more results.
@@ -140,7 +141,7 @@ func search(q, dir, filter string, page int, all bool) ([]*Result, bool) {
	if strings.Contains(title, term) {
		re, err := re(term)
		if err == nil {
			img.Html = template.HTML(highlight(re, img.Title))
			img.HTML = template.HTML(highlight(re, img.Title))
		}
		res = append(res, img)
		continue ImageLoop
@@ -199,14 +200,15 @@ func filterNames(names, predicates []string) []string {
	defer index.RUnlock()
	for _, predicate := range predicates {
		r := make([]string, 0)
		if strings.HasPrefix(predicate, "title:") {
		switch {
		case strings.HasPrefix(predicate, "title:"):
			token := predicate[6:]
			for _, name := range names {
				if strings.Contains(strings.ToLower(index.titles[name]), token) {
					r = append(r, name)
				}
			}
		} else if predicate == "blog:true" || predicate == "blog:false" {
		case predicate == "blog:true" || predicate == "blog:false":
			blog := predicate == "blog:true"
			re := regexp.MustCompile(`(^|/)\d\d\d\d-\d\d-\d\d`)
			for _, name := range names {
@@ -215,7 +217,7 @@ func filterNames(names, predicates []string) []string {
				r = append(r, name)
			}
		}
		} else {
		default:
			log.Printf("Unsupported predicate: %s", predicate)
		}
		names = intersection(names, r)
@@ -40,9 +40,9 @@ button { background-color: #eee; color: inherit; border-radius: 4px; border-widt
<article lang="{{.Language}}">
<p><a class="result" href="/view/{{.Path}}">{{.Title}}</a>
<span class="score">{{.Score}}</span></p>
<blockquote>{{.Html}}</blockquote>
<blockquote>{{.HTML}}</blockquote>
{{range .Images}}
<p class="image"><a href="/view/{{.Path}}"><img loading="lazy" src="/view/{{.Path}}"></a><br/>{{.Html}}
<p class="image"><a href="/view/{{.Path}}"><img loading="lazy" src="/view/{{.Path}}"></a><br/>{{.HTML}}
{{end}}
</article>
{{end}}
@@ -4,14 +4,15 @@ import (
	"context"
	"flag"
	"fmt"
	"github.com/google/subcommands"
	"github.com/muesli/reflow/wordwrap"
	"io"
	"net/url"
	"os"
	"path/filepath"
	"regexp"
	"strings"

	"github.com/google/subcommands"
	"github.com/muesli/reflow/wordwrap"
)

type searchCmd struct {
@@ -70,14 +71,15 @@ func searchCli(w io.Writer, cmd *searchCmd, args []string) subcommands.ExitStatu
			fmt.Fprint(os.Stderr, " results\n")
		}
	}
	if cmd.extract {
	switch {
	case cmd.extract:
		searchExtract(w, items)
	} else if cmd.files {
	case cmd.files:
		for _, p := range items {
			name := filepath.FromSlash(p.Name) + ".md\n"
			fmt.Fprintf(w, name)
			fmt.Fprint(w, name)
		}
	} else {
	default:
		for _, p := range items {
			name := p.Name
			if strings.HasPrefix(name, dir) {
@@ -98,7 +100,7 @@ func searchExtract(w io.Writer, items []*Result) {
	match := func(s string) string { return "\x1b[1m" + s + "\x1b[0m" } // bold
	re := regexp.MustCompile(`<b>(.*?)</b>`)
	for _, p := range items {
		s := re.ReplaceAllString(string(p.Html), match(`$1`))
		s := re.ReplaceAllString(string(p.HTML), match(`$1`))
		fmt.Fprintln(w, heading(p.Title))
		if p.Name != p.Title {
			fmt.Fprintln(w, p.Name)
@@ -2,9 +2,10 @@ package main

import (
	"bytes"
	"testing"

	"github.com/google/subcommands"
	"github.com/stretchr/testify/assert"
	"testing"
)

func TestSearchCmd(t *testing.T) {
@@ -2,11 +2,12 @@ package main

import (
	"fmt"
	"github.com/stretchr/testify/assert"
	"net/http"
	"net/url"
	"slices"
	"testing"

	"github.com/stretchr/testify/assert"
)

func TestSortNames(t *testing.T) {
@@ -262,7 +263,7 @@ Please call me, my love.

	assert.NotEmpty(t, items[0].Images)
	assert.Equal(t, "phone call", items[0].Images[0].Title)
	assert.Equal(t, "phone <b>call</b>", string(items[0].Images[0].Html))
	assert.Equal(t, "phone <b>call</b>", string(items[0].Images[0].HTML))
	assert.Equal(t, "testdata/images/2024-07-21.jpg", items[0].Images[0].Name)

	assert.Empty(t, items[1].Images)
56	sitemap.go	Normal file
@@ -0,0 +1,56 @@
package main

import (
	"fmt"
	"log"
	"net/http"
	"os"
	"regexp"
)

// SitemapURL is a URL of the sitemap.
type SitemapURL struct {
	// Loc is the actual location.
	Loc string
}

// Sitemap is the sitemap itself, containing a list of URLs.
type Sitemap struct {
	URL []*SitemapURL
}

// sitemapHandler lists all the pages. See https://www.sitemaps.org/protocol.html for more. It takes the
// ODDMU_FILTER environment variable into account.
func sitemapHandler(w http.ResponseWriter, r *http.Request) {
	if r.URL.Path != "/sitemap.xml" {
		http.NotFound(w, r)
	} else {
		w.Write([]byte(`<?xml version="1.0" encoding="UTF-8"?>` + "\n"))
		scheme := "http"
		if r.TLS != nil {
			scheme += "s"
		}
		base := fmt.Sprintf("%s://%s/view/", scheme, r.Host)
		filter := os.Getenv("ODDMU_FILTER")
		renderTemplate(w, ".", "sitemap", sitemap(&index, base, filter))
	}
}

// sitemap generates the list of URLs. A reference to the index needs to be provided to make it easier to write
// tests. Exclude pages matching the filter.
func sitemap(idx *indexStore, base, filter string) Sitemap {
	url := make([]*SitemapURL, 0)
	re, err := regexp.Compile(filter)
	if err != nil {
		log.Println("ODDMU_FILTER does not compile:", filter, err)
		return Sitemap{URL: url}
	}
	idx.RLock()
	defer idx.RUnlock()
	for name := range idx.titles {
		if filter == "" || !re.MatchString(name) {
			url = append(url, &SitemapURL{Loc: base + name})
		}
	}
	return Sitemap{URL: url}
}
3	sitemap.html	Normal file
@@ -0,0 +1,3 @@
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
{{range .URL}}<url><loc>{{.Loc}}</loc></url>
{{end}}</urlset>
62	sitemap_cmd.go	Normal file
@@ -0,0 +1,62 @@
package main

import (
	"context"
	"flag"
	"fmt"
	"io"
	"log"
	"os"

	"github.com/google/subcommands"
)

type sitemapCmd struct {
	base   string
	filter string
}

func (cmd *sitemapCmd) SetFlags(f *flag.FlagSet) {
	f.StringVar(&cmd.base, "base", "http://localhost:8080/view/", "the base URL for the sitemap")
	f.StringVar(&cmd.filter, "filter", "", "a regular expression to filter pages")
}

func (*sitemapCmd) Name() string     { return "sitemap" }
func (*sitemapCmd) Synopsis() string { return "list all the pages known in Sitemap format" }
func (*sitemapCmd) Usage() string {
	return `sitemap [-base URL] [-filter regex]:
  Print all the pages known in Sitemap format.
  See https://www.sitemaps.org/ for more.
`
}

func (cmd *sitemapCmd) Execute(_ context.Context, f *flag.FlagSet, _ ...interface{}) subcommands.ExitStatus {
	n, err := index.load()
	if err != nil {
		fmt.Fprintf(os.Stderr, "Index load: %s\n", err)
		return subcommands.ExitFailure
	}
	fmt.Fprintf(os.Stderr, "Indexed %d pages\n", n)
	return sitemapCli(os.Stdout, &index, cmd.base, cmd.filter)
}

// sitemapCli implements the printing of a Sitemap. In order to make testing easier, it takes a Writer and an
// indexStore. The Writer is important so that test code can provide a buffer instead of os.Stdout; the indexStore
// is important so that test code can ensure no other test running in parallel can interfere with the list of known
// pages (by adding or deleting pages).
func sitemapCli(w io.Writer, idx *indexStore, base, filter string) subcommands.ExitStatus {
	initTemplates()
	template := "sitemap.html"
	t := templates.template[template]
	if t == nil {
		log.Println("Template not found:", template)
		return subcommands.ExitFailure
	}
	w.Write([]byte(`<?xml version="1.0" encoding="UTF-8"?>` + "\n"))
	err := t.Execute(w, sitemap(idx, base, filter))
	if err != nil {
		log.Println(err)
		return subcommands.ExitFailure
	}
	return subcommands.ExitSuccess
}
sitemap_cmd_test.go (new file, 18 lines)
@@ -0,0 +1,18 @@
package main

import (
	"bytes"
	"testing"

	"github.com/google/subcommands"
	"github.com/stretchr/testify/assert"
)

func TestSitemapCmd(t *testing.T) {
	b := new(bytes.Buffer)
	s := sitemapCli(b, minimalIndex(t), "https://example.org/view/", "^themes/")
	assert.Equal(t, subcommands.ExitSuccess, s)
	assert.Contains(t, b.String(), "https://example.org/view/index")
	assert.Contains(t, b.String(), "https://example.org/view/README")
	assert.NotContains(t, b.String(), "https://example.org/view/themes/")
}
@@ -90,9 +90,9 @@ func snippets(q string, s string) string {
 		}
 	}
 	t = s[start:end]
-	res = res + t
+	res += t
 	if len(s) > end {
-		res = res + " …"
+		res += " …"
 	}
 	// truncate text to avoid rematching the same string.
 	s = s[end:]
@@ -1,8 +1,9 @@
 package main

 import (
-	"github.com/stretchr/testify/assert"
 	"testing"
+
+	"github.com/stretchr/testify/assert"
 )

 func TestSnippets(t *testing.T) {
@@ -4,6 +4,7 @@
 <meta charset="utf-8">
 <meta name="format-detection" content="telephone=no">
 <meta name="viewport" content="width=device-width, initial-scale=1.0, user-scalable=no">
+<meta name="generator" content="Oddμ <https://src.alexschroeder.ch/oddmu.git/>"/>
 <title>{{.Title}}</title>
 <style>
 html { max-width: 65ch; padding: 1ch; margin: auto; color: #111; background-color: #ffe }
@@ -19,7 +20,7 @@ img { max-width: 100% }
 <body>
 <main id="main">
 <h1>{{.Title}}</h1>
-{{.Html}}
+{{.HTML}}
 </main>
 <footer>
 <address>
static_cmd.go (145 lines changed)
@@ -5,10 +5,6 @@ import (
 	"context"
 	"flag"
 	"fmt"
-	"github.com/gomarkdown/markdown"
-	"github.com/gomarkdown/markdown/ast"
-	"github.com/gomarkdown/markdown/html"
-	"github.com/google/subcommands"
 	"io/fs"
 	"net/url"
 	"os"
@@ -16,14 +12,28 @@ import (
 	"slices"
 	"strings"
 	"time"
+
+	"github.com/gomarkdown/markdown"
+	"github.com/gomarkdown/markdown/ast"
+	"github.com/gomarkdown/markdown/html"
+	"github.com/google/subcommands"
 )

+var shrinkWidth = 800
+var shrinkQuality = 30
+
 type staticCmd struct {
-	jobs int
+	jobs    int
+	shrink  bool
+	glob    string
+	verbose bool
 }

 func (cmd *staticCmd) SetFlags(f *flag.FlagSet) {
 	f.IntVar(&cmd.jobs, "jobs", 2, "how many jobs to use")
+	f.BoolVar(&cmd.shrink, "shrink", false, "shrink images by decreasing the quality")
+	f.StringVar(&cmd.glob, "glob", "", "only export files matching this shell file name pattern")
+	f.BoolVar(&cmd.verbose, "verbose", false, "print the files as they are being processed")
 }

 func (*staticCmd) Name() string { return "static" }
@@ -38,11 +48,11 @@ func (*staticCmd) Usage() string {
 func (cmd *staticCmd) Execute(_ context.Context, f *flag.FlagSet, _ ...interface{}) subcommands.ExitStatus {
 	args := f.Args()
 	if len(args) != 1 {
-		fmt.Fprintln(os.Stderr, "Exactly one target directory is required")
+		fmt.Fprintln(os.Stderr, "Exactly one target directory is required", args)
 		return subcommands.ExitFailure
 	}
 	dir := filepath.Clean(args[0])
-	return staticCli(".", dir, cmd.jobs, false)
+	return staticCli(".", dir, cmd.jobs, cmd.glob, cmd.shrink, cmd.verbose, false)
 }

 type args struct {
@@ -52,20 +62,20 @@ type args struct {

 // staticCli generates a static site in the designated directory. The quiet flag is used to suppress output when running
 // tests. The source directory cannot be set from the command-line. The current directory (".") is assumed.
-func staticCli(source, target string, jobs int, quiet bool) subcommands.ExitStatus {
+func staticCli(source, target string, jobs int, glob string, shrink, verbose, quiet bool) subcommands.ExitStatus {
 	index.load()
 	index.RLock()
 	defer index.RUnlock()
 	loadLanguages()
-	loadTemplates()
-	tasks := make(chan args)
-	results := make(chan error)
-	done := make(chan bool)
+	initTemplates()
+	tasks := make(chan args, 10000)
+	results := make(chan error, jobs)
+	done := make(chan bool, jobs)
 	stop := make(chan error)
 	for i := 0; i < jobs; i++ {
-		go staticWorker(tasks, results, done)
+		go staticWorker(tasks, results, done, shrink, verbose)
 	}
-	go staticWalk(source, target, tasks, stop)
+	go staticWalk(source, target, glob, tasks, stop)
 	go staticWatch(jobs, results, done)
 	n, err := staticProgressIndicator(results, stop, quiet)
 	if !quiet {
@@ -81,13 +91,15 @@ func staticCli(source, target string, jobs int, quiet bool) subcommands.ExitStat
 // staticWalk walks the source directory tree. Any directory it finds, it recreates in the target directory. Any file it
 // finds, it puts into the tasks channel for the staticWorker. When the directory walk is finished, the tasks channel is
 // closed. If there's an error on the stop channel, the walk returns that error.
-func staticWalk(source, target string, tasks chan (args), stop chan (error)) {
+func staticWalk(source, target, glob string, tasks chan (args), stop chan (error)) {
 	// The error returned here is what's in the stop channel but at the very end, a worker might return an error
 	// even though the walk is already done. This is why we cannot rely on the return value of the walk.
-	filepath.Walk(source, func(fp string, info fs.FileInfo, err error) error {
+	n := 0
+	err := filepath.Walk(source, func(fp string, info fs.FileInfo, err error) error {
 		if err != nil {
 			return err
 		}
 		// don't wait for the stop channel
 		select {
 		case err := <-stop:
 			return err
@@ -96,14 +108,30 @@ func staticWalk(source, target string, tasks chan (args), stop chan (error)) {
 		if fp != "." && strings.HasPrefix(filepath.Base(fp), ".") {
 			if info.IsDir() {
 				return filepath.SkipDir
-			} else {
-				return nil
 			}
+			return nil
 		}
 		// skip backup files, avoid recursion
 		if strings.HasSuffix(fp, "~") || strings.HasPrefix(fp, target) {
 			return nil
 		}
 		// skip templates
 		if slices.Contains(templateFiles, filepath.Base(fp)) {
 			return nil
 		}
+		// skip files that don't match the glob, if set
+		if fp != "." && glob != "" {
+			match, err := filepath.Match(glob, fp)
+			if err != nil {
+				return err // abort
+			}
+			if !match {
+				if info.IsDir() {
+					return filepath.SkipDir
+				}
+				return nil
+			}
+		}
 		// determine the actual target: if source is a/ and target is b/ and path is a/file, then the
 		// target is b/file
 		var actualTarget string
@@ -115,18 +143,32 @@
 		}
 			actualTarget = filepath.Join(target, fp[len(source):])
 		}
-		// recreate subdirectories
+		// recreate subdirectories, ignore existing ones
 		if info.IsDir() {
-			return os.Mkdir(actualTarget, 0755)
+			os.Mkdir(actualTarget, 0755)
+			return nil
 		}
+		// Markdown files end up as HTML files
+		if strings.HasSuffix(actualTarget, ".md") {
+			actualTarget = actualTarget[:len(actualTarget)-3] + ".html"
+		}
+		// do the task if the target file doesn't exist or if the source file is newer
+		other, err := os.Stat(actualTarget)
+		if err != nil || info.ModTime().After(other.ModTime()) {
+			if err == nil {
+				fmt.Println(fp, info.ModTime(), other.ModTime(), info.ModTime().After(other.ModTime()))
+			}
+			n++
+			tasks <- args{source: fp, target: actualTarget, info: info}
+		}
 		return nil
 	})
+	if err != nil {
+		fmt.Println(err)
+	} else {
+		fmt.Printf("\r%d files to process\n", n)
+	}
 	close(tasks)
 }

@@ -141,57 +183,41 @@ func staticWatch(jobs int, results chan (error), done chan (bool)) {

 // staticWorker takes arguments off the tasks channel (the file to process) and put results in the results channel (any
 // errors encountered); when they're done they send true on the done channel.
-func staticWorker(tasks chan (args), results chan (error), done chan (bool)) {
+func staticWorker(tasks chan (args), results chan (error), done chan (bool), shrink, verbose bool) {
 	task, ok := <-tasks
 	for ok {
-		results <- staticFile(task.source, task.target, task.info)
+		if verbose {
+			fmt.Println(task.source)
+		}
+		results <- staticFile(task.source, task.target, task.info, shrink)
 		task, ok = <-tasks
 	}
 	done <- true
 }

-// staticProgressIndicator watches the results channel and does a countdown. If the result channel reports an error,
-// that is put into the stop channel so that staticWalk stops adding to the tasks channel.
+// staticProgressIndicator watches the results channel and prints a running count. If the result channel reports an
+// error, that is put into the stop channel so that staticWalk stops adding to the tasks channel.
 func staticProgressIndicator(results chan (error), stop chan (error), quiet bool) (int, error) {
 	n := 0
 	t := time.Now()
-	var err error
-	for result := range results {
-		if result != nil {
-			err := result
-			// this stops the walker from adding more tasks
-			stop <- err
-		} else {
-			n++
-			if !quiet && n%13 == 0 {
-				if time.Since(t) > time.Second {
-					fmt.Printf("\r%d", n)
-					t = time.Now()
-				}
-			}
+	err, ok := <-results
+	for ok && err == nil {
+		n++
+		if !quiet && n%13 == 0 {
+			if time.Since(t) > time.Second {
+				fmt.Printf("\r%d", n)
+				t = time.Now()
+			}
+		}
+		err, ok = <-results
+	}
+	if ok && err != nil {
+		// this stops the walker from adding more tasks
+		stop <- err
+	}
 	return n, err
 }

-// staticFile is used to walk the file trees and do the right thing for the destination directory: create
-// subdirectories, link files, render HTML files.
-func staticFile(source, target string, info fs.FileInfo) error {
-	// render pages
-	if strings.HasSuffix(source, ".md") {
-		p, err := staticPage(source[:len(source)-3], target[:len(target)-3]+".html")
-		if err != nil {
-			return err
-		}
-		return staticFeed(source[:len(source)-3], target[:len(target)-3]+".rss", p, info.ModTime())
-	}
-	// remaining files are linked unless this is a template
-	if slices.Contains(templateFiles, filepath.Base(source)) {
-		return nil
-	}
-	return os.Link(source, target)
-}

 // staticPage takes the filename of a page (ending in ".md") and generates a static HTML page.
 func staticPage(source, target string) (*Page, error) {
 	p, err := loadPage(filepath.ToSlash(source))
@@ -200,7 +226,7 @@ func staticPage(source, target string) (*Page, error) {
 		return nil, err
 	}
 	p.handleTitle(true)
-	// instead of p.renderHtml() we do it all ourselves, appending ".html" to all the local links
+	// instead of p.renderHTML() we do it all ourselves, appending ".html" to all the local links
 	parser, hashtags := wikiParser()
 	doc := markdown.Parse(p.Body, parser)
 	ast.WalkFunc(doc, staticLinks)
@@ -210,7 +236,7 @@ func staticPage(source, target string) (*Page, error) {
 	}
 	renderer := html.NewRenderer(opts)
 	maybeUnsafeHTML := markdown.Render(doc, renderer)
-	p.Html = unsafeBytes(maybeUnsafeHTML)
+	p.HTML = unsafeBytes(maybeUnsafeHTML)
 	p.Hashtags = *hashtags
 	return p, write(p, target, "", "static.html")
 }
@@ -221,7 +247,7 @@ func staticFeed(source, target string, p *Page, ti time.Time) error {
 	base := filepath.Base(source)
 	_, ok := index.token[strings.ToLower(base)]
 	if base == "index" || ok {
-		f := feed(p, ti)
+		f := feed(p, ti, 0, 10, ModTime)
 		if len(f.Items) > 0 {
 			return write(f, target, `<?xml version="1.0" encoding="UTF-8"?>`, "feed.html")
 		}
@@ -232,8 +258,7 @@ func staticFeed(source, target string, p *Page, ti time.Time) error {
 // staticLinks checks a node and if it is a link to a local page, it appends ".html" to the link destination.
 func staticLinks(node ast.Node, entering bool) ast.WalkStatus {
 	if entering {
-		switch v := node.(type) {
-		case *ast.Link:
+		if v, ok := node.(*ast.Link); ok {
 			// not an absolute URL, not a full URL, not a mailto: URI
 			if !bytes.HasPrefix(v.Destination, []byte("/")) &&
 				!bytes.Contains(v.Destination, []byte("://")) &&
static_cmd_common.go (new file, 89 lines)
@@ -0,0 +1,89 @@
//go:build !openbsd

package main

import (
	"fmt"
	"image/jpeg"
	"io"
	"io/fs"
	"os"
	"path/filepath"
	"strings"

	"github.com/disintegration/imaging"
	"github.com/edwvee/exiffix"
	"github.com/gen2brain/webp"
)

// staticFile is used to walk the file trees and do the right thing for the destination directory: create
// subdirectories, link files, render HTML files.
func staticFile(source, target string, info fs.FileInfo, shrink bool) error {
	// render pages
	if strings.HasSuffix(source, ".md") {
		// target already has ".html" extension
		p, err := staticPage(source[:len(source)-3], target)
		if err != nil {
			return err
		}
		return staticFeed(source[:len(source)-3], target[:len(target)-5]+".rss", p, info.ModTime())
	}
	if shrink {
		switch filepath.Ext(source) {
		case ".jpg", ".jpeg", ".webp":
			return shrinkImage(source, target, info)
		}
	}
	// delete before linking, ignore errors
	os.Remove(target)
	err := os.Link(source, target)
	if err == nil {
		return nil
	}
	// in case of invalid cross-device link error, copy file instead
	src, err := os.Open(source)
	if err != nil {
		return err
	}
	defer src.Close()
	dst, err := os.Create(target)
	if err != nil {
		return err
	}
	defer dst.Close()
	_, err = io.Copy(dst, src)
	return err
}

// shrinkImage shrinks images down and reduces the quality dramatically.
func shrinkImage(source, target string, info fs.FileInfo) error {
	file, err := os.Open(source)
	if err != nil {
		return err
	}
	defer file.Close()
	img, _, err := exiffix.Decode(file)
	if err != nil {
		return fmt.Errorf("%s cannot be decoded", source)
	}
	if img.Bounds().Dx() > shrinkWidth {
		res := imaging.Resize(img, shrinkWidth, 0, imaging.Lanczos) // preserve aspect ratio
		// imaging functions don't return errors but empty images…
		if res.Rect.Empty() {
			return fmt.Errorf("%s cannot be resized", source)
		}
		img = res
	}
	dst, err := os.Create(target)
	if err != nil {
		return err
	}
	defer dst.Close()
	switch filepath.Ext(source) {
	case ".jpg", ".jpeg":
		err = jpeg.Encode(dst, img, &jpeg.Options{Quality: shrinkQuality})
	case ".webp":
		err = webp.Encode(dst, img, webp.Options{Quality: shrinkQuality})
	}
	return err
}
static_cmd_nowebp.go (new file, 86 lines)
@@ -0,0 +1,86 @@
//go:build openbsd

package main

import (
	"fmt"
	"image/jpeg"
	"io"
	"io/fs"
	"os"
	"path/filepath"
	"strings"

	"github.com/disintegration/imaging"
	"github.com/edwvee/exiffix"
)

// staticFile is used to walk the file trees and do the right thing for the destination directory: create
// subdirectories, link files, render HTML files.
func staticFile(source, target string, info fs.FileInfo, shrink bool) error {
	// render pages
	if strings.HasSuffix(source, ".md") {
		// target already has ".html" extension
		p, err := staticPage(source[:len(source)-3], target)
		if err != nil {
			return err
		}
		return staticFeed(source[:len(source)-3], target[:len(target)-5]+".rss", p, info.ModTime())
	}
	if shrink {
		switch filepath.Ext(source) {
		case ".jpg", ".jpeg":
			return shrinkImage(source, target, info)
		}
	}
	// delete before linking, ignore errors
	os.Remove(target)
	err := os.Link(source, target)
	if err == nil {
		return nil
	}
	// in case of invalid cross-device link error, copy file instead
	src, err := os.Open(source)
	if err != nil {
		return err
	}
	defer src.Close()
	dst, err := os.Create(target)
	if err != nil {
		return err
	}
	defer dst.Close()
	_, err = io.Copy(dst, src)
	return err
}

// shrinkImage shrinks images down and reduces the quality dramatically.
func shrinkImage(source, target string, info fs.FileInfo) error {
	file, err := os.Open(source)
	if err != nil {
		return err
	}
	defer file.Close()
	img, _, err := exiffix.Decode(file)
	if err != nil {
		return fmt.Errorf("%s cannot be decoded", source)
	}
	if img.Bounds().Dx() > shrinkWidth {
		res := imaging.Resize(img, shrinkWidth, 0, imaging.Lanczos) // preserve aspect ratio
		// imaging functions don't return errors but empty images…
		if res.Rect.Empty() {
			return fmt.Errorf("%s cannot be resized", source)
		}
		img = res
	}
	dst, err := os.Create(target)
	if err != nil {
		return err
	}
	defer dst.Close()
	switch filepath.Ext(source) {
	case ".jpg", ".jpeg":
		err = jpeg.Encode(dst, img, &jpeg.Options{Quality: shrinkQuality})
	}
	return err
}
@@ -1,15 +1,16 @@
 package main

 import (
-	"github.com/google/subcommands"
-	"github.com/stretchr/testify/assert"
 	"os"
 	"testing"
+
+	"github.com/google/subcommands"
+	"github.com/stretchr/testify/assert"
 )

 func TestStaticCmd(t *testing.T) {
 	cleanup(t, "testdata/static")
-	s := staticCli(".", "testdata/static", 2, true)
+	s := staticCli(".", "testdata/static", 2, "", false, false, true)
 	assert.Equal(t, subcommands.ExitSuccess, s)
 	// pages
 	assert.FileExists(t, "testdata/static/index.html")
@@ -34,7 +35,7 @@ And the cars so loud
 `)}
 	h.save()
 	h.notify()
-	s := staticCli("testdata/static-feed", "testdata/static-feed-out", 2, true)
+	s := staticCli("testdata/static-feed", "testdata/static-feed-out", 2, "", false, false, true)
 	assert.Equal(t, subcommands.ExitSuccess, s)
 	assert.FileExists(t, "testdata/static-feed-out/2024-03-07-poem.html")
 	assert.FileExists(t, "testdata/static-feed-out/Haiku.html")
templates.go (65 lines changed)
@@ -1,44 +1,80 @@
 package main

 import (
+	"embed"
 	"html/template"
 	"io/fs"
 	"log"
 	"net/http"
+	"os"
 	"path/filepath"
 	"slices"
 	"strings"
 	"sync"
 )

-// templateFiles are the various HTML template files used. These files must exist in the root directory for Oddmu to be
-// able to generate HTML output. This always requires a template.
-var templateFiles = []string{"edit.html", "add.html", "view.html", "preview.html",
-	"diff.html", "search.html", "static.html", "upload.html", "feed.html",
-	"list.html"}
+// A filesystem with a read-only copy of the templates at build time.
+//
+//go:embed *.html
+var templateDefaults embed.FS

-// templateStore controls access to map of parsed HTML templates. Make sure to lock and unlock as appropriate. See
-// renderTemplate and loadTemplates.
+// templateFiles are the various HTML template files used. These files must exist in the root directory for Oddmu
+// to be able to generate HTML output. This always requires a template.
+var templateFiles = []string{}
+
+// templateStore controls access to the map of parsed HTML templates. Make sure to lock and unlock as appropriate.
+// See renderTemplate and loadTemplates.
 type templateStore struct {
 	sync.RWMutex

-	// template is a map of parsed HTML templates. The key is their filepath name. By default, the map only contains
-	// top-level templates like "view.html". Subdirectories may contain their own templates which override the
-	// templates in the root directory. If so, they are filepaths like "dir/view.html".
+	// template is a map of parsed HTML templates. The key is their filepath name. By default, the map only
+	// contains top-level templates like "view.html". Subdirectories may contain their own templates which
+	// override the templates in the root directory. If so, they are filepaths like "dir/view.html". This is a
+	// map because we need to add and remove templates as time passes.
 	template map[string]*template.Template
 }

 var templates templateStore

-// loadTemplates loads the templates. If templates have already been loaded, return immediately.
-func loadTemplates() {
+// initTemplates loads the templates and writes them to disk if they are missing. If templates have already been
+// loaded, return immediately.
+func initTemplates() {
 	if templates.template != nil {
 		return
 	}
 	templates.Lock()
 	defer templates.Unlock()
-	// walk the directory, load templates and add directories
 	templates.template = make(map[string]*template.Template)
+	// load the defaults and make a list of the default names
+	entries, err := templateDefaults.ReadDir(".")
+	if err != nil {
+		log.Println("An error in the build resulted in unreadable default templates:", err)
+	}
+	// if loading or parsing fails, continue as perhaps the files exist in the file-system
+	for _, entry := range entries {
+		name := entry.Name()
+		templateFiles = append(templateFiles, name)
+		b, err := fs.ReadFile(templateDefaults, name)
+		if err != nil {
+			log.Printf("Cannot read built-in default template %s: %s\n", name, err)
+			continue
+		}
+		t := template.New(name)
+		templates.template[name], err = t.Parse(string(b))
+		if err != nil {
+			log.Printf("Cannot parse built-in default template %s: %s\n", name, err)
+		}
+		_, err = os.Stat(name)
+		if err != nil {
+			err = os.WriteFile(name, b, 0644)
+			if err == nil {
+				log.Printf("Wrote built-in default template %s\n", name)
+			} else {
+				log.Printf("Cannot write built-in default template %s: %s\n", name, err)
+			}
+		}
+	}
+	// walk the directory, load templates and add directories
 	filepath.Walk(".", loadTemplate)
 	log.Println(len(templates.template), "templates loaded")
 }
@@ -54,7 +90,6 @@ func loadTemplate(fp string, info fs.FileInfo, err error) error {
 	t, err := template.ParseFiles(fp)
 	if err != nil {
 		log.Println("Cannot parse template:", fp, err)
-		// ignore error
 	} else {
 		templates.template[fp] = t
 	}
@@ -92,7 +127,7 @@ func removeTemplate(fp string) {
 // renderTemplate is the helper that is used to render the templates with data.
 // A template in the same directory is preferred, if it exists.
 func renderTemplate(w http.ResponseWriter, dir, tmpl string, data any) {
-	loadTemplates()
+	initTemplates()
 	base := tmpl + ".html"
 	templates.RLock()
 	defer templates.RUnlock()
@@ -2,10 +2,11 @@ package main

 import (
 	"bytes"
-	"github.com/stretchr/testify/assert"
 	"mime/multipart"
 	"net/http"
 	"testing"
+
+	"github.com/stretchr/testify/assert"
 )

 func TestTemplates(t *testing.T) {
@@ -21,7 +22,7 @@ Memories of cold
 	assert.Contains(t,
 		assert.HTTPBody(makeHandler(viewHandler, false, http.MethodGet), "GET", "/view/testdata/templates/snow", nil), "Skip")
 	// save a new view handler
-	html := "<body><h1>{{.Title}}</h1>{{.Html}}"
+	html := "<body><h1>{{.Title}}</h1>{{.HTML}}"
 	form := new(bytes.Buffer)
 	writer := multipart.NewWriter(form)
 	field, err := writer.CreateFormField("filename")
@@ -17,7 +17,7 @@
 <title>{{.Title}}</title>
 <link>https://alexschroeder.ch/view/{{.Path}}</link>
 <guid>https://alexschroeder.ch/view/{{.Path}}</guid>
-<description>{{.Html}}</description>
+<description>{{.HTML}}</description>
 <pubDate>{{.Date}}</pubDate>
 {{range .Hashtags}}
 <category>{{.}}</category>
@@ -48,9 +48,9 @@ button { background-color: #eee; color: inherit; border-radius: 4px; border-widt
 <article lang="{{.Language}}">
 <p><a class="result" href="/view/{{.Path}}">{{.Title}}</a>
 <span class="score">{{.Score}}</span></p>
-<blockquote>{{.Html}}</blockquote>
+<blockquote>{{.HTML}}</blockquote>
 {{range .Images}}
-<p class="image"><a href="/view/{{.Path}}"><img loading="lazy" src="/view/{{.Path}}"></a><br/>{{.Html}}
+<p class="image"><a href="/view/{{.Path}}"><img loading="lazy" src="/view/{{.Path}}"></a><br/>{{.HTML}}
 {{end}}
 </article>
 {{end}}
@@ -29,7 +29,7 @@ img { max-width: 100% }
 <body>
 <main id="main">
 <h1>{{.Title}}</h1>
-{{.Html}}
+{{.HTML}}
 </main>
 <footer>
 <address>
@@ -54,7 +54,7 @@ img { max-width: 100% }
 </header>
 <main id="main">
 <h1>{{.Title}}</h1>
-{{.Html}}
+{{.HTML}}
 </main>
 <footer>
 <address>
@@ -16,7 +16,7 @@
 <title>{{.Title}}</title>
 <link>https://campaignwiki.org/view/{{.Path}}</link>
 <guid>https://campaignwiki.org/view/{{.Path}}</guid>
-<description>{{.Html}}</description>
+<description>{{.HTML}}</description>
 <pubDate>{{.Date}}</pubDate>
 {{range .Hashtags}}
 <category>{{.}}</category>
Some files were not shown because too many files have changed in this diff.