Compare commits


215 Commits

Author SHA1 Message Date
Alex Schroeder
4b33b3afeb Get rid of -r 2016-06-27 12:05:37 +02:00
Alex Schroeder
9beca5895a tarballs.pl: decode utf8 2016-06-24 12:29:19 +02:00
Alex Schroeder
1afc03eee1 tarballs.pl: a frontend to serve released files
There is a target in our Makefile to make a new release. This stores a
tarball with the appropriate release information in
https://oddmuse.org/releases. tarballs.pl offers an interface to serve
these files, or their individual member files, with a naive cache of
50 elements.

This is a Mojolicious application and is available here:
https://oddmuse.org/download
2016-06-23 23:41:41 +02:00
Alex Schroeder
331b03f894 Script to serve tarballs 2016-06-23 18:33:42 +02:00
Alex Schroeder
1c9b180b3a Merge git.sv.gnu.org:/srv/git/oddmuse 2016-06-23 00:47:10 +02:00
Alex Schroeder
57a16e85f8 meta.t: improve by skipping comments 2016-06-23 00:44:06 +02:00
Alex Schroeder
c7cd5bcc36 meta.t: improve by skipping comments 2016-06-23 00:38:23 +02:00
Alex Schroeder
f571007516 Fix issues discovered by meta.t 2016-06-23 00:34:56 +02:00
Alex Schroeder
fac3f03f7b meta.t: enforce file access rules 2016-06-23 00:31:52 +02:00
Alex Schroeder
7d85dd6570 toc.pl: use ToString and don't double-decode
ToString now takes more arguments.
2016-06-22 16:24:07 +02:00
Alex Schroeder
a91ef8602f Moving modules from utf8::encode to encode_utf8 2016-06-22 15:37:04 +02:00
Alex Schroeder
1bc670617e test.pl: move to encode_utf8 as well 2016-06-22 14:54:52 +02:00
Alex Schroeder
74288ba3f3 Moving from utf8::encode to encode_utf8 2016-06-22 14:43:28 +02:00
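The migration named in the commits above can be illustrated with a minimal sketch: both calls produce the same UTF-8 bytes, they just differ in calling convention (this example is illustrative, not code from the repo):

```perl
use strict;
use warnings;
use utf8;                        # string literals below are decoded text
use Encode qw(encode_utf8);

my $name  = "Zürich";            # a decoded Perl string
my $bytes = encode_utf8($name);  # returns an encoded copy; $name is untouched
utf8::encode($name);             # encodes in place instead of returning

print $bytes eq $name ? "same bytes\n" : "different\n";
```

encode_utf8 is the friendlier interface because it leaves its argument alone, which is presumably why the code was migrated.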
Alex Schroeder
bf2856011d Changing $foo/$bar to "$foo/$bar"
Why did this not cause a syntax error?
2016-06-19 15:56:34 +02:00
Alex Schroeder
ca974a902d latex.pl: Globbing nil requires double quotes to work 2016-06-19 15:55:46 +02:00
Alex Schroeder
f992897e7a opendir also requires bytes 2016-06-19 15:55:03 +02:00
Alex Schroeder
c121607f61 All open and tie calls need utf8::encode 2016-06-19 13:51:11 +02:00
Alex Schroeder
032c7aea73 bsd_glob replaced with Glob 2016-06-19 11:55:58 +02:00
Alex Schroeder
f6c419746c tags.pl: Storable also needs bytes in filename 2016-06-19 00:10:39 +02:00
Alex Schroeder
83f13a9a1a Use helper functions for accessing the file system
As we derive a lot of filenames from strings in UTF-8 encoded files, we
need to make sure that any filename that might be set by a user –
including all the filenames containing a directory derived from
$DataDir – is passed through utf8::encode. That is, every character
gets replaced with a sequence of one or more characters that represent
the individual bytes of the character, and the UTF8 flag is turned off.

In other words, -d $DataDir might not work if $DataDir contains a UTF-8
encoded string. The solution is to use the following replacements:

-f $name            IsFile($name)
-e $name            IsFile($name)
-d $name            IsDir($name)
(stat($name))[9]    Modified($name)
-M $name            $Now - Modified($name)
-z $name            ZeroSize($name)
unlink $name        Unlink($name)
mkdir $name         CreateDir($name)
rmdir $name         RemoveDir($name)

(Using IsFile for -e is probably not ideal?)

If you don’t, and Oddmuse gets used with Mojolicious, and you use the
Namespaces Extension, and a namespace contains non-ASCII characters such
as ä, ö, or ü, these characters will end up as part of $DataDir and
trigger the problem.

I also wonder whether we should be using some other Perl library.
2016-06-17 14:49:34 +02:00
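The replacement table above could be implemented as thin wrappers that encode the filename to bytes before every file-system call. This is only a sketch under that assumption; the helper name NameToBytes is made up, and the actual Oddmuse helpers may differ:

```perl
use strict;
use warnings;
use utf8;   # the literal below contains non-ASCII characters

# Encode a possibly decoded filename to UTF-8 bytes, non-destructively.
sub NameToBytes { my $name = shift; utf8::encode($name); return $name }

sub IsFile    { -f NameToBytes($_[0]) }
sub IsDir     { -d NameToBytes($_[0]) }
sub ZeroSize  { -z NameToBytes($_[0]) }
sub Modified  { (stat(NameToBytes($_[0])))[9] }
sub Unlink    { unlink NameToBytes($_[0]) }
sub CreateDir { mkdir NameToBytes($_[0]) }
sub RemoveDir { rmdir NameToBytes($_[0]) }

# A namespace-like directory name with an umlaut, as in the commit message:
CreateDir("Anhänge");
print IsDir("Anhänge") ? "dir exists\n" : "missing\n";
RemoveDir("Anhänge");
```

With plain -d, the decoded string would be passed to the OS as-is and the test could fail for non-ASCII names; the wrappers make the encoding step impossible to forget.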
Alex Schroeder
2111af2ec8 Fix regular expression in Makefile
An unescaped left brace is illegal in a regular expression.
2016-06-16 09:43:08 +02:00
Alex Schroeder
648e6eb9bc Skip pygmentize if the binary is not found 2016-06-15 15:07:15 +02:00
Alex Schroeder
994b4e8051 Tests rely on English output
Set environment variable to en_US.UTF-8.
2016-06-15 14:47:20 +02:00
Alex Schroeder
f2f464b1ca test.pl: no warning about killing the server 2016-06-15 10:32:58 +02:00
Alex Schroeder
119d11b405 Merge branch 'master' of github.com:kensanata/oddmuse 2016-06-15 10:31:06 +02:00
Alex Schroeder
d7031198cd Merge github.com:kensanata/oddmuse 2016-06-14 13:14:12 +02:00
Alex Schroeder
187d4020f5 Make server.pl compatible with Alexine 2016-06-14 13:07:29 +02:00
Alex Schroeder
0a77bd0b47 All access to the file system needs bytes!
All occurrences of the form on the left turn into the form on the right:
-f $name		IsFile($name)
-e $name		IsFile($name)
-d $name		IsDir($name)
(stat($name))[9]	Modified($name)
-M $name		$Now - Modified($name)
unlink $name		Unlink($name)
mkdir $name		CreateDir($name)
rmdir $name		RemoveDir($name)

This change is incomplete. All the modules also need to be changed.
The benefit of this change is that t/mojolicious-namespaces.t passes.
2016-06-13 22:28:52 +02:00
Alex Schroeder
cb00e7e969 Prevent warning by using 127.0.0.1
Using localhost leads to a warning on my Debian Wheezy system.
2016-06-12 22:29:49 +02:00
Alex Schroeder
1b2fe0d713 A test for Mojolicious + Namespaces
Currently this fails if the namespace contains non-ASCII characters.
This failing test has been wrapped in a TODO.
2016-06-12 22:24:23 +02:00
Alex Schroeder
8e73f6f0dd Use /wiki in $ScriptName for Mojolicious test 2016-06-12 21:26:18 +02:00
Alex Schroeder
d3c7b45ad9 Simplify Mojolicious server setup
Make sure the wiki log is written into the data directory.
2016-06-12 21:25:28 +02:00
Alex Schroeder
fee15fd880 Simplify namespaces.pl initialization 2016-06-12 21:24:43 +02:00
Alex Schroeder
196b960b47 Add test for Mojolicious server 2016-06-12 17:32:20 +02:00
Alex Schroeder
241a88ef48 Move server starting code from atom.t to test.pl 2016-06-11 20:08:04 +02:00
Alex Schroeder
fdf0c2711b Added a test for stuff/server.pl 2016-06-11 14:15:36 +02:00
Alex Schroeder
239a95e683 Fix typo: substring → substr 2016-06-05 18:00:01 +02:00
Alex Schroeder
d3205d2425 INS is italic, not red 2016-05-28 21:09:55 +02:00
Alex Schroeder
681ba8068c Add Spanish national days 2016-05-28 13:58:57 +02:00
Alex Schroeder
d5429d276f load-lang.pl: Fix code and add test
If your Accept-Language settings include something for which no file
exists (such as "de-ch"), the test for the file would
succeed (it tested for the existence of "$LoadLanguageDir/") even though
no actual file would be loaded. This is fixed. Also: implemented actual
testing.
2016-05-28 09:30:59 +02:00
Alex Schroeder
746b10be81 Some fixes for Oddmuse Mode (Emacs) 2016-05-28 09:23:36 +02:00
Alex Schroeder
b9aa27e406 How to replace all your pictures from Flickr with local copies 2016-05-13 17:11:48 +02:00
Alex Schroeder
00cf277156 How to replace all your pictures from Flickr with local copies 2016-05-13 17:10:28 +02:00
Alex Schroeder
75ce7d745e expire-bans.pl is new 2016-03-03 15:04:48 +01:00
Alex Schroeder
ee1bbca5c9 Don't use both $wiki and $1 2016-02-16 17:39:24 +01:00
Alex Schroeder
0effc86620 added delete.sh
I've been using this to clear out wikis on campaignwiki.org
2016-02-16 16:42:37 +01:00
Aleks-Daniel Jakimenko-Aleksejev
8a36970b24 static-copy.pl: Fix non-ascii links
It seems like it operates on URL-encoded strings, so all we have to
do is decode them. This, however, does not mean that we should print
decoded strings: the ‘href’ attribute still has to be encoded (or so it seems).
2016-02-02 20:22:06 +02:00
Alex Schroeder
8be87ede99 light.css: change summary to not bold 2016-01-29 14:16:11 +01:00
Alex Schroeder
d61dd71627 New CSS for alexschroeder.ch 2016-01-17 22:29:51 +01:00
Alex Schroeder
755f742088 sidebar.t: writing more tests
Trying to find the problem I have on one of the sites.
2016-01-02 12:21:54 +01:00
Aleks-Daniel Jakimenko-Aleksejev
0107e41123 load-lang.pl: Add Estonian translation 2015-12-30 15:45:41 +02:00
Ain Laidoja
e58c8c2192 Estonian translation 2015-12-30 15:11:57 +02:00
Alex Schroeder
ee4518da9e release: clean up and checkout master 2015-12-19 12:01:04 +01:00
Alex Schroeder
64e7183896 stuff/release is new 2015-12-19 11:54:19 +01:00
Alex Schroeder
bd2715a35e DoSearch: Add link to actually DO the replacement 2015-12-16 18:19:33 +01:00
Alex Schroeder
986e4fc65f Replace: no pagination
The old code would only call the function provided
if pagination indicated that it was OK to do so. Thus,
only the first ten pages got replaced. This has been
fixed and tests have been added.
2015-12-16 15:50:35 +01:00
Alex Schroeder
908cecffb9 atom.t: fix warning 2015-12-16 10:04:56 +01:00
IngoBelka
07226ae7a1 typo 2015-11-29 10:25:15 +01:00
IngoBelka
c90258ef4b supplementing the German translation 2015-11-29 10:10:47 +01:00
Aleks-Daniel Jakimenko-Aleksejev
f500092a6a Alexine IRC bot 2015-11-19 23:25:18 +02:00
Alex Schroeder
d1f1f65c9b green.css: no dashed line between sister sites 2015-11-18 09:58:20 +01:00
Alex Schroeder
aae0cb6379 light.css: no bold summary for recent changes 2015-11-15 21:58:58 +01:00
Alex Schroeder
2b0a0d9a14 word-count.pl: fix regular expression for dates 2015-11-15 21:58:58 +01:00
Aleks-Daniel Jakimenko-Aleksejev
69fcb9646b Fix tests for default stylesheet link
Also, get rid of redundant “www”.
2015-11-03 03:21:27 +02:00
Aleks-Daniel Jakimenko-Aleksejev
a0bf615960 Alexine: some successful commits are major
If previous commit had some failing tests then Alexine will announce that on
the wiki. When the problem is solved we probably don't want to see scary
messages in Recent Changes, so next successful commit should be announced (it
should not be a minor edit).
2015-11-03 03:03:13 +02:00
Aleks-Daniel Jakimenko-Aleksejev
c024f553fd Default css should be retrieved over secure connection
Everything on oddmuse.org now redirects to https, which means that every wiki
that is using default style sheet requires two requests to get the css file.
2015-11-02 20:57:49 +02:00
Aleks-Daniel Jakimenko-Aleksejev
c97d6a576f wiki.css: We are no longer using these fonts
There is no need to define fonts that we do not use anyway.
2015-11-02 20:50:52 +02:00
Alex Schroeder
c64095fd95 scripts: renamed two scripts 2015-10-31 15:15:36 +01:00
Aleks-Daniel Jakimenko-Aleksejev
542f552002 nosearch.pl: code style
It seems like oddtrans does not pick up the strings with double quotes?
It's weird, but it does not matter anyway, because we will switch to gettext
sooner or later.
2015-10-31 02:58:00 +02:00
Aleks-Daniel Jakimenko-Aleksejev
22017a24f2 Updates to Russian translation (70% → 85%) 2015-10-31 02:56:39 +02:00
Alex Schroeder
6ac7093e9f word-count.pl: new 2015-10-30 23:39:52 +01:00
Aleks-Daniel Jakimenko-Aleksejev
89a23a6ac5 Full support for arrayref in $StyleSheet 2015-10-26 01:03:42 +02:00
Aleks-Daniel Jakimenko-Aleksejev
ed17476aeb Fix the number of tests in css.t 2015-10-26 00:34:57 +02:00
Aleks-Daniel Jakimenko-Aleksejev
1b951c66f1 Allow multiple stylesheet files in $StyleSheet
Since $StyleSheet is a scalar, you can't pass multiple values, but you can
now set it to one array ref. For example:

$StyleSheet = ['http://example.org/test.css', 'http://example.org/another.css'];
2015-10-26 00:27:14 +02:00
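A consumer of $StyleSheet can normalize both forms (plain scalar or array ref) in one line. This sketch is illustrative and not the actual Oddmuse code:

```perl
use strict;
use warnings;

# $StyleSheet may be a single URL or an array ref of URLs, as in the commit.
my $StyleSheet = ['http://example.org/test.css', 'http://example.org/another.css'];

my @sheets = ref $StyleSheet eq 'ARRAY' ? @$StyleSheet : ($StyleSheet);
print qq{<link type="text/css" rel="stylesheet" href="$_" />\n} for @sheets;
```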
Aleks-Daniel Jakimenko-Aleksejev
878d99a84c New script translations-stats
Basically copied from the README file. Now you can use it instead of pasting
very long lines into your terminal.
2015-10-23 00:55:21 +03:00
Aleks-Daniel Jakimenko-Aleksejev
c5ec3d782c oddtrans: print # on the last line
Otherwise we will get an odd number of elements if the last string has no
translation (it seems like Perl trims trailing empty lines).

Also: strictures and formatting
2015-10-23 00:25:38 +03:00
Aleks-Daniel Jakimenko-Aleksejev
a28276b868 More fixes for Spanish translation 2015-10-21 16:57:27 +03:00
Aleks-Daniel Jakimenko-Aleksejev
a8920bfec1 Some fixes for Spanish translation 2015-10-21 16:06:13 +03:00
Aleks-Daniel Jakimenko-Aleksejev
0e5f338b40 Prioritize slow tests
By using 「--state=slow,save」 we can probably crunch through all tests faster
(better wallclock time). Some tests take a lot of time simply because of
delays (sleeping), so it makes sense to start these tests earlier.
2015-10-21 13:15:34 +03:00
Aleks-Daniel Jakimenko-Aleksejev
4a7e50e83e Fix tests ($ShowAll)
$ShowAll was not added to 「our」.
2015-10-21 12:15:25 +03:00
Aleks-Daniel Jakimenko-Aleksejev
608440553b Alexine repo is different now 2015-10-21 06:00:57 +03:00
Aleks-Daniel Jakimenko-Aleksejev
e493652e96 Alexine updates according to the new directory structure
Also, use 8 threads for 「make test」
2015-10-21 05:39:05 +03:00
Aleks-Daniel Jakimenko-Aleksejev
d286267d52 translations/README deleted
This file has to be regenerated periodically; Alexine will do that.
Also, these long one-liners should be separated into scripts.
I am deleting this file because I'm not willing to update it.
2015-10-21 05:19:56 +03:00
Alex Schroeder
27c5c5fa79 Translation: fix 'non existent' for Spanish, too 2015-10-20 17:31:42 +02:00
Alex Schroeder
9f7cd0bfc7 Translation: 'non existant' to 'nonexisting' 2015-10-20 17:19:02 +02:00
Matias A. Fonzo
6a45d51189 Updates to Spanish translation
This is a general revision of the current file, and a massive upgrade
since the last dated revision (2002). Some sentences have been formalized
and clarified, corrections included, completing the translation.
2015-10-20 16:51:28 +02:00
Aleks-Daniel Jakimenko-Aleksejev
5d99cb5874 Typo in German translation (liefert repeated twice) 2015-10-19 23:21:44 +03:00
Aleks-Daniel Jakimenko-Aleksejev
7f0f8164bd Biscecting -> Bisection (typo) 2015-10-19 23:17:39 +03:00
Aleks-Daniel Jakimenko-Aleksejev
236e6a4c85 Updates to Russian translation 2015-10-19 23:12:58 +03:00
Aleks-Daniel Jakimenko-Aleksejev
669043e7a9 More fixes for $ShowAll
Some modules had to be fixed too!
'showedit' and 'rollback' are not used in modules, so there's nothing to fix.
2015-10-18 03:14:57 +03:00
Aleks-Daniel Jakimenko-Aleksejev
e57372692e “link blow” → “link below” 2015-10-18 02:48:06 +03:00
Aleks-Daniel Jakimenko-Aleksejev
512afd75a0 Fixed $ShowAll and $ShowEdits, added $ShowRollbacks
It seems like the $ShowEdits feature was half broken (not all occurrences
actually defaulted to its value). In the last commit I made the same mistake
with $ShowAll. This is now fixed.
Also, for completeness, I decided to add $ShowRollbacks as well.
2015-10-18 01:21:07 +03:00
Aleks-Daniel Jakimenko-Aleksejev
b5c51d19ba New $ShowAll variable
Sometimes you might want “List all changes” and “Include minor
changes” by default. We could already change the default value of the
latter using the $ShowEdits variable, but the former was unsettable from
the config file. Now we have the $ShowAll variable.
2015-10-18 00:49:57 +03:00
Alex Schroeder
0955dcbc97 make translations
This adds the subheaders to all the translation files.

for f in modules/translations/*-utf8.pl; do
  perl -e "sub AddModuleDescription { print shift, ' ' };
           do '$f';
           \$i = 0;
           map { \$_ || \$i++ } values %Translate;
           printf(qq{%d/%d translations missing\n}, \$i, scalar keys %Translate);";
done

brazilian-portuguese-utf8.pl 237/672 translations missing
bulgarian-utf8.pl 496/672 translations missing
catalan-utf8.pl 259/672 translations missing
chinese-utf8.pl 254/672 translations missing
chinese_cn-utf8.pl 199/672 translations missing
dutch-utf8.pl 459/672 translations missing
finnish-utf8.pl 436/672 translations missing
french-utf8.pl 177/672 translations missing
german-utf8.pl 13/672 translations missing
greek-utf8.pl 242/672 translations missing
hebrew-utf8.pl 555/672 translations missing
italian-utf8.pl 425/672 translations missing
japanese-utf8.pl 5/237 translations missing
korean-utf8.pl 393/672 translations missing
fixme-utf8.pl 672/672 translations missing
polish-utf8.pl 239/672 translations missing
portuguese-utf8.pl 389/672 translations missing
romanian-utf8.pl 514/672 translations missing
russian-utf8.pl 297/672 translations missing
serbian-utf8.pl 526/672 translations missing
spanish-utf8.pl 246/672 translations missing
swedish-utf8.pl 372/672 translations missing
ukrainian-utf8.pl 347/672 translations missing
2015-10-17 23:00:26 +02:00
Alex Schroeder
5ee3adf13f Switched to Palatino
On a phone with little RAM, downloading Noticia takes too much time.
2015-10-16 08:44:17 +02:00
Alex Schroeder
5a237d05f7 Translation: "为 %s 建立锁定 。\t"
Removed trailing tab.
2015-10-15 20:12:19 +02:00
Alex Schroeder
bd5d419472 Translation: " . . . . "
Removed trailing whitespace.
2015-10-15 20:05:23 +02:00
Alex Schroeder
cdb66e1ed4 Translation: " ... " is no longer translatable 2015-10-15 19:35:08 +02:00
Alex Schroeder
d5cd6cbd65 Translations: removed trailing whitespace
Various translation files still had translated strings with trailing
whitespace where the English original did not have any. This has been
removed.
2015-10-15 19:32:12 +02:00
Alex Schroeder
e85ddcc9b9 Translation: "Comments on " and "Comment on"
Removed trailing whitespace.
2015-10-15 19:29:35 +02:00
Alex Schroeder
ac4948ca5d Translations: removed trailing whitespace
Various translation files still had translated strings with trailing
whitespace where the English original did not have any. This has been
removed.
2015-10-15 19:23:42 +02:00
Alex Schroeder
6bffdc8149 Trans.: "Consider banning the IP number as well: "
Remove trailing whitespace. In the German translation, I also replaced a
few instances of "sie" with "Sie".
2015-10-15 19:18:55 +02:00
Alex Schroeder
db814c627a Transl.: "Module count (only testable modules): "
Remove trailing whitespace.
2015-10-15 19:15:18 +02:00
Alex Schroeder
b156e08d85 Translation: "Summary of your changes: "
Remove trailing whitespace.
2015-10-15 19:13:13 +02:00
Alex Schroeder
cad08ee17c Translation: "Define external redirect: "
Removed trailing whitespace.
2015-10-15 19:10:59 +02:00
Alex Schroeder
7c04ee83e5 Translation: "Internal Page: "
This string was not used correctly. T('Internal Page: ' . $resolved)
means that the string is concatenated with $resolved and then it will be
translated. This means that the translation will never be found. The
correct usage is as follows: Ts('Internal Page: %s', $resolved). The
translation string has therefore been changed to 'Internal Page: %s' and
the translation files have been fixed accordingly.
2015-10-15 19:07:51 +02:00
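The difference can be demonstrated with toy versions of T and Ts; the translation table here is made up, and the real functions are more involved:

```perl
use strict;
use warnings;

my %Translate = ('Internal Page: %s' => 'Interne Seite: %s');

sub T  { $Translate{$_[0]} || $_[0] }                  # look up, fall back to English
sub Ts { my ($s, @rest) = @_; sprintf(T($s), @rest) }  # translate, then interpolate

my $resolved = 'HomePage';
# Wrong: the concatenated string is not a key in %Translate, so lookup fails.
print T('Internal Page: ' . $resolved), "\n";   # untranslated English
# Right: the template string is the key; %s is filled in afterwards.
print Ts('Internal Page: %s', $resolved), "\n"; # translated
```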
Alex Schroeder
844d984526 Translation: ", see "
Removed trailing whitespace.
2015-10-15 15:01:51 +02:00
Alex Schroeder
e91797fcba Translation: "Name: " and "URL: "
Removed trailing whitespace.
2015-10-15 14:55:54 +02:00
Alex Schroeder
8ad1c60817 Translation: remove trailing whitespace
The German translation contained a stray trailing whitespace.
2015-10-15 14:47:49 +02:00
Alex Schroeder
e24f853bef Translation: "Email: "
Removed trailing whitespace.
2015-10-15 14:41:55 +02:00
Alex Schroeder
cfceb84cc6 Translation: "Trail: "
Removed trailing whitespace.
2015-10-15 14:36:51 +02:00
Alex Schroeder
e81234d81f Translation: "Translated page: "
Removed trailing whitespace.
2015-10-15 14:34:28 +02:00
Alex Schroeder
5a647e6042 Translation: "This page is a translation of %s. "
Removed trailing whitespace.
2015-10-15 14:27:17 +02:00
Alex Schroeder
a53f3e390f Translation: "Title: "
Removed trailing whitespace.
2015-10-15 14:24:08 +02:00
Alex Schroeder
d9d213b6b3 Translation: "Tags: "
Removed trailing whitespace.
2015-10-15 14:19:16 +02:00
Alex Schroeder
09c5351a11 Translation: fix Return to
This used to be a string with a trailing whitespace, but the actual use was
wrong: T('Return to ' . NormalToFree($id)) – this concatenates the page
name and then attempts to translate the result, which never works. The
correct usage is Ts('Return to %s', NormalToFree($id)).
2015-10-14 12:39:55 +02:00
Alex Schroeder
f0fc2f2f29 Translation: File to upload: without trailing SPC 2015-10-14 12:35:23 +02:00
Alex Schroeder
7d6138107f Translation: not deleted: without trailing space 2015-10-14 12:32:25 +02:00
Alex Schroeder
446f587a49 Translation: Cookie: no longer required
With or without trailing space, this text is no longer required.
2015-10-14 12:28:22 +02:00
Alex Schroeder
0872ee501e Translation: %s: without trailing space
We can still translate %s: to %s : for the French.
2015-10-14 12:24:31 +02:00
Alex Schroeder
e4f7500340 month-names, national-days: delete cruft
An error in the Makefile treated all *.pl files in the translations
directory as translation files -- including these other files which are
not regular translation files.
2015-10-14 12:23:18 +02:00
Alex Schroeder
a429bb6a4b alex-2015.css: switched fonts
Removed the @font-face rules that downloaded Noticia Text and Symbola
from the net. This was slowing down access from old mobile phones.
Instead, I'm now using a font-family of "Palatino Linotype", "Book
Antiqua", Palatino, serif.
2015-10-14 10:29:45 +02:00
Aleks-Daniel Jakimenko
ce2a39d8f1 Allow custom setting for --jobs in make test 2015-10-13 03:21:11 +03:00
Aleks-Daniel Jakimenko
81a4dbcdcd Alexine: more comfortable default paths 2015-10-13 02:26:00 +03:00
Alex Schroeder
33a3f515a3 big-brother.t: more robust under heavy load
The previous fix was no good. Now attempting a different fix.
2015-10-12 15:45:09 +02:00
Alex Schroeder
6372907c4b Parallelize tests
Using random numbers to generate new test-directories for every test
file. Use t/setup.pl to reset.
2015-10-12 15:13:22 +02:00
Alex Schroeder
a8b7b67efe Use warnings
search.t: Using braces without escaping them in regular expressions
triggers a warning. wiki.pl will now quotemeta the replacement string
when highlighting changes.

upgrade-files.t: Only remove the old UseMod directory if it actually
exists in order to fix some warnings.

wiki.pl: only read the log file when open actually succeeded, in order
to fix some warnings.
2015-10-12 15:12:20 +02:00
Alex Schroeder
b0f9722857 recaptcha.t: use Captcha::reCAPTCHA always
This module is no longer optional. The test will not skip.
2015-10-12 15:09:15 +02:00
Alex Schroeder
d5fda299b0 Some tests now more robust under heavy load
When running tests with four jobs on a laptop with just two cores, the load
is heavy and some tests may fail. Trying to make them more robust...

- big-brother.t
- captcha.t
2015-10-12 15:08:08 +02:00
Alex Schroeder
725e121731 meta.t: perl -c only takes one file 2015-10-12 14:48:51 +02:00
Alex Schroeder
ad299b6b1d server.pl: instructions for the plugin to use
I've submitted some patches for Mojolicious::Plugin::CGI and the
comments now point to a fork.
2015-10-11 20:02:37 +02:00
Aleks-Daniel Jakimenko
8439566b01 aawrapperdiv.pl deleted 2015-10-11 12:19:29 +03:00
Alex Schroeder
ef4263cf03 Adding wiki.log to .gitignore 2015-10-09 00:48:07 +02:00
Alex Schroeder
464a6e9af1 server.pl is a Mojolicious server for Oddmuse 2015-10-09 00:47:31 +02:00
Aleks-Daniel Jakimenko
7152fa0a54 WikiConfigFile and WikiModuleDir ENV variables
Currently the config file and modules are supposed to be in $DataDir,
which does not make any sense from a security point of view. Files with
code should not be in directories that are writable by www-data.

Previously you had to use a wrapper script to work around that. Now we
provide special variables.

Please note that Oddmuse will sometimes cache data using Storable.
Such a cache is saved to disk and then read back when required. This,
however, is an insecure operation, given the risk that the file may be
manipulated by www-data in a malicious way.
2015-10-06 04:53:21 +03:00
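The override described above presumably boils down to consulting %ENV before falling back to $DataDir. A sketch under that assumption (the fallback paths here are illustrative):

```perl
use strict;
use warnings;

# Environment variables win; otherwise fall back to paths under $DataDir.
my $DataDir    = $ENV{WikiDataDir}    || '/tmp/oddmuse';
my $ConfigFile = $ENV{WikiConfigFile} || "$DataDir/config";
my $ModuleDir  = $ENV{WikiModuleDir}  || "$DataDir/modules";

print "config: $ConfigFile\nmodules: $ModuleDir\n";
```

This way code can live outside the www-data-writable data directory, which is the security point the commit makes.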
Alex Schroeder
da5c5a8275 New make target: new-utf8.pl
Changes to oddtrans make sure that lines starting with # are comments
and will be stripped when the translation file is read. When writing new
translation files, comments are added to indicate which files are being
processed right now. This will help translators figure out where the
texts originated from. Note that every key appears only once, so
translations will be missing in the section for later files if they
appeared in earlier sections.

Recreated new-utf8.pl in order to illustrate the new format.
2015-09-30 17:35:02 +02:00
Aleks-Daniel Jakimenko
ee52a25ebf ban.t: fix number of tests 2015-09-30 16:27:40 +03:00
Aleks-Daniel Jakimenko
e831c10cd3 ban.t: removing more strange-spam tests 2015-09-30 16:00:30 +03:00
Aleks-Daniel Jakimenko
9a6da39aaf strange-spam.pl module was deleted, deleting tests as well 2015-09-28 11:24:58 +03:00
Aleks-Daniel Jakimenko
99b819dd68 strange-spam.pl: Module deleted
The wiki says that this module is obsolete. If it is, then there is no need to
keep it in our repo.
2015-09-28 11:21:18 +03:00
Aleks-Daniel Jakimenko
0b42ed0508 private-wiki.pl: missing our 2015-09-23 21:30:54 +03:00
Aleks-Daniel Jakimenko
3b6d891dc7 Stop leaving locks behind
Previously, if a user cancelled their request (simply by pressing the Stop
button in their browser), the script would receive a TERM signal or the like.

This means that some locks could be left behind, which required someone
to unlock the wiki manually (by using Unlock Wiki action).

Now we remove these locks automatically!

However, some tasks might want to handle such situations gracefully. That's why
the %LockCleaners hash was added. Use the lock name as the key and a coderef as
the value. If SIGTERM (or one of a bunch of other signals) is received, the code
will run and, supposedly, clean everything up. The Private Wiki
Extension was changed accordingly, so you can see it in action.

Also, tests added!
2015-09-23 21:07:02 +03:00
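The %LockCleaners mechanism might look roughly like this sketch (simplified; ReleaseLock is a stand-in for whatever actually removes the lock directory, and the handler body is illustrative):

```perl
use strict;
use warnings;

my $cleaned = 0;
our %LockCleaners = (
  'main' => sub { $cleaned++; print "cleaning up after the main lock\n" },
);

sub ReleaseLock { print "releasing lock '$_[0]'\n" }  # stand-in, not the real code

# On SIGTERM and friends: run each registered cleaner, then drop its lock.
sub CleanupOnSignal {
  for my $lock (sort keys %LockCleaners) {
    $LockCleaners{$lock}->();
    ReleaseLock($lock);
  }
}
$SIG{TERM} = $SIG{INT} = \&CleanupOnSignal;

kill 'TERM', $$;   # simulate the browser's Stop button killing the process
```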
Alex Schroeder
e77abbc09f big-brother.pl: Make sure restricted URL is used 2015-09-21 17:56:02 +02:00
Alex Schroeder
61a2238d9a Rollback fixes for $KeepDays = 0
RollbackPossible needs to handle the situation where $KeepDays == 0.
DoRollback used to examine all the pages where rollback was
possible (using $KeepDays) but in order to avoid the special case where
$KeepDays == 0, we can also examine all the pages changed after the
target timestamp $to.
2015-09-21 09:58:38 +02:00
Alex Schroeder
0322deaf82 maintain.t: Fix for $KeepDays = 0 2015-09-21 09:42:30 +02:00
Alex Schroeder
044a6ad835 preview.pl: remove trailing whitespace 2015-09-21 09:41:00 +02:00
Alex Schroeder
86fe0193b7 preview.pl: fix typo
'$id' is not interpolated...
2015-09-21 09:39:40 +02:00
Alex Schroeder
2517928c1e conflict.t: typo 2015-09-21 09:34:00 +02:00
Alex Schroeder
971f4b1579 history.t: Fix for $KeepDays = 0 2015-09-21 09:27:48 +02:00
Alex Schroeder
c0f0b970a6 preview.pl: Add admin menu 2015-09-21 09:24:53 +02:00
Alex Schroeder
fa493d7360 preview.pl: New module to report changes in HTML 2015-09-21 09:12:32 +02:00
Alex Schroeder
831de74800 Print cache even if just have a single clean block
Once again, bitten by Perl. We used to print a cache "if ($Page{blocks}
and $Page{flags} and GetParam('cache', $UseCache) > 0)" -- but if we
have exactly one clean block, then flags will be "0", which is false. So
now we test for defined $Page{flags}.
2015-09-21 09:09:14 +02:00
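The pitfall is easy to reproduce: Perl treats the string "0" as false, so a truthiness test on the flags value misbehaves for exactly one clean block. A minimal illustration (the %Page keys mirror the commit message; the rest is made up):

```perl
use strict;
use warnings;

my %Page = (blocks => '<p>whole page is one clean block</p>', flags => '0');

my $old = ($Page{blocks} && $Page{flags})         ? 'print cache' : 'skip cache';
my $new = ($Page{blocks} && defined $Page{flags}) ? 'print cache' : 'skip cache';

print "old test: $old\n";   # "0" is false, so the cache was skipped
print "new test: $new\n";   # defined() catches this case
```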
Alex Schroeder
755010f619 rollback.t: Fix tests given $KeepDays default
As $KeepDays now defaults to 0, more changes are required to make these
tests work again.
2015-09-20 14:03:18 +02:00
Alex Schroeder
44c7102dd5 rollback.t: fix many tests
This is done by appending "$KeepDays = 14" to the config file whenever
it gets written. This restores the old behaviour and thus "fixes" the
tests.
2015-09-20 11:44:37 +02:00
Aleks-Daniel Jakimenko
e731c16214 Fix crossbar.t after 858ff72
It seems like the test needs an empty page (not exactly HomePage,
which now returns "Welcome!").
2015-09-19 23:02:29 +03:00
Aleks-Daniel Jakimenko
b3b6eeb2bd Explanation for $KeepDays = 0 (0 means forever) 2015-09-19 22:40:12 +03:00
Aleks-Daniel Jakimenko
fc8c0e66a7 Let's have a working page history (no more ForgiveAndForget by default)
There was a huge discussion with a lot of tension:
https://oddmuse.org/wiki/Revolutionary_Changes#ForgiveAndForget
And also the comments:
https://oddmuse.org/wiki/Comments_on_Revolutionary_Changes#ForgiveAndForget

But in the end, it is safer to have a history which is not broken.
Don't get it wrong, ForgiveAndForget is still a good thing, it's just not what
we should do *by default*.

If your wiki does benefit from ForgiveAndForget, then add this to your config:
$KeepDays = 14;

Although this change solves a couple of important problems, it does not address
new ones that arise from having no ForgiveAndForget. Namely, it does not
solve the problem of deleting stuff when you *really* have to. For
example, a [[DMCA Extension]] (or a similarly named extension with the same
purpose) should be developed. These problems have existed for a long time,
because people were using “$KeepDays = 0” a lot. It is just that we now
support wikis without ForgiveAndForget more thoroughly.

In other words, this commit is just part of the bigger change.

Why don't we set it to 5 years? Because then it will be a time bomb that will
be triggered unexpectedly. We should have a more predictable default value.
2015-09-19 22:18:15 +03:00
Alex Schroeder
6207434f19 Make sure the "Welcome!" message is shown 2015-09-19 18:30:51 +02:00
Aleks-Daniel Jakimenko
9ecfe306cb load-lang.pl: missing translations, meta test 2015-09-19 08:28:05 +03:00
Aleks-Daniel Jakimenko
a4dd2b8b0a load-lang.pl: Use $ModuleDir/translations by default
Modules are not loaded recursively, so we are free to use any directory inside
$ModuleDir. It is also where translations are located in the git repo.

Also, %library was renamed to %TranslationsLibrary (which is now "our"). This
is required for tests and for custom configuration.
2015-09-19 07:57:33 +03:00
Aleks-Daniel Jakimenko
0868f3a98e Workaround for utf8::decode bug (sometimes utf8 chars were not decoded)
Remember the problem with toc.pl when the whole page was *sometimes* not
utf8-decoded? There were some thoughts that it might be associated with
memory files, and that is correct. Although I was not able to narrow it down
last time, now I have (simply because this problem appeared elsewhere).

If you look at $output variable after utf8::decode with Devel::Peek, you
will see two variants of flags. This one looks good:
   FLAGS = (PADMY,POK,pPOK,UTF8)
And this one is wrong:
   FLAGS = (PADMY,POK,pPOK)
This problem is weird because it occurs inconsistently. Most of the time
you will get correct output, but sometimes it will be broken.

Someone has to golf it down to something short in order to submit a Perl
bug report. This, however, does not look like a simple task.

The current workaround is as stupid as it looks, but it works.
Somehow assigning the value to another variable solves the problem (which, by
the way, is similar to how other Perl string-related problems get solved).
2015-09-16 04:13:02 +03:00
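The workaround, as described, is just to decode a copy instead of decoding the variable in place. The bug itself is not reproduced here; this sketch only shows the shape of the fix:

```perl
use strict;
use warnings;

my $output = "na\xc3\xafve";   # the raw UTF-8 bytes for "naïve"

# Workaround: assign to another variable first, then decode the copy.
my $copy = $output;
utf8::decode($copy);

printf "%d bytes in, %d characters out\n", length($output), length($copy);
```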
Aleks-Daniel Jakimenko
316471b145 server.pl: just whitespace 2015-09-15 04:13:24 +03:00
Alex Schroeder
586972c71d atom.t: Use our own server for this test
We no longer require an existing webserver running a wiki at
http://localhost/cgi-bin/wiki.pl. Instead, we're running our own
stuff/server.pl on a random port and use it for testing -- and we kill
it when we're done.
2015-09-14 13:01:22 +02:00
Alex Schroeder
f6954c4a2e server.pl simplified and with license 2015-09-14 12:40:09 +02:00
Alex Schroeder
3b0d8c9bd6 server.pl: a stand-alone Oddmuse wiki server 2015-09-13 22:40:05 +02:00
Alex Schroeder
3e60aa8e1b atom.pl: use PUTDATA 2015-09-12 19:06:59 +02:00
Alex Schroeder
b3f865a4ab Make Atom tests mandatory
You must have a wiki running at http://localhost/cgi-bin/wiki.pl and it
must have the Atom Extension installed.
2015-09-12 19:06:09 +02:00
Alex Schroeder
64568025c9 Don't skip the tests if XML::Atom is missing 2015-09-12 18:42:27 +02:00
Alex Schroeder
9472a279ea Lots of tests for preview pagination 2015-09-12 00:05:30 +02:00
Alex Schroeder
e0fdeffc94 Link license 2015-09-11 22:18:38 +02:00
Aleks-Daniel Jakimenko
d1d70be583 No more trailing whitespace (again!), meta test added 2015-09-11 18:07:52 +03:00
Alex Schroeder
6260033669 Add pagination to search and replace preview 2015-09-11 15:15:00 +02:00
Alex Schroeder
57a4132512 Fix test for malformed regular expression search 2015-09-11 15:14:34 +02:00
Alex Schroeder
3e16b45dbb Fix ReplaceAndDiff calling convention
ReplaceAndDiff calls Replace, which loops over all pages. That's why we
don't need to call it from SearchTitleAndBody -- that makes our code
run way too often.
2015-09-11 14:34:01 +02:00
Alex Schroeder
21392f2f1b Fix issue with malformed regular expressions
If the regular expression cannot be compiled using eval { qr/$re/ } we
just use quotemeta($re) instead.
2015-09-11 14:12:29 +02:00
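The fallback described in this commit can be sketched as a small standalone Perl example (the sub name safe_pattern is illustrative, not the actual wiki.pl code):

```perl
use strict;
use warnings;

# Compile a user-supplied pattern as a regular expression; if it is
# malformed, fall back to matching the input literally via quotemeta.
sub safe_pattern {
    my $re = shift;
    my $compiled = eval { qr/$re/ };
    return $compiled if defined $compiled;
    my $quoted = quotemeta($re);   # escape all regex metacharacters
    return qr/$quoted/;
}

# A valid pattern is used as-is ...
print "regex\n" if "foobar" =~ safe_pattern("foo.*");
# ... while a malformed one like "foo(" only matches literally.
print "literal\n" if "foo(" =~ safe_pattern("foo(");
```

Searches with a broken pattern thus degrade to a literal-text search instead of dying.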
Alex Schroeder
0fd86ee60d Preview button for search and replace
Also, more use of $func->() instead of &$func() syntax.
2015-09-11 14:05:35 +02:00
Aleks-Daniel Jakimenko
fd42ebf9c3 With /x, # has a special meaning (escape it!) 2015-09-11 02:55:18 +03:00
Aleks-Daniel Jakimenko
3180e5b02a smarttitles.pl: allow patterns in #SUBURL
(with a colon) to get interlinks interpreted, but now any link pattern will
be parsed in regular #SUBURL.
2015-09-11 00:41:15 +03:00
Alex Schroeder
26d3852f30 campaignwiki.org uses HTTPS 2015-09-10 08:36:29 +02:00
Alex Schroeder
ad54fda317 light.css: validator says some stuff is invalid 2015-09-07 10:42:05 +02:00
Aleks-Daniel Jakimenko
dca0c75e34 AGPL for Alexine scripts 2015-09-07 05:00:20 +03:00
Aleks-Daniel Jakimenko
ca0f12697b askpage.pl: forgot to add some variables to our (...) 2015-09-07 03:47:23 +03:00
Aleks-Daniel Jakimenko
b8ae7e0817 askpage.pl: changed according to recent oddmuse changes 2015-09-07 03:42:32 +03:00
Aleks-Daniel Jakimenko
3e91bdc75e Do not tell people to write a comment if they're doing it
“There are no comments, yet. Be the first to leave a comment!” – that's what
you will see when you preview your comment on an empty page.

Since the user is already writing a comment, there is no need to say so.
Also, the message may look like it is part of the preview.

We no longer do that (after this commit). In other words, the preview should
look exactly like the resulting page.
2015-09-07 03:32:55 +03:00
Alex Schroeder
e25a621e6e Big changes to how diffs are generated
The original issue was that looking at all changes (action=rc all=1) the
resulting diff didn't always make sense if you clicked on the diff link.
It showed the difference between that revision and the current revision.
The PrintHtmlDiff sub was changed significantly to make it easier to
understand and to help fix this issue.

The drawback is that it now requires a new key in page and keep files:
lastmajorsummary. It goes with lastmajor and diff-major and records the
summary for that particular edit. As new changes will start recording
this new key, the change will slowly propagate in existing wikis.
Whenever you look at minor diffs, however, the existing summary key is
chosen. Plus, whenever you want to look at differences between
particular revisions, this is equivalent to looking at minor diffs. So
the only situation that is problematic is an edit history like the
following:

A - major change
B - major change (major diff, major summary, last major revision)
C - minor change

When looking at this page with diff=2, we want to show major diff, major
summary, last major revision. If B happened before this commit was
installed, the summary will be missing.
2015-09-06 14:32:36 +02:00
Alex Schroeder
de6a3f1d0c Add comment label back
Commit:8d4c15e removed our “Write your comment here:” label. This
commit adds it back.
2015-09-06 08:27:43 +02:00
Alex Schroeder
bf00a9ea04 Merge remote-tracking branch 'origin/return-objects' 2015-09-06 08:10:46 +02:00
Aleks-Daniel Jakimenko
f8ac7a2818 aawrapperdiv.pl: wrap PrintFooter correctly 2015-09-06 02:55:36 +03:00
Aleks-Daniel Jakimenko
1cd33b691c Fix for issue #1 on github
Changing everything to return objects is a worthy goal, but for now we have
taken enough destructive steps towards it. Therefore, this commit fixes the
problem in a backwards-compatible way (by adding one more parameter to the
signatures).

Note that this additional parameter is NOT a timestamp, it is a whole page
object. Which means that we are still moving towards our goal of using page
objects everywhere, this commit is just doing it in a backwards-compatible
way.
2015-09-06 01:10:29 +03:00
Aleks-Daniel Jakimenko
9d7e5b43c0 Test for Issue #1 on github 2015-09-05 23:54:43 +03:00
Alex Schroeder
ceca41d85c google-plus-one.pl: fix plusone action
Privacy Badger is acting up and I think we're better off creating the
buttons dynamically.
2015-09-04 14:00:44 +02:00
Aleks-Daniel Jakimenko
1c4e082755 Return objects where it begs for it
sub ParseData is fully backwards compatible. If some module runs it in list
context, then it will get a flattened hash as before. New code should
always run it in scalar context, though (everything in our code base
was changed accordingly).

sub GetTextRevision is not backwards compatible (don't let the “wantarray”
usage confuse you). Most modules do not touch that subroutine, so we are
probably fine (modules from our git repo that do use it were changed
accordingly).

“EncodePage(%$page)” looks wrong. It seems like we should change it to
accept a hash ref.
2015-09-04 04:55:48 +03:00
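The backwards-compatible return-value scheme described above (a hash reference in scalar context, a flattened hash in list context) can be sketched with a hypothetical parser; parse below is illustrative, not the actual ParseData:

```perl
use strict;
use warnings;

# Parse "key: value" lines into a hash; return a hash ref in scalar
# context and a flat key/value list in list context, so old callers
# that write  my %data = parse(...)  keep working.
sub parse {
    my %result = map { split /: /, $_, 2 } split /\n/, shift;
    return wantarray ? %result : \%result;
}

my $page = parse("title: HomePage\ntext: Hello");   # new style: hash ref
my %page = parse("title: HomePage\ntext: Hello");   # old style: flat hash
print $page->{title}, "\n";   # HomePage
print $page{text}, "\n";      # Hello
```

wantarray is false in scalar context, so the ternary picks the reference; list-context callers are unaffected.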
Alex Schroeder
aec340b401 rollback.t: add another 1s sleep
Trying to solve an issue: sometimes the test fails on Alex Daniel's
test server but never on Alex Schroeder's laptop. The output of Recent
Changes being tested has no rollback button for one of the page links.
Actually, the last six edits have no rollback button:

12:34 UTC (diff) MinorPage . . . . 127.0.0.1 – Rollback to 2015-09-01 12:34 UTC (minor)
12:34 UTC (diff) AnotherEvilPage . . . . 127.0.0.1 – Rollback to 2015-09-01 12:34 UTC (minor)
12:34 UTC (diff) OtherPage . . . . 127.0.0.1 – Rollback to 2015-09-01 12:34 UTC
12:34 UTC (diff) NicePage . . . . 127.0.0.1 – Rollback to 2015-09-01 12:34 UTC
12:34 UTC (diff) EvilPage . . . . 127.0.0.1 – Rollback to 2015-09-01 12:34 UTC
12:34 UTC (diff) MinorPage . . . . 127.0.0.1 – testerror (minor)

Note that this includes the "testerror" minor edit which is about to
be rolled back. Perhaps that's because this should hold in
RollbackPossible and it does not: $ts != $LastUpdate. $ts would be the
timestamp of the testerror edit and $LastUpdate would be the timestamp
of the rollback. I've added another 1s sleep between these two.
2015-09-02 13:41:12 +02:00
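The timing condition discussed here can be illustrated with a simplified stand-in for RollbackPossible (not the actual wiki.pl sub); with one-second timestamp resolution, two writes in the same second are indistinguishable, which is what the extra sleep works around:

```perl
use strict;
use warnings;

# Simplified: a rollback is offered only if the edit's timestamp
# differs from the wiki's last update. Timestamps have one-second
# resolution, so two writes within the same second collide.
sub rollback_possible {
    my ($ts, $last_update) = @_;
    return $ts != $last_update;
}

my $edit_ts = time();
# Same second as the last update: no rollback button.
print rollback_possible($edit_ts, $edit_ts) ? "rollback\n" : "blocked\n";
# One second later the timestamps differ and rollback is offered again.
print rollback_possible($edit_ts, $edit_ts + 1) ? "rollback\n" : "blocked\n";
```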
Alex Schroeder
3ea87c007d The parameter days must be numeric 2015-08-31 11:04:22 +02:00
Alex Schroeder
4d8b028e2d test for wiping comments with "0" and fix 2015-08-29 11:57:29 +02:00
Aleks-Daniel Jakimenko
31c02d6e95 oddmuse-quickstart: some progress 2015-08-26 07:05:44 +03:00
Aleks-Daniel Jakimenko
26bf8a3043 oddmuse-quickstart: progress (still not ready) 2015-08-25 07:09:37 +03:00
Aleks-Daniel Jakimenko
ac21a8e6a4 Group pages with comment pages in page index 2015-08-25 04:14:16 +03:00
Aleks-Daniel Jakimenko
a000937768 https links in README 2015-08-24 03:00:56 +03:00
Aleks-Daniel Jakimenko
4eef4d2d76 No more /o, modifiers sorted alphabetically 2015-08-23 21:22:12 +03:00
Alex Schroeder
92410a1f5c add-link.pl: Fix footer 2015-08-23 13:32:41 +02:00
Aleks-Daniel Jakimenko
aa89d08e08 atom.pl: use XML::Atom explicitly
So that it is easier to find the required dependency
2015-08-20 14:59:00 +03:00
Aleks-Daniel Jakimenko
244ddb5157 run-tests: fixed wrong git path 2015-08-20 07:19:06 +03:00
Aleks-Daniel Jakimenko
9c3456c963 run-tests: do push as well 2015-08-20 06:54:41 +03:00
Aleks-Daniel Jakimenko
ad9afbf5ba GPL license for Alexine scripts 2015-08-20 06:47:53 +03:00
Aleks-Daniel Jakimenko
bc079133f7 New script new-release (autoupdate source links) 2015-08-20 06:45:32 +03:00
Aleks-Daniel Jakimenko
69a0f3ed23 Alexine image 2015-08-19 11:28:30 +03:00
Aleks-Daniel Jakimenko
1fc3600329 run-tests: print only 7 characters of a commit 2015-08-19 11:19:31 +03:00
Aleks-Daniel Jakimenko
c1141cd610 run-tests: another repository link
This repository will not only hold test data, but it
will also have some other files associated with the Alexine bot.
2015-08-19 11:12:36 +03:00
Aleks-Daniel Jakimenko
300d86b2cd run-tests: fixed newlines 2015-08-19 11:10:48 +03:00
Aleks-Daniel Jakimenko
d609a857c0 run-tests: fixed typo, OK status edits are now minor 2015-08-19 10:57:31 +03:00
Aleks-Daniel Jakimenko
8e98298777 run-tests: fix wikiput path 2015-08-19 10:47:49 +03:00
Aleks-Daniel Jakimenko
0642fad8f8 Afterfix for 5462b21 (disallow minor comments)
Test added as well
2015-08-19 10:17:49 +03:00
Aleks-Daniel Jakimenko
d10d76c475 run-tests: secret key specified 2015-08-19 10:05:45 +03:00
Aleks-Daniel Jakimenko
8aa2f04995 New run-tests script (part of Alexine) 2015-08-19 08:53:21 +03:00
217 changed files with 14448 additions and 9321 deletions

4
.gitignore vendored

@@ -1,8 +1,10 @@
*~
/build/
\#*\#
/test-data
/test-data*
/Mac/pkg/
*.dmg
*.pkg
.DS_Store
wiki.log
.prove


@@ -3,7 +3,7 @@
# subdirectory.
VERSION_NO=$(shell git describe --tags)
TRANSLATIONS=$(wildcard modules/translations/[a-z]*.pl$)
TRANSLATIONS=$(wildcard modules/translations/[a-z]*-utf8.pl$)
MODULES=$(wildcard modules/*.pl)
BUILD=build/wiki.pl $(foreach file, $(notdir $(MODULES)) $(notdir $(TRANSLATIONS)), build/$(file))
@@ -19,9 +19,13 @@ build:
clean:
rm -rf build
prove t/setup.pl
release:
perl stuff/release ~/oddmuse.org
build/wiki.pl: wiki.pl
perl -lne "s/(\\\$$q->a\({-href=>'http:\/\/www.oddmuse.org\/'}, 'Oddmuse'\))/\\\$$q->a({-href=>'http:\/\/git.savannah.gnu.org\/cgit\/oddmuse.git\/tag\/?id=$(VERSION_NO)'}, 'wiki.pl') . ' ($(VERSION_NO)), see ' . \$$1/; print" < $< > $@
perl -lne "s/(\\\$$q->a\(\{-href=>'http:\/\/www.oddmuse.org\/'\}, 'Oddmuse'\))/\\\$$q->a({-href=>'http:\/\/git.savannah.gnu.org\/cgit\/oddmuse.git\/tag\/?id=$(VERSION_NO)'}, 'wiki.pl') . ' ($(VERSION_NO)), see ' . \$$1/; print" < $< > $@
build/%-utf8.pl: modules/translations/%-utf8.pl
perl -lne "s/(AddModuleDescription\('[^']+', '[^']+')\)/\$$1, 'translations\/', '$(VERSION_NO)')/; print" < $< > $@
@@ -38,11 +42,21 @@ build/month-names-%.pl: modules/translations/month-names-%.pl
build/%.pl: modules/%.pl
perl -lne "s/(AddModuleDescription\('[^']+', '[^']+')\)/\$$1, undef, '$(VERSION_NO)')/; print" < $< > $@
modules/translations/new-utf8.pl: wiki.pl $(MODULES)
cp $@ $@-old
perl stuff/oddtrans -l $@-old wiki.pl $(MODULES) > $@
rm -f $@-old
translations: $(TRANSLATIONS)
for f in $^; do \
echo updating $$f...; \
perl oddtrans -l $$f wiki.pl $(MODULES) > $$f-new && mv $$f-new $$f; \
perl stuff/oddtrans -l $$f wiki.pl $(MODULES) > $$f-new && mv $$f-new $$f; \
done
# Running four jobs in parallel, but clean up data directories without
# race conditions!
jobs ?= 4
test:
prove t
prove t/setup.pl
prove --jobs=$(jobs) --state=slow,save t


@@ -1,5 +1,5 @@
This is the README file distributed together with the
[[http://oddmuse.org/|Oddmuse]] script.
[[https://oddmuse.org/|Oddmuse]] script.
== Installing Oddmuse on a Debian System running Apache
@@ -82,7 +82,7 @@ putting their names in {{{[[double square brackets]]}}}.
Enjoy your wiki experience.
Visit http://www.oddmuse.org/ to learn more about the translation
Visit https://www.oddmuse.org/ to learn more about the translation
files and modules that are part of this package.
== Apache
@@ -136,7 +136,7 @@ simply restart it all:
sudo service apache2 graceful
}}}
----------------------------------------------------------------------
== License
Permission is granted to copy, distribute and/or modify this document
under the terms of the GNU Free Documentation License, Version 1.1 or
@@ -153,5 +153,7 @@ MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
General Public License for more details.
Both the GNU Free Documentation License, and the GNU General Public
License are distributed together with this script. See the files FDL
and GPL, respectively.
License are distributed together with this script. See the files
[[https://github.com/kensanata/oddmuse/blob/master/FDL|FDL]] and
[[https://github.com/kensanata/oddmuse/blob/master/GPL|GPL]],
respectively.


@@ -1,4 +1,4 @@
The files in this directory are used to run http://campaignwiki.org/
The files in this directory are used to run https://campaignwiki.org/
add-link.pl
===========
@@ -8,7 +8,7 @@ bookmark site: A few pages make up a big unordered list of links in
wiki format. add-link is a tool to help users contribute new links to
the list.
http://campaignwiki.org/wiki/LinksToWisdom/HomePage
https://campaignwiki.org/wiki/LinksToWisdom/HomePage
copy.pl
=======
@@ -17,7 +17,7 @@ This is used to copy the text from a web page to a wiki page. The idea
was to keep archive copies of cool pages somewhere. The Blog Archive
never got used, though.
http://campaignwiki.org/wiki/BlogArchive/HomePage
https://campaignwiki.org/wiki/BlogArchive/HomePage
monster-tag.pl
==============
@@ -25,7 +25,7 @@ monster-tag.pl
This is used to quickly tag many pages in the Monsters wiki. The
Monsters wiki hasn't been used in a long time, though.
http://campaignwiki.org/wiki/Monsters/HomePage
https://campaignwiki.org/wiki/Monsters/HomePage
submit.pl
=========
@@ -34,4 +34,4 @@ This used to be used to add sites to the Old School RPG Planet. The
aggregator was configured via a wiki page on the Planet wiki. It's now
abandoned.
http://campaignwiki.org/wiki/Planet/HomePage
https://campaignwiki.org/wiki/Planet/HomePage


@@ -292,6 +292,7 @@ sub main {
Init(); # read config file (no modules!)
$ScriptName = $site; # undo setting in the config file
$FullUrl = $site; #
InitPageVariables(); # call again: $ScriptName was wrong
binmode(STDOUT,':utf8');
$q->charset('utf8');
if ($q->path_info eq '/source') {

14
contrib/campaignwiki/delete.sh Executable file

@@ -0,0 +1,14 @@
#!/bin/bash
if test -z "$2" -o ! -z "$3"; then
echo "Usage: delete.sh USERNAME WIKI"
exit 1
fi
username=$1
wiki=$2
for p in $(curl "https://campaignwiki.org/wiki/$wiki?action=index;raw=1"); do
echo "Deleting: $p"
curl -F frodo=1 -F "title=$p" -F text=DeletedPage -F summary=Deleted -F username="$username" "https://campaignwiki.org/wiki/$wiki"
sleep 5
done

131
contrib/no-flickr.pl Normal file

@@ -0,0 +1,131 @@
#! /usr/bin/perl -w
# Copyright (C) 2005-2016 Alex Schroeder <alex@gnu.org>
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
use Modern::Perl;
use LWP::UserAgent;
use utf8;
binmode(STDOUT, ":utf8");
my $ua = LWP::UserAgent->new;
sub url_encode {
my $str = shift;
return '' unless $str;
utf8::encode($str); # turn to byte string
my @letters = split(//, $str);
my %safe = map {$_ => 1} ('a' .. 'z', 'A' .. 'Z', '0' .. '9', '-', '_', '.', '!', '~', '*', "'", '(', ')', '#');
foreach my $letter (@letters) {
$letter = sprintf("%%%02x", ord($letter)) unless $safe{$letter};
}
return join('', @letters);
}
sub get_raw {
my $uri = shift;
my $response = $ua->get($uri);
return $response->content if $response->is_success;
}
sub get_wiki_page {
my ($wiki, $id, $password) = @_;
my $parameters = [
pwd => $password,
action => 'browse',
id => $id,
raw => 1,
];
my $response = $ua->post($wiki, $parameters);
return $response->decoded_content if $response->is_success;
die "Getting $id returned " . $response->status_line;
}
sub get_wiki_index {
my $wiki = shift;
my $parameters = [
search => "flickr.com",
context => 0,
raw => 1,
];
my $response = $ua->post($wiki, $parameters);
return $response->decoded_content if $response->is_success;
die "Getting the index returned " . $response->status_line;
}
sub post_wiki_page {
my ($wiki, $id, $username, $password, $text) = @_;
my $parameters = [
username => $username,
pwd => $password,
recent_edit => 'on',
text => $text,
title => $id,
];
my $response = $ua->post($wiki, $parameters);
die "Posting to $id returned " . $response->status_line unless $response->code == 302;
}
my %seen = ();
sub write_flickr {
my ($id, $flickr, $dir, $file) = @_;
say "Found $flickr";
warn "$file was seen before: " . $seen{$file} if $seen{$file};
die "$file contains unknown characters" if $file =~ /[^a-z0-9_.]/;
$seen{$file} = "$id used $flickr";
my $bytes = get_raw($flickr) or die("No data for $id");
open(my $fh, '>', "$dir/$file") or die "Cannot write $dir/$file";
binmode($fh);
print $fh $bytes;
close($fh);
}
sub convert_page {
my ($wiki, $pics, $dir, $username, $password, $id) = @_;
say $id;
my $text = get_wiki_page($wiki, $id, $password);
my $is_changed = 0;
while ($text =~ m!(https://[a-z0-9.]+.flickr.com/(?:[a-z0-9.]+/)?([a-z0-9_]+\.(?:jpg|png)))!) {
my $flickr = $1;
my $file = $2;
write_flickr($id, $flickr, $dir, $file);
$is_changed = 1;
my $re = quotemeta($flickr);
$text =~ s!$re!$pics/$file!g;
}
if ($is_changed) {
post_wiki_page($wiki, $id, $username, $password, $text);
} else {
# die "$id has no flickr matches?\n$text";
}
sleep(5);
}
sub convert_site {
my ($wiki, $pics, $dir, $username, $password) = @_;
my @ids = split(/\n/, get_wiki_index($wiki));
for my $id (@ids) {
convert_page($wiki, $pics, $dir, $username, $password, $id);
}
}
our $AdminPass;
do "/home/alex/password.pl";
convert_site('https://alexschroeder.ch/wiki',
'https://alexschroeder.ch/pics',
'/home/alex/alexschroeder.ch/pics',
'Alex Schroeder',
$AdminPass);


@@ -38,9 +38,10 @@
;;; Code:
(eval-when-compile
(require 'cl)
(require 'sgml-mode)
(require 'skeleton))
'(progn
(require 'cl)
(require 'sgml-mode)
(require 'skeleton)))
(require 'goto-addr); URL regexp
(require 'info); link face
@@ -257,24 +258,6 @@ Example:
(defvar oddmuse-revision nil
"A variable to bind dynamically when calling `oddmuse-format-command'.")
(defun oddmuse-revision-put (wiki page rev)
"Store REV for WIKI and PAGE in `oddmuse-revisions'."
(let ((w (assoc wiki oddmuse-revisions)))
(unless w
(setq w (list wiki)
oddmuse-revisions (cons w oddmuse-revisions)))
(let ((p (assoc page w)))
(unless p
(setq p (list page))
(setcdr w (cons p (cdr w))))
(setcdr p rev))))
(defun oddmuse-revision-get (wiki page)
"Get revision for WIKI and PAGE in `oddmuse-revisions'."
(let ((w (assoc wiki oddmuse-revisions)))
(when w
(cdr (assoc page w)))))
;;; Helpers
(defsubst oddmuse-page-name (file)
@@ -300,7 +283,7 @@ Example:
(defun oddmuse-url (wiki pagename)
"Get the URL of oddmuse wiki."
(condition-case v
(concat (or (cadr (assoc wiki oddmuse-wikis)) (error)) "/"
(concat (or (cadr (assoc wiki oddmuse-wikis)) (error "Wiki not found in `oddmuse-wikis'")) "/"
(url-hexify-string pagename))
(error nil)))
@@ -534,7 +517,7 @@ as well."
((string-match "<title>Error</title>" status)
(if (string-match "<h1>\\(.*\\)</h1>" status)
(error "Error %s: %s" mesg (match-string 1 status))
(error "Error %s: Cause unknown")))
(error "Error %s: Cause unknown" status)))
(t
(message "%s...done" mesg))))))
@@ -736,7 +719,7 @@ Font-locking is controlled by `oddmuse-markup-functions'.
(set (make-local-variable 'sgml-tag-alist)
`(("b") ("code") ("em") ("i") ("strong") ("nowiki")
("pre" \n) ("tt") ("u")))
(set (make-local-variable 'skeleton-transformation) 'identity)
(set (make-local-variable 'skeleton-transformation-function) 'identity)
(make-local-variable 'oddmuse-wiki)
(make-local-variable 'oddmuse-page-name)
@@ -854,11 +837,8 @@ people have been editing the wiki in the mean time."
(set-buffer (get-buffer-create name))
(erase-buffer); in case of current-prefix-arg
(oddmuse-run "Loading" oddmuse-get-command wiki pagename)
(oddmuse-revision-put wiki pagename (oddmuse-get-latest-revision wiki pagename))
;; fix mode-line for VC in the new buffer because this is not a vc-checkout
(setq buffer-file-name (concat oddmuse-directory "/" wiki "/" pagename))
(vc-mode-line buffer-file-name 'oddmuse)
(pop-to-buffer (current-buffer))
(vc-working-revision buffer-file-name 'oddmuse)
;; check for a diff (this ends with display-buffer) and bury the
;; buffer if there are no hunks
(when (file-exists-p buffer-file-name)
@@ -869,7 +849,9 @@ people have been editing the wiki in the mean time."
;; this also changes the buffer name
(basic-save-buffer)
;; this makes sure that the buffer name is set correctly
(oddmuse-mode))))
(oddmuse-mode)
;; fix mode-line for VC in the new buffer because this is not a vc-checkout
(vc-mode-line buffer-file-name 'oddmuse))))
(defalias 'oddmuse-go 'oddmuse-edit)
@@ -909,8 +891,11 @@ Use a prefix argument to override this."
(and buffer-file-name (basic-save-buffer))
(oddmuse-run "Posting" oddmuse-post-command nil nil
(get-buffer-create " *oddmuse-response*") t 302)
(oddmuse-revision-put oddmuse-wiki oddmuse-page-name
(oddmuse-get-latest-revision oddmuse-wiki oddmuse-page-name)))
;; force reload
(vc-file-setprop buffer-file-name 'vc-working-revision
(oddmuse-get-latest-revision oddmuse-wiki oddmuse-page-name))
;; fix mode-line for VC in the new buffer because this is not a vc-checkout
(vc-mode-line buffer-file-name 'oddmuse))
;;;###autoload
(defun oddmuse-preview (&optional arg)


@@ -47,7 +47,8 @@ For a list of possible values, see `vc-state'."
(defun vc-oddmuse-working-revision (file)
"The current revision based on `oddmuse-revisions'."
(oddmuse-revision-get oddmuse-wiki oddmuse-page-name))
(with-oddmuse-file
(oddmuse-get-latest-revision wiki pagename)))
(defun vc-oddmuse-checkout-model (files)
"No locking."
@@ -59,10 +60,6 @@ For a list of possible values, see `vc-state'."
(defun vc-oddmuse-register (files &optional rev comment)
"This always works.")
(defun vc-oddmuse-revert (file &optional contents-done)
"No idea"
nil)
(defvar vc-oddmuse-log-command
(concat "curl --silent %w"
" --form action=rc"
@@ -149,7 +146,7 @@ a version backup."
(with-oddmuse-file file
(let ((command (oddmuse-format-command vc-oddmuse-get-revision-command)))
(with-temp-buffer
(oddmuse-run "Loading" command)
(oddmuse-run "Loading" command wiki pagename)
(write-file file))))))
(defun vc-oddmuse-checkin (files rev comment)


@@ -1,110 +1,7 @@
/* This file is in the public domain. */
/* @import url(https://fonts.googleapis.com/css?family=Noticia+Text:400,400italic,700italic,700&subset=latin,latin-ext); */
/* vietnamese */
@font-face {
font-family: 'Noticia Text';
font-style: normal;
font-weight: 400;
src: local('Noticia Text'), local('NoticiaText-Regular)'), url('/fonts/NoticiaText-Regular.woff') format('woff');
unicode-range: U+0102-0103, U+1EA0-1EF1, U+20AB;
}
/* latin-ext */
@font-face {
font-family: 'Noticia Text';
font-style: normal;
font-weight: 400;
src: local('Noticia Text'), local('NoticiaText-Regular)'), url('/fonts/NoticiaText-Regular.woff') format('woff');
unicode-range: U+0100-024F, U+1E00-1EFF, U+20A0-20AB, U+20AD-20CF, U+2C60-2C7F, U+A720-A7FF;
}
/* latin */
@font-face {
font-family: 'Noticia Text';
font-style: normal;
font-weight: 400;
src: local('Noticia Text'), local('NoticiaText-Regular)'), url('/fonts/NoticiaText-Regular.woff') format('woff');
unicode-range: U+0000-00FF, U+0131, U+0152-0153, U+02C6, U+02DA, U+02DC, U+2000-206F, U+2074, U+20AC, U+2212, U+2215, U+E0FF, U+EFFD, U+F000;
}
/* vietnamese */
@font-face {
font-family: 'Noticia Text';
font-style: normal;
font-weight: 700;
src: local('Noticia Text Bold'), local('NoticiaText-Bold)'), url('/fonts/NoticiaText-Bold.woff') format('woff');
unicode-range: U+0102-0103, U+1EA0-1EF1, U+20AB;
}
/* latin-ext */
@font-face {
font-family: 'Noticia Text';
font-style: normal;
font-weight: 700;
src: local('Noticia Text Bold'), local('NoticiaText-Bold)'), url('/fonts/NoticiaText-Bold.woff') format('woff');
unicode-range: U+0100-024F, U+1E00-1EFF, U+20A0-20AB, U+20AD-20CF, U+2C60-2C7F, U+A720-A7FF;
}
/* latin */
@font-face {
font-family: 'Noticia Text';
font-style: normal;
font-weight: 700;
src: local('Noticia Text Bold'), local('NoticiaText-Bold)'), url('/fonts/NoticiaText-Bold.woff') format('woff');
unicode-range: U+0000-00FF, U+0131, U+0152-0153, U+02C6, U+02DA, U+02DC, U+2000-206F, U+2074, U+20AC, U+2212, U+2215, U+E0FF, U+EFFD, U+F000;
}
/* vietnamese */
@font-face {
font-family: 'Noticia Text';
font-style: italic;
font-weight: 400;
src: local('Noticia Text Italic'), local('NoticiaText-Italic)'), url('/fonts/NoticiaText-Italic.woff') format('woff');
unicode-range: U+0102-0103, U+1EA0-1EF1, U+20AB;
}
/* latin-ext */
@font-face {
font-family: 'Noticia Text';
font-style: italic;
font-weight: 400;
src: local('Noticia Text Italic'), local('NoticiaText-Italic)'), url('/fonts/NoticiaText-Italic.woff') format('woff');
unicode-range: U+0100-024F, U+1E00-1EFF, U+20A0-20AB, U+20AD-20CF, U+2C60-2C7F, U+A720-A7FF;
}
/* latin */
@font-face {
font-family: 'Noticia Text';
font-style: italic;
font-weight: 400;
src: local('Noticia Text Italic'), local('NoticiaText-Italic)'), url('/fonts/NoticiaText-Italic.woff') format('woff');
unicode-range: U+0000-00FF, U+0131, U+0152-0153, U+02C6, U+02DA, U+02DC, U+2000-206F, U+2074, U+20AC, U+2212, U+2215, U+E0FF, U+EFFD, U+F000;
}
/* vietnamese */
@font-face {
font-family: 'Noticia Text';
font-style: italic;
font-weight: 700;
src: local('Noticia Text Bold Italic'), local('NoticiaText-BoldItalic)'), url('/fonts/NoticiaText-BoldItalic.woff') format('woff');
unicode-range: U+0102-0103, U+1EA0-1EF1, U+20AB;
}
/* latin-ext */
@font-face {
font-family: 'Noticia Text';
font-style: italic;
font-weight: 700;
src: local('Noticia Text Bold Italic'), local('NoticiaText-BoldItalic)'), url('/fonts/NoticiaText-BoldItalic.woff') format('woff');
unicode-range: U+0100-024F, U+1E00-1EFF, U+20A0-20AB, U+20AD-20CF, U+2C60-2C7F, U+A720-A7FF;
}
/* latin */
@font-face {
font-family: 'Noticia Text';
font-style: italic;
font-weight: 700;
src: local('Noticia Text Bold Italic'), local('NoticiaText-BoldItalic)'), url('/fonts/NoticiaText-BoldItalic.woff') format('woff');
unicode-range: U+0000-00FF, U+0131, U+0152-0153, U+02C6, U+02DA, U+02DC, U+2000-206F, U+2074, U+20AC, U+2212, U+2215, U+E0FF, U+EFFD, U+F000;
}
@font-face {
font-family: 'Symbola';
src: local('Symbola'), url('/fonts/Symbola.woff') format('woff'), url('/fonts/Symbola.ttf') format('truetype');
}
body, rss {
font-family: "Noticia Text", Symbola, serif;
font-family: "Palatino Linotype", "Book Antiqua", Palatino, serif;
font-style: normal;
font-size: 14pt;
margin: 1em 3em;

536
css/alex-2016.css Normal file

@@ -0,0 +1,536 @@
/* This file is in the public domain. */
html{ text-align: center; }
body, rss {
font-family: "Palatino Linotype", "Book Antiqua", Palatino, serif;
font-style: normal;
font-size: 14pt;
padding: 1em 3em;
max-width: 72ex;
display: inline-block;
text-align: left;
color: #000;
background-color: #fff;
}
@media print {
body {
font-size: 12pt;
}
/* hide all the crap */
div.diff, div.diff+hr, div.refer, div.near, div.definition, div.sister,
div.cal, div.footer, span.specialdays, span.gotobar, a.edit, a.number span,
div.rc form, form.tiny, p.comment, p#plus1, div.g-plusone, div.content a.feed {
display:none;
}
div.content a.book,
div.content a.movie {
text-decoration: none;
}
a cite {
font-style: italic;
}
img[alt="RSS"] { display: none }
a.rss { font-size: 8pt }
}
/* headings: we can use larger sizes if we use a lighter color.
we cannot inherit the font-family because header and footer use a narrow font. */
h1, h2, h3, title {
font-family: inherit;
font-weight: normal;
}
h1, channel title {
font-size: 32pt;
margin: 1em 0 0.5em 0;
padding: 0.4em 0;
}
h2 {
font-size: 18pt;
margin: 2em 0 0 0;
padding: 0;
}
h3 {
font-size: inherit;
font-weight: bold;
padding: 0;
margin: 1em 0 0 0;
clear: both;
}
/* headers in the journal are smaller */
div.journal h1, item title {
font-size: inherit;
padding: 0;
clear: both;
border-bottom: 1px solid #000;
}
div.journal h2 {
font-family: inherit;
font-size: inherit;
}
div.journal h3 {
font-family: inherit;
font-size: inherit;
font-weight: inherit;
font-style: italic;
}
div.journal hr {
visibility: hidden;
}
p.more {
margin-top: 3em;
}
/* Links in headings appear on journal pages. */
h1 a, h2 a, h3 a {
color:inherit;
text-decoration:none;
font-weight: normal;
}
h1 a:visited, h2 a:visited, h3 a:visited {
color: inherit;
}
/* for download buttons and the like */
.button {
display: inline-block;
font-size: 120%;
cursor: pointer;
padding: 0.4em 0.6em;
text-shadow: 0px -1px 0px #ccc;
background-color: #cfa;
border: 1px solid #9d8;
border-radius: 5px;
box-shadow: 0px 1px 3px white inset, 0px 1px 3px black;
}
.button .icon {
color: #363;
text-shadow: 0px -1px 1px white, 0px 1px 3px #666;
}
.button a {
text-decoration: none;
font-weight: normal;
}
/* links */
a.pencil {
padding-left: 1ex;
text-decoration: none;
color: inherit;
visibility: hidden;
transition: visibility 0s 1s, opacity 1s linear;
opacity: 0;
}
*:hover > a.pencil {
visibility: visible;
transition: opacity .5s linear;
opacity: 1;
}
@media print {
a.pencil {
display: none;
}
}
a.number {
text-decoration: none;
}
/* stop floating content from flowing over the footer */
hr {
clear: both;
}
/* the distance between links in the navigation bars */
span.bar a {
margin-right: 1ex;
}
a img {
border: none;
}
/* search box in the top bar */
.header form, .header p {
display: inline;
white-space: nowrap;
}
label[for="searchlang"], #searchlang, .header input[type="submit"] {
/* don't use display: none! http://stackoverflow.com/questions/5665203/getting-iphone-go-button-to-submit-form */
visibility: hidden; position: absolute;
}
/* wrap on the iphone */
@media only screen and (max-device-width: 480px) {
}
.header input {
width: 10ex;
}
/* other form fields */
input[type="text"] {
padding: 0;
font-size: 80%;
line-height: 125%;
}
/* code */
textarea, pre, code, tt {
font-family: "Andale Mono", Monaco, "Courier New", Courier, monospace, "Symbola";
font-size: 80%;
}
pre {
overflow:hidden;
white-space: pre-wrap; /* CSS 3 */
white-space: -moz-pre-wrap; /* Mozilla, since 1999 */
white-space: -pre-wrap; /* Opera 4-6 */
white-space: -o-pre-wrap; /* Opera 7 */
word-wrap: break-word; /* Internet Explorer 5.5+ */
}
/* styling for divs that will be invisible when printing. */
div.header, div.footer, div.near, div.definition, p.comment, a.tag {
font-size: 14pt;
}
@media print {
div.header, div.footer, div.near, div.definition, p.comment, a.tag {
font-size: 8pt;
}
}
div.footer form.search {
display: none;
}
div.rc li + li {
margin-top: 1em;
}
div.rc li strong, table.history strong, strong.description {
font-family: inherit;
font-weight: inherit;
}
div.diff {
padding-left: 5%;
padding-right: 5%;
font-size: 12pt;
color: #000;
}
div.old {
background-color: #ffffaf;
}
div.new {
background-color: #cfffcf;
}
div.refer {
padding-left: 5%;
padding-right: 5%;
font-size: 12pt;
}
div.message {
background-color:#fee;
color:#000;
}
img.xml {
border:none;
padding:1px;
}
a.small img {
max-width:300px;
}
a.large img {
max-width:600px;
}
div.sister {
margin-right:1ex;
background-color:inherit;
}
div.sister p {
margin-top:0;
}
div.sister hr {
display:none;
}
div.sister img {
border:none;
}
div.near, div.definition {
background-color:#efe;
}
div.sidebar {
float:right;
border:1px dotted #000;
padding:0 1em;
}
div.sidebar ul {
padding-left:1em;
}
/* replacements, features */
ins {
font-style: italic;
text-decoration: none;
}
acronym, abbr {
letter-spacing:0.1em;
font-variant:small-caps;
}
/* Interlink prefix not shown */
a .site, a .separator {
display: none;
}
a cite { font:inherit; }
/* browser borkage */
textarea[name="text"] { width:97%; height:80%; }
textarea[name="summary"] { width:97%; height:3em; }
/* comments */
textarea[name="aftertext"] { width:97%; height:10em; }
div.commentshown {
font-size: 12pt;
padding: 2em 0;
}
div.commenthidden {
display:none;
}
div.commentshown {
display:block;
}
p.comment {
margin-bottom: 0;
}
div.comment {
font-size: 14pt;
}
div.comment h2 {
margin-top: 5em;
}
/* comment pages with username, homepage, and email subscription */
.comment form span { display: block; }
.comment form span label { display: inline-block; width: 10em; }
/* IE sucks */
.comment input#username,
.comment input#homepage,
.comment input#mail { width: 20em; }
/* cal */
div.month { padding:0; margin:0 2ex; }
body > div.month {
float:right;
background-color: inherit;
border:solid thin;
padding:0 1ex;
}
.year > .month {
float:left;
}
.footer {
clear:both;
}
.month .title a.local {
background-color: inherit;
}
.month a.local {
background-color: #ddf;
}
.month a.today {
background-color: #fdd;
}
.month a {
color:inherit;
font-weight:inherit;
text-decoration: none;
background-color: #eee;
}
/* history tables and other tables */
table.history {
border: none;
}
td.history {
border: none;
}
table.user {
border: none;
border-top: 1px solid #ccc;
border-bottom: 1px solid #ccc;
padding: 1em;
margin: 1em 2em;
}
table.user tr td, table.user tr th {
border: none;
padding: 0.2em 0.5em;
vertical-align: top;
}
table.arab tr th {
font-weight:normal;
text-align:left;
vertical-align:top;
}
table.arab, table.arab tr th, table.arab tr td {
border:none;
}
th.nobreak {
white-space:nowrap;
}
table.full { width:99%; margin-left:1px; }
table.j td, table.j th, table tr td.j, table tr th.j, .j { text-align:justify; }
table.l td, table.l th, table tr td.l, table tr th.l, .l { text-align:left; }
table.r td, table.r th, table tr td.r, table tr th.r, .r { text-align:right; }
table.c td, table.c th, table tr td.c, table tr th.c, .c { text-align:center; }
table.t td { vertical-align: top; }
td.half { width:50%; }
td.third { width:33%; }
form table td { padding:5px; }
/* lists */
dd { padding-bottom:0.5ex; }
dl.inside dt { float:left; }
/* search */
div.search span.result { font-size:larger; }
div.search span.info { font-size:smaller; font-style:italic; }
div.search p.result { display:none; }
img.logo {
float: right;
margin: 0 0 0 1ex;
padding: 0;
border: 1px solid #000;
opacity: 0.3;
background-color:#ffe;
}
/* images */
div.content a.feed img, div.journal a.feed img,
div.content a img.smiley, div.journal a img.smiley, img.smiley,
div.content a.inline img, div.journal a.inline img,
div.content li a.image img, div.journal li a.image img {
margin: 0; padding: 0; border: none;
}
div.image a img {
margin-bottom: 0;
}
div.image span.caption {
margin: 0 1em;
}
img {
max-width: 100%;
}
.left { float:left; margin-right: 1em; }
.right { float:right; margin-left: 1em; }
.half img { height: 50%; width: 50%; }
.face img { width: 200px; }
div.left .left, div.right .right {
float:none;
}
.center { text-align:center; }
table.aside {
float:right;
width:40%;
margin-left: 1em;
padding: 1ex;
border: 1px dotted #666;
}
table.aside td {
text-align:left;
}
div.sidebar {
float:right; width: 250px;
text-align: right;
border: none;
margin: 1ex;
}
.bigsidebar {
float:right;
width: 500px;
border: none;
margin-left: 1ex;
font-size: 80%;
}
dl.irc dt { width:20ex; float:left; text-align:right; clear:left; }
dl.irc dt span.time { float:left; }
dl.irc dd { margin-left:22ex; }
/* portrait */
div.footer, div.comment, hr { clear: both; }
.portrait { float: left; font-size: small; margin-right: 1em; }
.portrait a { color: #999; }
div.left { float:left; margin:1em; padding: 0.5em; }
div.left p { display:table-cell; }
div.left p + p { display:table-caption; caption-side:bottom; }
p.table a { float:left; width:20ex; }
p.table + p { clear:both; }
/* rss */
channel * { display: block; }
channel title {
margin-top: 30pt;
}
copyright {
font-size: 14pt;
margin-top: 1em;
}
channel > link:before {
font-size: 18pt;
display: block;
margin: 1em;
padding: 0.5em;
content: "This is an RSS feed, designed to be read in a feed reader.";
color: red;
border: 1px solid red;
}
link, license {
font-size: 11pt;
margin-bottom: 9pt;
}
username:before { content: "Last edited by "; }
username:after { content: "."; }
generator:before { content: "Feed generated by "; }
generator:after { content: "."; }
channel description {
font-weight: bold;
}
item description {
font-style: italic;
font-weight: normal;
margin-bottom: 1em;
}
docs, language,
pubDate, lastBuildDate, ttl, guid, category, comments,
docs, image title, image link,
status, version, diff, history, importance {
display: none;
}

View File

@@ -321,7 +321,6 @@ div.sister {
float:left;
margin-right:1ex;
padding-right:1ex;
border-right:1px dashed;
}
div.sister p { padding:1ex; margin:0; }
div.sister hr { display:none; }

View File

@@ -8,110 +8,8 @@
@import url(https://fonts.googleapis.com/css?family=Noticia+Text:400,400italic,700italic,700&subset=latin,latin-ext); */
/* vietnamese */
@font-face {
font-family: 'Noticia Text';
font-style: normal;
font-weight: 400;
src: local('Noticia Text'), local('NoticiaText-Regular'), url('/fonts/NoticiaText-Regular.woff') format('woff');
unicode-range: U+0102-0103, U+1EA0-1EF1, U+20AB;
}
/* latin-ext */
@font-face {
font-family: 'Noticia Text';
font-style: normal;
font-weight: 400;
src: local('Noticia Text'), local('NoticiaText-Regular'), url('/fonts/NoticiaText-Regular.woff') format('woff');
unicode-range: U+0100-024F, U+1E00-1EFF, U+20A0-20AB, U+20AD-20CF, U+2C60-2C7F, U+A720-A7FF;
}
/* latin */
@font-face {
font-family: 'Noticia Text';
font-style: normal;
font-weight: 400;
src: local('Noticia Text'), local('NoticiaText-Regular'), url('/fonts/NoticiaText-Regular.woff') format('woff');
unicode-range: U+0000-00FF, U+0131, U+0152-0153, U+02C6, U+02DA, U+02DC, U+2000-206F, U+2074, U+20AC, U+2212, U+2215, U+E0FF, U+EFFD, U+F000;
}
/* vietnamese */
@font-face {
font-family: 'Noticia Text';
font-style: normal;
font-weight: 700;
src: local('Noticia Text Bold'), local('NoticiaText-Bold'), url('/fonts/NoticiaText-Bold.woff') format('woff');
unicode-range: U+0102-0103, U+1EA0-1EF1, U+20AB;
}
/* latin-ext */
@font-face {
font-family: 'Noticia Text';
font-style: normal;
font-weight: 700;
src: local('Noticia Text Bold'), local('NoticiaText-Bold'), url('/fonts/NoticiaText-Bold.woff') format('woff');
unicode-range: U+0100-024F, U+1E00-1EFF, U+20A0-20AB, U+20AD-20CF, U+2C60-2C7F, U+A720-A7FF;
}
/* latin */
@font-face {
font-family: 'Noticia Text';
font-style: normal;
font-weight: 700;
src: local('Noticia Text Bold'), local('NoticiaText-Bold'), url('/fonts/NoticiaText-Bold.woff') format('woff');
unicode-range: U+0000-00FF, U+0131, U+0152-0153, U+02C6, U+02DA, U+02DC, U+2000-206F, U+2074, U+20AC, U+2212, U+2215, U+E0FF, U+EFFD, U+F000;
}
/* vietnamese */
@font-face {
font-family: 'Noticia Text';
font-style: italic;
font-weight: 400;
src: local('Noticia Text Italic'), local('NoticiaText-Italic'), url('/fonts/NoticiaText-Italic.woff') format('woff');
unicode-range: U+0102-0103, U+1EA0-1EF1, U+20AB;
}
/* latin-ext */
@font-face {
font-family: 'Noticia Text';
font-style: italic;
font-weight: 400;
src: local('Noticia Text Italic'), local('NoticiaText-Italic'), url('/fonts/NoticiaText-Italic.woff') format('woff');
unicode-range: U+0100-024F, U+1E00-1EFF, U+20A0-20AB, U+20AD-20CF, U+2C60-2C7F, U+A720-A7FF;
}
/* latin */
@font-face {
font-family: 'Noticia Text';
font-style: italic;
font-weight: 400;
src: local('Noticia Text Italic'), local('NoticiaText-Italic'), url('/fonts/NoticiaText-Italic.woff') format('woff');
unicode-range: U+0000-00FF, U+0131, U+0152-0153, U+02C6, U+02DA, U+02DC, U+2000-206F, U+2074, U+20AC, U+2212, U+2215, U+E0FF, U+EFFD, U+F000;
}
/* vietnamese */
@font-face {
font-family: 'Noticia Text';
font-style: italic;
font-weight: 700;
src: local('Noticia Text Bold Italic'), local('NoticiaText-BoldItalic'), url('/fonts/NoticiaText-BoldItalic.woff') format('woff');
unicode-range: U+0102-0103, U+1EA0-1EF1, U+20AB;
}
/* latin-ext */
@font-face {
font-family: 'Noticia Text';
font-style: italic;
font-weight: 700;
src: local('Noticia Text Bold Italic'), local('NoticiaText-BoldItalic'), url('/fonts/NoticiaText-BoldItalic.woff') format('woff');
unicode-range: U+0100-024F, U+1E00-1EFF, U+20A0-20AB, U+20AD-20CF, U+2C60-2C7F, U+A720-A7FF;
}
/* latin */
@font-face {
font-family: 'Noticia Text';
font-style: italic;
font-weight: 700;
src: local('Noticia Text Bold Italic'), local('NoticiaText-BoldItalic'), url('/fonts/NoticiaText-BoldItalic.woff') format('woff');
unicode-range: U+0000-00FF, U+0131, U+0152-0153, U+02C6, U+02DA, U+02DC, U+2000-206F, U+2074, U+20AC, U+2212, U+2215, U+E0FF, U+EFFD, U+F000;
}
@font-face {
font-family: 'Symbola';
src: local('Symbola'), url('/fonts/Symbola.woff') format('woff');
}
body {
font-family: "Noticia Text", Symbola, serif;
font-family: "Palatino Linotype", "Book Antiqua", Palatino, serif;
font-size: 14pt;
color: #000;
background-color: #eed;
@@ -119,13 +17,13 @@ body {
}
textarea, pre, code, tt {
font-family: "Andale Mono", Monaco, "Courier New", Courier, monospace, Symbola;
font-size: 80%;
font-family: "Andale Mono", Monaco, "Courier New", Courier, monospace, Symbola;
font-size: 80%;
}
@media print {
body {
background-color: white;
background-color: white;
font-family: Times, serif;
font-size:10pt;
}
@@ -149,8 +47,8 @@ textarea, pre, code, tt {
.browse { min-height: 3em; }
.header form, .header p { margin: 0; }
/* hide the buttons but don't use display:none because of
http://stackoverflow.com/questions/5665203/getting-iphone-go-button-to-submit-form */
.header input[type="submit"] { position: absolute; visibility: hidden; }
http://stackoverflow.com/questions/5665203/getting-iphone-go-button-to-submit-form
.header input[type="submit"] { position: absolute; visibility: hidden; } */
.header input { width: 5em; font-size: 80%; }
.footer { clear:both; font-size: 90%; }
.content input { font-size: 80%; line-height: 125%; }
@@ -175,9 +73,9 @@ input#mail, input#homepage, input#username {
/* titles */
h1 {
font-weight: bold;
font-size: 150%;
padding: 1em 0;
font-weight: bold;
font-size: 150%;
padding: 1em 0;
}
h1 a:link, h1 a:visited {
color: inherit;
@@ -217,7 +115,7 @@ a:active {
border: 1px solid #9d8;
border-radius: 5px;
box-shadow: 0px 1px 3px white inset,
0px 1px 3px black;
0px 1px 3px black;
}
.button a {
text-decoration: none;
@@ -231,10 +129,6 @@ a:active {
font-weight: normal;
}
a.edit, div.footer, form, span.gotobar, a.number span { display:none; }
a[class="url number"]:after, a[class="inter number"]:after {
content:"[" attr(href) "]";
}
a[class="local number"]:after { content:"[" attr(title) "]"; }
img[smiley] { line-height: inherit; }
}
@@ -243,15 +137,15 @@ a.pencil { display: none; }
/* table of contents */
.toc {
font-size: smaller;
border-left: 1em solid #886;
font-size: smaller;
border-left: 1em solid #886;
}
.toc ol {
list-style-type: none;
padding-left: 1em;
list-style-type: none;
padding-left: 1em;
}
.toc a {
font-weight: normal;
font-weight: normal;
}
/* images with links, captions, etc */
@@ -307,26 +201,28 @@ div.message {
}
table.history { border-style:none; }
td.history { border-style:none; }
div.history span.dash + strong { font-weight: normal; }
span.result { font-size:larger; }
span.info { font-size:smaller; font-style:italic; }
div.rc hr { display: none; }
div.rc li { padding-bottom: 0.5em; }
div.rc li strong { font-weight: normal; }
/* Tables */
table.user {
margin: 1em 0;
padding: 0 1em;
border-top: 1px solid black;
border-bottom: 1px solid black;
margin: 1em 0;
padding: 0 1em;
border-top: 1px solid black;
border-bottom: 1px solid black;
}
div.aside table.user {
margin: 1em 0;
padding: 0;
margin: 1em 0;
padding: 0;
}
table.user td, table.user th {
border-style: none;
padding:5px 10px;
vertical-align: top;
border-style: none;
padding:5px 10px;
vertical-align: top;
}
table.user th { font-weight:bold; }
table.user td.r { text-align:right; }
@@ -337,7 +233,7 @@ table.user td.mark { background-color:yellow; }
tr:empty { display: block; height: 0.5em; }
@media print {
table {
font-size: 9pt;
font-size: 9pt;
margin: 0;
}
table.user td, table.user th {

View File

@@ -219,50 +219,3 @@ code {
background: #eee;
white-space: pre-wrap;
}
@font-face {
font-family: 'Gentium Basic';
font-style: normal;
font-weight: 700;
src: local('Gentium Basic Bold'), local('GentiumBasic-Bold'), url(/fonts/GenBasB.woff) format('woff');
}
@font-face {
font-family: 'Gentium Basic';
font-style: italic;
font-weight: 400;
src: local('Gentium Basic Italic'), local('GentiumBasic-Italic'), url(/fonts/GenBasI.woff) format('woff');
}
@font-face {
font-family: 'Gentium Basic';
font-style: italic;
font-weight: 700;
src: local('Gentium Basic Bold Italic'), local('GentiumBasic-BoldItalic'), url(/fonts/GenBasBI.woff) format('woff');
}
@font-face {
font-family: 'Gentium Basic';
font-style: normal;
font-weight: 400;
src: local('Gentium Basic'), local('GentiumBasic'), url(/fonts/GenBasR.woff) format('woff');
}
@font-face {
font-family: 'Gentium Plus';
font-style: normal;
font-weight: 400;
src: local('Gentium Plus'), local('GentiumPlus'), url(/fonts/GentiumPlus-R.woff) format('woff');
}
@font-face {
font-family: 'Gentium Plus';
font-style: italic;
font-weight: 400;
src: local('Gentium Plus Italic'), local('GentiumPlus-Italic'), url(/fonts/GentiumPlus-I.woff) format('woff');
}
@font-face {
font-family: 'Symbola';
src: local('Symbola'), url('/fonts/Symbola.woff') format('woff'), url('/fonts/Symbola.ttf') format('truetype');
}

View File

@@ -1,43 +0,0 @@
# Copyright (C) 2004, 2005 Fletcher T. Penney <fletcher@freeshell.org>
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the
# Free Software Foundation, Inc.
# 59 Temple Place, Suite 330
# Boston, MA 02111-1307 USA
use strict;
use v5.10;
AddModuleDescription('aawrapperdiv.pl', 'WrapperDiv Module');
our ($q);
*OldGetHeader = \&GetHeader;
*GetHeader = \&WrapperGetHeader;
sub WrapperGetHeader {
my ($id, $title, $oldId, $nocache, $status) = @_;
my $result = OldGetHeader ($id, $title, $oldId, $nocache, $status);
$result .= $q->start_div({-class=>'wrapper'});
}
*OldPrintFooter = \&PrintFooter;
*PrintFooter = \&WrapperPrintFooter;
sub WrapperPrintFooter {
my ($id, $rev, $comment) = @_;
print $q->start_div({-class=>'wrapper close'});
print $q->end_div(), $q->end_div();
OldPrintFooter($id, $rev, $comment);
}

View File

@@ -22,7 +22,7 @@ our (@MyRules, $FreeLinkPattern);
push(@MyRules, \&LinksWithAccessKeys);
sub LinksWithAccessKeys {
if (m/\G(\[\[$FreeLinkPattern\{(.)\}\]\])/cog) {
if (m/\G(\[\[$FreeLinkPattern\{(.)\}\]\])/cg) {
my ($id, $key) = ($2, $3);
Dirty($1);
$id = FreeToNormal($id);

View File

@@ -36,7 +36,7 @@ sub AdminPowerDelete {
OpenPage($id);
my $status = DeletePage($id);
if ($status) {
print $q->p(GetPageLink($id) . ' ' . T('not deleted: ')) . $status;
print $q->p(GetPageLink($id) . ' ' . T('not deleted:') . ' ' . $status);
} else {
print $q->p(GetPageLink($id) . ' ' . T('deleted'));
WriteRcLog($id, Ts('Deleted %s', $id), 0, $Page{revision},
@@ -44,7 +44,7 @@ sub AdminPowerDelete {
GetCluster($Page{text}));
}
# Regenerate index on next request
unlink($IndexFile);
Unlink($IndexFile);
ReleaseLock();
print $q->p(T('Main lock released.'));
PrintFooter();
@@ -61,30 +61,30 @@ sub AdminPowerRename {
print $q->p(T('Main lock obtained.'));
# page file -- only check for existing or missing pages here
my $fname = GetPageFile($id);
ReportError(Ts('The page %s does not exist', $id), '400 BAD REQUEST') unless -f $fname;
ReportError(Ts('The page %s does not exist', $id), '400 BAD REQUEST') unless IsFile($fname);
my $newfname = GetPageFile($new);
ReportError(Ts('The page %s already exists', $new), '400 BAD REQUEST') if -f $newfname;
ReportError(Ts('The page %s already exists', $new), '400 BAD REQUEST') if IsFile($newfname);
# Regenerate index on next request -- remove this before errors can occur!
unlink($IndexFile);
Unlink($IndexFile);
# page file
CreateDir($PageDir); # It might not exist yet
rename($fname, $newfname)
Rename($fname, $newfname)
or ReportError(Tss('Cannot rename %1 to %2', $fname, $newfname) . ": $!", '500 INTERNAL SERVER ERROR');
# keep directory
my $kdir = GetKeepDir($id);
my $newkdir = GetKeepDir($new);
CreateDir($KeepDir); # It might not exist yet (only the parent directory!)
rename($kdir, $newkdir)
Rename($kdir, $newkdir)
or ReportError(Tss('Cannot rename %1 to %2', $kdir, $newkdir) . ": $!", '500 INTERNAL SERVER ERROR')
if -d $kdir;
if IsDir($kdir);
# refer file
if (defined(&GetRefererFile)) {
my $rdir = GetRefererFile($id);
my $newrdir = GetRefererFile($new);
CreateDir($RefererDir); # It might not exist yet
rename($rdir, $newrdir)
Rename($rdir, $newrdir)
or ReportError(Tss('Cannot rename %1 to %2', $rdir, $newrdir) . ": $!", '500 INTERNAL SERVER ERROR')
if -d $rdir;
if IsDir($rdir);
}
# RecentChanges
OpenPage($new);

View File

@@ -26,7 +26,7 @@ our ($q, $bol, %Action, %Page, $OpenPageName, $UseDiff, $UsePathInfo, $RssStyleS
push(@MyRules, \&AggregateRule);
sub AggregateRule {
if ($bol && m/\G(&lt;aggregate\s+((("[^\"&]+",?\s*)+)|(sort\s+)?search\s+(.+?))&gt;)/gc) {
if ($bol && m/\G(&lt;aggregate\s+((("[^\"&]+",?\s*)+)|(sort\s+)?search\s+(.+?))&gt;)/cg) {
Clean(CloseHtmlEnvironments());
Dirty($1);
my ($oldpos, $old_, $str, $sort, $search) = ((pos), $_, $3, $5, $6);
@@ -126,8 +126,8 @@ sub DoAggregate {
}
}
foreach my $id (@pages) {
my %data = ParseData(ReadFileOrDie(GetPageFile(FreeToNormal($id))));
my $page = $data{text};
my $data = ParseData(ReadFileOrDie(GetPageFile(FreeToNormal($id))));
my $page = $data->{text};
my $size = length($page);
my $i = index($page, "\n=");
my $j = index($page, "\n----");
@@ -136,13 +136,13 @@ sub DoAggregate {
$page =~ s/^=.*\n//; # if it starts with a header
my $name = $id;
$name =~ s/_/ /g;
my $date = TimeToRFC822($data{ts});
my $host = $data{host};
my $username = $data{username};
my $date = TimeToRFC822($data->{ts});
my $host = $data->{host};
my $username = $data->{username};
$username = QuoteHtml($username);
$username = $host unless $username;
my $minor = $data{minor};
my $revision = $data{revision};
my $minor = $data->{minor};
my $revision = $data->{revision};
my $cluster = GetCluster($page);
my $description = ToString(sub { ApplyRules(QuoteHtml($page), 1, 0, undef, 'p') });
$description .= $q->p(GetPageLink($id, T('Learn more...')))

View File

@@ -26,11 +26,11 @@ AddModuleDescription('agree-disagree.pl', 'AgreeDisagreePlugin');
push(@MyRules, \&AgreeDisagreeSupportRule);
push(@MyMacros, sub{ s/\[\+\]/"[+:" . GetParam('username', T('Anonymous'))
. ':' . TimeToText($Now) . "]"/ge });
push(@MyMacros, sub{ s/\[\+(:[^]:]+)\]/"[+$1:" . TimeToText($Now) . "]"/ge });
. ':' . TimeToText($Now) . "]"/eg });
push(@MyMacros, sub{ s/\[\+(:[^]:]+)\]/"[+$1:" . TimeToText($Now) . "]"/eg });
push(@MyMacros, sub{ s/\[\-\]/"[-:" . GetParam('username', T('Anonymous'))
. ':' . TimeToText($Now) . "]"/ge });
push(@MyMacros, sub{ s/\[\-(:[^]:]+)\]/"[-$1:" . TimeToText($Now) . "]"/ge });
. ':' . TimeToText($Now) . "]"/eg });
push(@MyMacros, sub{ s/\[\-(:[^]:]+)\]/"[-$1:" . TimeToText($Now) . "]"/eg });
$DefaultStyleSheet .= <<'EOT' unless $DefaultStyleSheet =~ /div\.agree/; # mod_perl?
@@ -78,17 +78,17 @@ EOT
sub AgreeDisagreeSupportRule {
if ($bol) {
if ($bol && m/(\G(\s*\[\+(.*?)\]|\s*\[-(.*?)\])+)/gcs) {
if ($bol && m/(\G(\s*\[\+(.*?)\]|\s*\[-(.*?)\])+)/cgs) {
my $votes = $1;
my @ayes = ();
my @nayes = ();
while ($votes =~ m/\G.*?\[\+(.*?)\]/gcs) {
while ($votes =~ m/\G.*?\[\+(.*?)\]/cgs) {
my ($ignore, $name, $time) = split(/:/, $1, 3);
push(@ayes, $name);
}
my $votes2 = $votes;
while ($votes2 =~ m/\G.*?\[-(.*?)\]/gcs) {
while ($votes2 =~ m/\G.*?\[-(.*?)\]/cgs) {
my ($ignore, $name, $time) = split(/:/, $1, 3);
push(@nayes, $name);
}

View File

@@ -21,13 +21,13 @@ our ($q, %Page, $FootnoteNumber, $FreeLinkPattern, @MyRules, $BracketWiki);
push(@MyRules, \&AnchorsRule);
sub AnchorsRule {
if (m/\G\[\[\#$FreeLinkPattern\]\]/gc) {
if (m/\G\[\[\#$FreeLinkPattern\]\]/cg) {
return $q->a({-href=>'#' . FreeToNormal($1), -class=>'local anchor'}, $1);
} elsif ($BracketWiki && m/\G\[\[\#$FreeLinkPattern\|([^\]]+)\]\]/gc) {
} elsif ($BracketWiki && m/\G\[\[\#$FreeLinkPattern\|([^\]]+)\]\]/cg) {
return $q->a({-href=>'#' . FreeToNormal($1), -class=>'local anchor'}, $2);
} elsif ($BracketWiki && m/\G(\[\[$FreeLinkPattern\#$FreeLinkPattern\|([^\]]+)\]\])/cog
or m/\G(\[\[\[$FreeLinkPattern\#$FreeLinkPattern\]\]\])/cog
or m/\G(\[\[$FreeLinkPattern\#$FreeLinkPattern\]\])/cog) {
} elsif ($BracketWiki && m/\G(\[\[$FreeLinkPattern\#$FreeLinkPattern\|([^\]]+)\]\])/cg
or m/\G(\[\[\[$FreeLinkPattern\#$FreeLinkPattern\]\]\])/cg
or m/\G(\[\[$FreeLinkPattern\#$FreeLinkPattern\]\])/cg) {
# This one is not a dirty rule because the output is always a page
# link, never an edit link (unlike normal free links).
my $bracket = (substr($1, 0, 3) eq '[[[');
@@ -47,7 +47,7 @@ sub AnchorsRule {
$text = $id unless $text;
$text =~ s/_/ /g;
return ScriptLink(UrlEncode($id), $text, $class, undef, $title);
} elsif (m/\G\[\:$FreeLinkPattern\]/gc) {
} elsif (m/\G\[\:$FreeLinkPattern\]/cg) {
return $q->a({-name=>FreeToNormal($1), -class=>'anchor'}, '');
}
return;

View File

@@ -34,7 +34,7 @@ push(@MyRules, \&MaskEmailRule);
sub MaskEmailRule {
# Allow [email@foo.bar Email Me] links
if (m/\G\[($EmailRegExp(\s\w+)*\s*)\]/igc) {
if (m/\G\[($EmailRegExp(\s\w+)*\s*)\]/cgi) {
my $chunk = $1;
$chunk =~ s/($EmailRegExp)//i;
my $email = $1;
@@ -51,7 +51,7 @@ sub MaskEmailRule {
return "<a href=\"mailto:$email\">$chunk</a>";
}
if (m/\G($EmailRegExp)/igc) {
if (m/\G($EmailRegExp)/cgi) {
my $email = $1;
if ($DoMaskEmail) {
my $masked="";

View File

@@ -20,7 +20,7 @@ AddModuleDescription('askpage.pl', 'Ask Page Extension');
use Fcntl qw(:DEFAULT :flock);
our ($DataDir);
our ($DataDir, %Translate, @MyFooters);
our ($AskPage, $QuestionPage, $NewQuestion);
# Don't forget to set your $CommentsPattern to include both $AskPage and $QuestionPage
$AskPage = 'Ask';
@@ -29,7 +29,7 @@ $NewQuestion = 'Write your question here:';
sub IncrementInFile {
my $filename = shift;
sysopen my $fh, $filename, O_RDWR|O_CREAT or die "can't open $filename: $!";
sysopen my $fh, encode_utf8($filename), O_RDWR|O_CREAT or die "can't open $filename: $!";
flock $fh, LOCK_EX or die "can't flock $filename: $!";
my $num = <$fh> || 1;
seek $fh, 0, 0 or die "can't rewind $filename: $!";
@@ -39,8 +39,8 @@ sub IncrementInFile {
return $num;
}
*OldAskPageDoPost=\&DoPost;
*DoPost=\&NewAskPageDoPost;
*OldAskPageDoPost = \&DoPost;
*DoPost = \&NewAskPageDoPost;
sub NewAskPageDoPost {
my $id = FreeToNormal(shift);
if ($id eq $AskPage and not GetParam('text', undef)) { # comment, not a regular edit
@@ -51,18 +51,18 @@ sub NewAskPageDoPost {
OldAskPageDoPost($id, @_); # keep original functionality for regular edits
}
*OldAskPageGetTextArea=\&GetTextArea;
*GetTextArea=\&NewAskPageGetTextArea;
sub NewAskPageGetTextArea {
my ($name, $text, @rest) = @_;
if ($name eq 'aftertext' and not $text and GetId() eq $AskPage) {
$text = $NewQuestion;
}
OldAskPageGetTextArea($name, $text, @rest);
*OldAskPageGetCommentForm = \&GetCommentForm;
*GetCommentForm = \&NewAskPageGetCommentForm;
@MyFooters = map { $_ == \&OldAskPageGetCommentForm ? \&NewAskPageGetCommentForm : $_ } @MyFooters;
sub NewAskPageGetCommentForm {
my ($id) = @_;
$Translate{'Add your comment here:'} = $NewQuestion if $id eq $AskPage;
OldAskPageGetCommentForm(@_);
}
*OldAskPageJournalSort=\&JournalSort;
*JournalSort=\&NewAskPageJournalSort;
*OldAskPageJournalSort = \&JournalSort;
*JournalSort = \&NewAskPageJournalSort;
sub NewAskPageJournalSort {
return OldAskPageJournalSort() unless $a =~ m/^$QuestionPage\d+$/ and $b =~ m/^$QuestionPage\d+$/;
($b =~ m/$QuestionPage(\d+)/)[0] <=> ($a =~ m/$QuestionPage(\d+)/)[0];

View File

@@ -16,6 +16,7 @@
use strict;
use v5.10;
use XML::Atom;
use XML::Atom::Entry;
use XML::Atom::Link;
use XML::Atom::Person;
@@ -149,7 +150,7 @@ sub GetRcAtom {
# Based on DoPost
sub DoAtomSave {
my ($type, $oldid) = @_;
my $entry = AtomEntry();
my $entry = AtomEntry($type);
my $title = $entry->title();
my $author = $entry->author();
SetParam('username', $author->name) if $author; # Used in Save()
@@ -230,15 +231,8 @@ sub DoAtomGet {
}
sub AtomEntry {
my $data = $q->param('POSTDATA');
if (not $data) {
# CGI provides POSTDATA for POST requests, not for PUT requests.
# The following code is based on the CGI->init code.
my $content_length = defined($ENV{'CONTENT_LENGTH'}) ? $ENV{'CONTENT_LENGTH'} : 0;
if ($content_length > 0 and $content_length < $MaxPost) {
$q->read_from_client(\$data, $content_length, 0);
}
}
my $type = shift || 'POST';
my $data = $q->param($type . 'DATA'); # PUTDATA or POSTDATA
my $entry = XML::Atom::Entry->new(\$data);
return $entry;
}

View File

@@ -171,11 +171,11 @@ sub UserCanEditAutoLockFix {
return 0 if $id eq 'SampleUndefinedPage' or $id eq T('SampleUndefinedPage')
or $id eq 'Sample_Undefined_Page' or $id eq T('Sample_Undefined_Page');
return 1 if UserIsAdmin() || UserIsEditor();
return 0 if $id ne '' and -f GetLockedPageFile($id);
return 0 if $LockOnCreation{$id} and not -f GetPageFile($id); # new page
return 0 if !$EditAllowed or -f $NoEditFile;
return 0 if $id ne '' and IsFile(GetLockedPageFile($id));
return 0 if $LockOnCreation{$id} and not IsFile(GetPageFile($id)); # new page
return 0 if !$EditAllowed or IsFile($NoEditFile);
return 0 if $editing and UserIsBanned(); # this call is more expensive
return 0 if $EditAllowed >= 2 and (not $CommentsPrefix or $id !~ /^$CommentsPrefix/o);
return 0 if $EditAllowed >= 2 and (not $CommentsPrefix or $id !~ /^$CommentsPrefix/);
return 1 if $EditAllowed >= 3 and ($comment or (GetParam('aftertext', '') and not GetParam('text', '')));
return 0 if $EditAllowed >= 3;
return 1;

View File

@@ -44,8 +44,8 @@ sub BacklinksMenu {
$Action{buildback} = \&BuildBacklinkDatabase;
sub BuildBacklinkDatabase {
print GetHttpHeader('text/plain');
unlink $backfile; # Remove old database
tie my %backhash, 'MLDBM', $backfile or die "Cannot open file $backfile $!\n";
Unlink($backfile); # Remove old database
tie my %backhash, 'MLDBM', encode_utf8($backfile) or die "Cannot open file $backfile $!\n";
log1("Starting Database Store Process ... please wait\n\n");
foreach my $name (AllPagesList()) {
@@ -101,7 +101,7 @@ sub GetBackLink {
our ($BacklinkBanned);
$BacklinkBanned = "HomePage|ScratchPad" if !$BacklinkBanned;
tie my %backhash, 'MLDBM', $backfile, O_CREAT|O_RDWR, oct(644) or die "Cannot open file $backfile $!\n";
tie my %backhash, 'MLDBM', encode_utf8($backfile), O_CREAT|O_RDWR, oct(644) or die "Cannot open file $backfile $!\n";
# Search database for matches
while ( my ($source, $hashes) = each %backhash ) {
@@ -117,7 +117,7 @@ sub GetBackLink {
foreach my $backlink (@backlinks) {
my ($class, $resolved, $title, $exists) = ResolveId($backlink);
if (($resolved ne $id) && ($resolved !~ /^($BacklinkBanned)$/)) {
push(@unpopped, ScriptLink(UrlEncode($resolved), $resolved, $class . ' backlink', undef, T('Internal Page: ' . $resolved)));
push(@unpopped, ScriptLink(UrlEncode($resolved), $resolved, $class . ' backlink', undef, Ts('Internal Page: %s', $resolved)));
}
}

View File

@@ -124,10 +124,10 @@ sub NewBanContributorsWriteRcLog {
and $OpenPageName eq $id and UserIsAdmin()) {
# we currently have the clean page loaded, so we need to reload
# the spammed revision (there is a possible race condition here)
my ($old) = GetTextRevision($Page{revision}-1, 1);
my %urls = map {$_ => 1 } $old =~ /$UrlPattern/og;
my $old = GetTextRevision($Page{revision} - 1, 1)->{text};
my %urls = map {$_ => 1 } $old =~ /$UrlPattern/g;
# we open the file again to force a load of the despammed page
foreach my $url ($Page{text} =~ /$UrlPattern/og) {
foreach my $url ($Page{text} =~ /$UrlPattern/g) {
delete($urls{$url});
}
# we also remove any candidates that are already banned
@@ -153,7 +153,7 @@ sub NewBanContributorsWriteRcLog {
$q->submit(T('Ban!'))),
$q->end_form();
};
print $q->p(T("Consider banning the IP number as well: "),
print $q->p(T("Consider banning the IP number as well:"), ' ',
ScriptLink('action=ban;id=' . UrlEncode($id), T('Ban contributors')));
};
return OldBanContributorsWriteRcLog(@_);

View File

@@ -61,12 +61,12 @@ sub bbCodeRule {
return AddHtmlEnvironment('strong', qq{class="highlight"}); }
elsif ($tag eq 'url') {
if ($option) {
$option =~ /^($UrlProtocols)/o;
$option =~ /^($UrlProtocols)/;
my $class = "url $1";
return AddHtmlEnvironment('a', qq{href="$option" class="$class"}); }
elsif (/\G$FullUrlPattern\s*\[\/url\]/cogi) {
elsif (/\G$FullUrlPattern\s*\[\/url\]/cgi) {
return GetUrl($1); }}
elsif ($tag eq 'img' and /\G$FullUrlPattern\s*\[\/img\]/cogi) {
elsif ($tag eq 'img' and /\G$FullUrlPattern\s*\[\/img\]/cgi) {
return GetUrl($1, undef, undef, 1); } # force image
elsif ($tag eq 'quote') {
my $html = CloseHtmlEnvironments();

View File

@@ -60,7 +60,6 @@ sub AddRecentVisitor {
my $url = ScriptUrl(join(';', "action=$action;id=" . UrlEncode($id),
map { $_ . '=' . UrlEncode(GetParam($_)) }
keys %params));
my $url = $q->url(-path_info=>1,-query=>1);
my $download = GetParam('action', 'browse') eq 'download'
|| GetParam('download', 0)
|| $q->path_info() =~ m/\/download\//;

View File

@@ -28,10 +28,10 @@ push(@MyRules, \&BlockQuoteRule);
sub BlockQuoteRule {
# indented text using : with the option of spanning multiple text
# paragraphs (but not lists etc).
if (InElement('blockquote') && m/\G(\s*\n)+:[ \t]*/cog) {
if (InElement('blockquote') && m/\G(\s*\n)+:[ \t]*/cg) {
return CloseHtmlEnvironmentUntil('blockquote')
. AddHtmlEnvironment('p');
} elsif ($bol && m/\G(\s*\n)*:[ \t]*/cog) {
} elsif ($bol && m/\G(\s*\n)*:[ \t]*/cg) {
return CloseHtmlEnvironments()
. AddHtmlEnvironment('blockquote')
. AddHtmlEnvironment('p');

View File

@@ -74,7 +74,7 @@ sub Cal {
$link .= ScriptLink('action=collect;match=' . UrlEncode($re), $day, 'local collection' . $class);
}
$link;
}}ge;
}}eg;
$cal =~ s{(\S+) (\d\d\d\d)}{{
my ($month_text, $year_text) = ($1, $2);
my $date = sprintf("%d-%02d", $year, $mon);
@@ -118,22 +118,22 @@ sub DoCollect {
push(@MyRules, \&CalendarRule);
sub CalendarRule {
if (/\G(calendar:(\d\d\d\d))/gc) {
if (/\G(calendar:(\d\d\d\d))/cg) {
my $oldpos = pos;
Clean(CloseHtmlEnvironments());
Dirty($1);
PrintYearCalendar($2);
pos = $oldpos;
return AddHtmlEnvironment('p');
} elsif (/\G(month:(\d\d\d\d)-(\d\d))/gc) {
} elsif (/\G(month:(\d\d\d\d)-(\d\d))/cg) {
my $oldpos = pos;
Clean(CloseHtmlEnvironments());
Dirty($1);
print Cal($2, $3);
pos = $oldpos;
return AddHtmlEnvironment('p');
} elsif (/\G(month:([+-]\d\d?))/gc
or /\G(\[\[month:([+-]\d\d?) $FreeLinkPattern\]\])/gc) {
} elsif (/\G(month:([+-]\d\d?))/cg
or /\G(\[\[month:([+-]\d\d?) $FreeLinkPattern\]\])/cg) {
my $oldpos = pos;
Clean(CloseHtmlEnvironments());
Dirty($1);

View File

@@ -77,7 +77,7 @@ sub DoCheckBox{
$summary{$3} = 0 if $2 eq 'x' or $2 eq 'X';
"${1}[[ :${3}]]";
}
}eig;
}egi;
SetParam('text', $text);
SetParam('summary', join(', ', map {
if ($summary{$_}) {

View File

@@ -62,7 +62,7 @@ foreach (@ClusterMapAdminPages){
}
sub ClusterMapRule {
if (/\G^([\n\r]*\<\s*clustermap\s*\>\s*)$/mgc) {
if (/\G^([\n\r]*\<\s*clustermap\s*\>\s*)$/cgm) {
Dirty($1);
my $oldpos = pos;
my $oldstr = $_;

View File

@@ -30,7 +30,7 @@ sub CommentDivWrapper {
return $q->start_div({-class=>'userComment'});
}
}
if ($OpenPageName =~ /$CommentsPattern/o) {
if ($OpenPageName =~ /$CommentsPattern/) {
if ($bol and m/\G(\s*\n)*----+[ \t]*\n?/cg) {
my $html = CloseHtmlEnvironments()
. ($CommentDiv++ > 0 ? $q->end_div() : $q->h2({-class=>'commentsHeading'}, T('Comments:'))) . $q->start_div({-class=>'userComment'})

View File

@@ -52,15 +52,15 @@ sub NewCommentcountScriptLink {
if ($CommentsPrefix && $action =~ /^$CommentsPrefix(.*)/) { # TODO use $CommentsPattern ?
# Add the number of comments here
my $id = $action;
$id =~ s/%([0-9a-f][0-9a-f])/chr(hex($1))/ge; # undo urlencode
$id =~ s/%([0-9a-f][0-9a-f])/chr(hex($1))/eg; # undo urlencode
my $comments = GetPageContent($id);
my $num = 0;
if($comments =~ /=== (\d+) Comments?\. ===/) {
$num = $1;
}
# Fix plurals
my $plural = T('Comments on ');
my $singular = T('Comment on ');
my $plural = T('Comments on');
my $singular = T('Comment on');
$text =~ s/$plural/$singular/ if($num == 1);
$text = $num . ' ' . $text;
}


@@ -218,7 +218,7 @@ sub CreoleRule {
}
# escape next char (and prevent // in URLs from enabling italics)
# ~
elsif (m/\G(~($FullUrlPattern|\S))/cgo) {
elsif (m/\G(~($FullUrlPattern|\S))/cg) {
return
($CreoleTildeAlternative and
index( 'ABCDEFGHIJKLMNOPQRSTUVWXYZ'
@@ -234,12 +234,12 @@ sub CreoleRule {
# {{{preformatted code}}}
elsif (m/\G\{\{\{(.*?}*)\}\}\}/cg) { return $q->code($1); }
# download: {{pic}} and {{pic|text}}
elsif (m/\G(\{\{$FreeLinkPattern$CreoleLinkTextPattern\}\})/cgos) {
elsif (m/\G(\{\{$FreeLinkPattern$CreoleLinkTextPattern\}\})/cgs) {
my $text = $4 || $2;
return GetCreoleLinkHtml($1, GetDownloadLink(FreeToNormal($2), 1, undef, $text), $text);
}
# image link: {{url}} and {{url|text}}
elsif (m/\G\{\{$FullUrlPattern$CreoleLinkTextPattern\}\}/cgos) {
elsif (m/\G\{\{$FullUrlPattern$CreoleLinkTextPattern\}\}/cgs) {
return GetCreoleImageHtml(
$q->a({-href=> UnquoteHtml($1),
-class=> 'image outside'},
@@ -250,7 +250,7 @@ sub CreoleRule {
}
# image link: [[link|{{pic}}]] and [[link|{{pic|text}}]]
elsif (m/\G(\[\[$FreeLinkPattern$CreoleLinkPipePattern
\{\{$FreeLinkPattern$CreoleLinkTextPattern\}\}\]\])/cgosx) {
\{\{$FreeLinkPattern$CreoleLinkTextPattern\}\}\]\])/cgsx) {
my $text = $5 || $2;
return GetCreoleLinkHtml($1, GetCreoleImageHtml(
ScriptLink(UrlEncode(FreeToNormal($2)),
@@ -261,7 +261,7 @@ sub CreoleRule {
}
# image link: [[link|{{url}}]] and [[link|{{url|text}}]]
elsif (m/\G(\[\[$FreeLinkPattern$CreoleLinkPipePattern
\{\{$FullUrlPattern$CreoleLinkTextPattern\}\}\]\])/cgosx) {
\{\{$FullUrlPattern$CreoleLinkTextPattern\}\}\]\])/cgsx) {
my $text = $5 || $2;
return GetCreoleLinkHtml($1, GetCreoleImageHtml(
ScriptLink(UrlEncode(FreeToNormal($2)),
@@ -272,7 +272,7 @@ sub CreoleRule {
}
# image link: [[url|{{pic}}]] and [[url|{{pic|text}}]]
elsif (m/\G(\[\[$FullUrlPattern$CreoleLinkPipePattern
\{\{$FreeLinkPattern$CreoleLinkTextPattern\}\}\]\])/cgosx) {
\{\{$FreeLinkPattern$CreoleLinkTextPattern\}\}\]\])/cgsx) {
my $text = $5 || $2;
return GetCreoleLinkHtml($1, GetCreoleImageHtml(
$q->a({-href=> UnquoteHtml($2), -class=> 'image outside'},
@@ -283,7 +283,7 @@ sub CreoleRule {
}
# image link: [[url|{{url}}]] and [[url|{{url|text}}]]
elsif (m/\G\[\[$FullUrlPattern$CreoleLinkPipePattern
\{\{$FullUrlPattern$CreoleLinkTextPattern\}\}\]\]/cgosx) {
\{\{$FullUrlPattern$CreoleLinkTextPattern\}\}\]\]/cgsx) {
return GetCreoleImageHtml(
$q->a({-href=> UnquoteHtml($1), -class=> 'image outside'},
$q->img({-src=> UnquoteHtml($2),
@@ -292,7 +292,7 @@ sub CreoleRule {
-class=> 'url outside'})));
}
# link: [[url]] and [[url|text]]
elsif (m/\G\[\[$FullUrlPattern$CreoleLinkTextPattern\]\]/cgos) {
elsif (m/\G\[\[$FullUrlPattern$CreoleLinkTextPattern\]\]/cgs) {
# Permit embedding of Creole syntax within link text. (Rather complicated,
# but it does the job remarkably.)
my $link_url = $1;
@@ -305,7 +305,7 @@ sub CreoleRule {
return GetUrl($link_url, $link_text, 1);
}
# link: [[page]] and [[page|text]]
elsif (m/\G(\[\[$FreeLinkPattern$CreoleLinkTextPattern\]\])/cgos) {
elsif (m/\G(\[\[$FreeLinkPattern$CreoleLinkTextPattern\]\])/cgs) {
my $markup = $1;
my $page_name = $2;
my $link_text = $4 ? CreoleRuleRecursive($4, @_) : $page_name;
@@ -315,7 +315,7 @@ sub CreoleRule {
}
# interlink: [[Wiki:page]] and [[Wiki:page|text]]
elsif ($is_interlinking and
m/\G(\[\[$FreeInterLinkPattern$CreoleLinkTextPattern\]\])/cgos) {
m/\G(\[\[$FreeInterLinkPattern$CreoleLinkTextPattern\]\])/cgs) {
my $markup = $1;
my $interlink = $2;
my $interlink_text = $4;

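Most of the hunks above only reorder regex modifiers into a canonical alphabetical form (/gc becomes /cg, /ge becomes /eg, /cgos becomes /cgs) and drop the /o flag from patterns that interpolate variables. The reorder is a pure style cleanup; dropping /o is not, because /o freezes the pattern after its first compilation, which is wrong for interpolated variables that can change between matches. A minimal standalone sketch (not Oddmuse code) of both points:

```perl
use strict;
use warnings;

# Modifier order never changes behavior: /cg and /gc compile to
# exactly the same match, so sorting the flags is safe.
my $text = "month:2016-06";
my ($year, $month);
if ($text =~ /\G(month:(\d{4})-(\d{2}))/cg) {
    ($year, $month) = ($2, $3);
}

# Without /o the pattern is recompiled whenever the interpolated
# variable changes; with /o it would stay frozen on its first value.
my $hits = 0;
for my $pat ('2016', '2017') {
    $hits++ if "year 2016" =~ /$pat/;   # recompiled each iteration
}
print "$year-$month ($hits hit)\n";     # prints "2016-06 (1 hit)"
```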

@@ -28,8 +28,8 @@ $RuleOrder{\&CrumbsRule} = -10; # run before default rules!
sub CrumbsRule {
if (not (pos) # first!
and (($WikiLinks && /\G($LinkPattern\n)/cgo)
or ($FreeLinks && /\G(\[\[$FreeLinkPattern\]\]\n)/cgo))) {
and (($WikiLinks && /\G($LinkPattern\n)/cg)
or ($FreeLinks && /\G(\[\[$FreeLinkPattern\]\]\n)/cg))) {
my $oldpos = pos; # will be trashed below
my $cluster = FreeToNormal($2);
my %seen = ($cluster => 1);


@@ -133,14 +133,14 @@ sub DespamPage {
# from DoHistory()
my @revisions = sort {$b <=> $a} map { m|/([0-9]+).kp$|; $1; } GetKeepFiles($OpenPageName);
foreach my $revision (@revisions) {
my ($text, $rev) = GetTextRevision($revision, 1); # quiet
my ($revisionPage, $rev) = GetTextRevision($revision, 1); # quiet
if (not $rev) {
print ': ' . Ts('Cannot find revision %s.', $revision);
return;
} elsif (not DespamBannedContent($text)) {
} elsif (not DespamBannedContent($revisionPage->{text})) {
my $summary = Tss('Revert to revision %1: %2', $revision, $rule);
print ': ' . $summary;
Save($OpenPageName, $text, $summary) unless GetParam('debug', 0);
Save($OpenPageName, $revisionPage->{text}, $summary) unless GetParam('debug', 0);
return;
}
}


@@ -77,8 +77,7 @@ sub DoUnifiedDiff { # copied from DoDiff
RequestLockDir('diff') or return '';
WriteStringToFile($oldName, $_[0]);
WriteStringToFile($newName, $_[1]);
my $diff_out = `diff -U 99999 -- \Q$oldName\E \Q$newName\E | tail -n +7`; # should be +4, but we always add extra line # TODO that workaround is ugly, fix it!
utf8::decode($diff_out); # needs decoding
my $diff_out = decode_utf8(`diff -U 99999 -- \Q$oldName\E \Q$newName\E | tail -n +7`); # should be +4, but we always add extra line # TODO that workaround is ugly, fix it!
$diff_out =~ s/\n\K\\ No newline.*\n//g; # Get rid of common complaint.
ReleaseLockDir('diff');
# No need to unlink temp files--next diff will just overwrite.

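Several diffs in this series replace the in-place `utf8::decode($x)` with `decode_utf8(...)` from the core Encode module. The functional form returns a new decoded string, so it can wrap an expression (like the backtick output above) directly instead of needing a temporary variable first. A sketch showing the two styles produce the same result:

```perl
use strict;
use warnings;
use Encode qw(decode_utf8 encode_utf8);

my $bytes = encode_utf8("caf\x{e9}");   # a UTF-8 byte string

# Old style: mutate a copy in place.
my $old = $bytes;
utf8::decode($old);

# New style: decode and assign in one expression.
my $new = decode_utf8($bytes);

print $old eq $new ? "same\n" : "different\n";   # prints "same"
```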

@@ -30,7 +30,7 @@ $DojoTheme = 'tundra';
push (@MyRules, \&WysiwygRule);
sub WysiwygRule {
if (m/\G(&lt;.*?&gt;)/gc) {
if (m/\G(&lt;.*?&gt;)/cg) {
return $1 if substr($1,5,6) eq 'script'
or substr($1,4,6) eq 'script';
return UnquoteHtml($1);


@@ -28,8 +28,8 @@ push( @MyRules, \&DownloadSupportRule );
# [[download:page name|alternate title]]
sub DownloadSupportRule {
if (m/\G(\[\[download:$FreeLinkPattern\|([^\]]+)\]\])/cog
or m!\G(\[\[download:$FreeLinkPattern\]\])!cog) {
if (m/\G(\[\[download:$FreeLinkPattern\|([^\]]+)\]\])/cg
or m!\G(\[\[download:$FreeLinkPattern\]\])!cg) {
Dirty($1);
print GetDownloadLink($2, undef, undef, $3);
return '';


@@ -29,7 +29,7 @@ push(@MyInitVariables, \&DraftInit);
sub DraftInit {
if (GetParam('Draft', '')) {
SetParam('action', 'draft') ; # Draft button used
} elsif (-f "$DraftDir/" . GetParam('username', $q->remote_addr()) # draft exists
} elsif (IsFile("$DraftDir/" . GetParam('username', $q->remote_addr())) # draft exists
and $FooterNote !~ /action=draft/) { # take care of mod_perl persistence
$FooterNote = $q->p(ScriptLink('action=draft', T('Recover Draft'))) . $FooterNote;
}
@@ -47,11 +47,11 @@ sub DoDraft {
WriteStringToFile($draft, EncodePage(text=>$text, id=>$id));
SetParam('msg', T('Draft saved')); # invalidate cache
print GetHttpHeader('', T('Draft saved'), '204 NO CONTENT');
} elsif (-f $draft) {
my %data = ParseData(ReadFileOrDie($draft));
unlink ($draft);
} elsif (IsFile($draft)) {
my $data = ParseData(ReadFileOrDie($draft));
Unlink($draft);
$Message .= $q->p(T('Draft recovered'));
DoEdit($data{id}, $data{text}, 1);
DoEdit($data->{id}, $data->{text}, 1);
} else {
ReportError(T('No draft available to recover'), '404 NOT FOUND');
}
@@ -76,22 +76,19 @@ push(@MyMaintenance, \&DraftCleanup);
sub DraftFiles {
return map {
my $x = $_;
$x = substr($x, length($DraftDir) + 1);
utf8::decode($x);
$x;
} bsd_glob("$DraftDir/*"), bsd_glob("$DraftDir/.*");
substr($_, length($DraftDir) + 1);
} Glob("$DraftDir/*"), Glob("$DraftDir/.*");
}
sub DraftCleanup {
print '<p>' . T('Draft Cleanup');
foreach my $draft (DraftFiles()) {
next if $draft eq '.' or $draft eq '..';
my $ts = (stat("$DraftDir/$draft"))[9];
my $ts = Modified("$DraftDir/$draft");
if ($Now - $ts < 1209600) { # 14*24*60*60
print $q->br(), Tss("%1 was last modified %2 and was kept",
$draft, CalcTimeSince($Now - $ts));
} elsif (unlink("$DraftDir/$draft")) {
} elsif (Unlink("$DraftDir/$draft")) {
print $q->br(), Tss("%1 was last modified %2 and was deleted",
$draft, CalcTimeSince($Now - $ts));
} else {

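The `ParseData` hunks above (and the later enclosure and Joiner ones) all follow one pattern: the function now returns a hash reference instead of a flat list, so `my %data = ParseData(...)` with `$data{id}` becomes `my $data = ParseData(...)` with `$data->{id}`. A simplified, hypothetical stand-in for `ParseData` (the real Oddmuse parser handles more than this) illustrating the new calling convention:

```perl
use strict;
use warnings;

# Simplified stand-in: parse "key: value" lines, return a hashref.
sub ParseData {
    my $data = shift;
    my %result = map { split /: /, $_, 2 } split /\n/, $data;
    return \%result;
}

my $data = ParseData("id: HomePage\ntext: Hello");
print "$data->{id}: $data->{text}\n";   # prints "HomePage: Hello"
```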

@@ -18,7 +18,7 @@ use v5.10;
AddModuleDescription('edit-cluster.pl', 'Edit Cluster Extension');
our ($q, $FS, $RcDefault, @RcDays, $RecentTop, $LastUpdate);
our ($q, $FS, $RcDefault, @RcDays, $RecentTop, $LastUpdate, $ShowAll);
our $EditCluster = 'EditCluster';
@@ -34,7 +34,7 @@ sub GetRc {
$changetime{$pagename} = $ts;
}
my $date = '';
my $all = GetParam('all', 0);
my $all = GetParam('all', $ShowAll);
my ($idOnly, $userOnly, $hostOnly, $clusterOnly, $filterOnly, $match, $lang) =
map { GetParam($_, ''); }
('rcidonly', 'rcuseronly', 'rchostonly', 'rcclusteronly',
@@ -128,7 +128,7 @@ sub EditClusterNewRcHeader {
$action = "action=rc$action";
}
my $days = GetParam('days', $RcDefault);
my $all = GetParam('all', 0);
my $all = GetParam('all', $ShowAll);
my @menu;
if ($all) {
push(@menu, ScriptLink("$action;days=$days;all=0",


@@ -21,14 +21,14 @@ push(@MyRules, \&EmailQuoteRule);
sub EmailQuoteRule {
# > on a line of its own should work
if ($bol && m/\G(\s*\n)*((\&gt;))+\n/cog) {
if ($bol && m/\G(\s*\n)*((\&gt;))+\n/cg) {
return $q->p();
}
# > hi, you mentioned that:
# >> I don't like Oddmuse.
# > in last letter.
elsif ($bol && m/\G(\s*\n)*((\&gt;)+)[ \t]/cog
or InElement('dd') && m/\G(\s*\n)+((\&gt;)+)[ \t]/cog) {
elsif ($bol && m/\G(\s*\n)*((\&gt;)+)[ \t]/cg
or InElement('dd') && m/\G(\s*\n)+((\&gt;)+)[ \t]/cg) {
my $leng = length($2) / 4;
return CloseHtmlEnvironmentUntil('dd') . OpenHtmlEnvironment('dl',$leng, 'quote')
. $q->dt() . AddHtmlEnvironment('dd');


@@ -28,7 +28,7 @@ push(@MyRules, \&EnclosureRule);
# [[enclosure:url|size in bytes|mime type]]
sub EnclosureRule {
if (m!\G\[\[enclosure:\s*$FreeLinkPattern(\|([^\]]+))?\]\]!ogci) {
if (m!\G\[\[enclosure:\s*$FreeLinkPattern(\|([^\]]+))?\]\]!cgi) {
my $id = FreeToNormal($1);
# Make sure we don't add duplicates; we will add non-existing
# enclosures as well. We test for existence only when the RSS feed
@@ -56,8 +56,8 @@ sub NewEnclosureRssItem {
my $id = shift;
my $rss = OldEnclosureRssItem($id, @_);
require MIME::Base64;
my %data = ParseData(ReadFileOrDie(GetPageFile($id)));
my @enclosures = split(' ', $data{enclosures});
my $data = ParseData(ReadFileOrDie(GetPageFile($id)));
my @enclosures = split(' ', $data->{enclosures});
my $enclosures = '';
foreach my $enclosure (@enclosures) {
# Don't add the enclosure if the page has been deleted in the mean


@@ -38,10 +38,10 @@ $FaqAnswerText = "Answer: " unless $FaqAnswerText;
push(@MyRules, \&FaqRule);
sub FaqRule {
if ($bol && m/\GQ: (.+)/gc) {
if ($bol && m/\GQ: (.+)/cg) {
return $q->a({name=>'FAQ_' . UrlEncode($1)},'')
. $q->div({class=>'question'}, $FaqQuestionText . $1);
} elsif ($bol && m/\GA:[ \t]*/gc) {
} elsif ($bol && m/\GA:[ \t]*/cg) {
return CloseHtmlEnvironments()
. AddHtmlEnvironment('div', "class='answer'") . $FaqAnswerText;
}


@@ -29,7 +29,7 @@ $FCKeditorHeight = 400; # Pixel
push (@MyRules, \&WysiwygRule);
sub WysiwygRule {
if (m/\G(&lt;.*?&gt;)/gc) {
if (m/\G(&lt;.*?&gt;)/cg) {
return $1 if substr($1,5,6) eq 'script'
or substr($1,4,6) eq 'script';
return UnquoteHtml($1);


@@ -27,8 +27,7 @@ sub FixEncoding {
ValidIdOrDie($id);
RequestLockOrError();
OpenPage($id);
my $text = $Page{text};
utf8::decode($text);
my $text = decode_utf8($Page{text});
Save($id, $text, T('Fix character encoding'), 1) if $text ne $Page{text};
ReleaseLock();
ReBrowsePage($id);


@@ -66,7 +66,7 @@ $RuleOrder{\&FlickrGalleryRule} = -10;
sub FlickrGalleryRule {
# This code is used when Markdown is not available
if (/\G^([\n\r]*\&lt;\s*FlickrSet:\s*(\d+)\s*\&gt;\s*)$/mgci) {
if (/\G^([\n\r]*\&lt;\s*FlickrSet:\s*(\d+)\s*\&gt;\s*)$/cgim) {
my $oldpos = pos;
my $oldstr = $_;
@@ -79,7 +79,7 @@ sub FlickrGalleryRule {
return '';
}
if (/\G^([\n\r]*\&lt;\s*FlickrPhoto:\s*(\d+)\s*([a-z0-9]*?)\s*($size)?\s*\&gt;\s*)$/mgci) {
if (/\G^([\n\r]*\&lt;\s*FlickrPhoto:\s*(\d+)\s*([a-z0-9]*?)\s*($size)?\s*\&gt;\s*)$/cgim) {
my $oldpos = pos;
my $oldstr = $_;
@@ -103,13 +103,13 @@ sub MarkdownFlickrGalleryRule {
^&lt;FlickrSet:\s*(\d+)\s*\>
}{
FlickrGallery($1);
}xmgei;
}egimx;
$text =~ s{
^&lt;FlickrPhoto:\s*(\d+)\s*([a-z0-9]*?)\s*($size)?\s*\>
}{
GetFlickrPhoto($1,$2,$3);
}xmgei;
}egimx;
return $text
}
@@ -135,7 +135,7 @@ sub FlickrGallery {
$result = $FlickrHeaderTemplate;
$result =~ s/(\$[a-zA-Z\d]+)/"defined $1 ? $1 : ''"/gee;
$result =~ s/(\$[a-zA-Z\d]+)/"defined $1 ? $1 : ''"/eeg;
# Get list of photos and process them
$url = $FlickrBaseUrl . "?method=flickr.photosets.getPhotos&api_key=" .
@@ -153,7 +153,7 @@ sub FlickrGallery {
my $footer = $FlickrFooterTemplate;
$footer =~ s/(\$[a-zA-Z\d]+)/"defined $1 ? $1 : ''"/gee;
$footer =~ s/(\$[a-zA-Z\d]+)/"defined $1 ? $1 : ''"/eeg;
$result .= $footer;
return $result;
@@ -192,7 +192,7 @@ sub FlickrPhoto {
my $output = $FlickrImageTemplate;
$output =~ s/(\$[a-zA-Z\d]+)/"defined $1 ? $1 : ''"/gee;
$output =~ s/(\$[a-zA-Z\d]+)/"defined $1 ? $1 : ''"/eeg;
return $output
}


@@ -242,13 +242,13 @@ sub FootnotesRule {
# Footnotes and the set of all footnotes must be marked so as to ensure their
# reevaluation, as each of the footnotes might contain Wiki markup requiring
# reevaluation (like, say, free links).
if (m/\G($FootnotePattern)(?=([ \t]*$FootnotePattern)?)/gcos) {
if (m/\G($FootnotePattern)(?=([ \t]*$FootnotePattern)?)/cgs) {
Dirty($1); # do not cache the prefixing "\G"
my $footnote_text = $2;
my $is_adjacent_footnote = defined $3;
# A number range (e.g., "2-5") of references to other footnotes.
if ($footnote_text =~ m/^(\d+)-(\d+)$/o) {
if ($footnote_text =~ m/^(\d+)-(\d+)$/) {
my ($footnote_number_first, $footnote_number_last) = ($1, $2);
# '&#x2013;', below, is the HTML entity for a Unicode en-dash.
print $q->a({-href=> '#footnotes' .$footnote_number_first,
@@ -261,7 +261,7 @@ sub FootnotesRule {
}, $footnote_number_last.($is_adjacent_footnote ? ', ' : ''));
}
# A number (e.g., "5") implying reference to another footnote.
elsif ($footnote_text =~ m/^(\d+)$/o) {
elsif ($footnote_text =~ m/^(\d+)$/) {
my $footnote_number = $1;
print $q->a({-href=> '#footnotes' .$footnote_number,
-title=> 'Footnote #'.$footnote_number,
@@ -285,7 +285,7 @@ sub FootnotesRule {
return '';
}
# The "<footnotes>" list of all footnotes at the foot of a page.
elsif ($bol && m/\G($FootnotesPattern)/gcios) {
elsif ($bol && m/\G($FootnotesPattern)/cgis) {
Clean(CloseHtmlEnvironments());
Dirty($1); # do not cache the prefixing "\G"


@@ -12,16 +12,16 @@ our ($q, $OpenPageName, @MyRules, $CrossbarPageName);
push(@MyRules, \&FormsRule);
sub FormsRule {
if (-f GetLockedPageFile($OpenPageName) or (InElement('div', '^class="crossbar"$') and
-f GetLockedPageFile($CrossbarPageName))) {
if (IsFile(GetLockedPageFile($OpenPageName)) or (InElement('div', '^class="crossbar"$') and
IsFile(GetLockedPageFile($CrossbarPageName)))) {
if (/\G(\&lt;form.*?\&lt;\/form\&gt;)/cgs) {
my $form = $1;
my $oldpos = pos;
Clean(CloseHtmlEnvironments());
Dirty($form);
$form =~ s/\%([a-z]+)\%/GetParam($1)/ge;
$form =~ s/\%([a-z]+)\%/GetParam($1)/eg;
$form =~ s/\$([a-z]+)\$/$q->span({-class=>'param'}, GetParam($1))
.$q->input({-type=>'hidden', -name=>$1, -value=>GetParam($1)})/ge;
.$q->input({-type=>'hidden', -name=>$1, -value=>GetParam($1)})/eg;
print UnquoteHtml($form);
pos = $oldpos;
return AddHtmlEnvironment('p');


@@ -163,8 +163,7 @@ sub GdSecurityImageGenerate {
my ($imgData) = $img->out(force => 'png');
my $ticketId = Digest::MD5::md5_hex(rand());
CreateDir($GdSecurityImageDir);
my $file = GdSecurityImageGetImageFile($ticketId);
open my $fh, ">:raw", $file
open my $fh, ">:raw", encode_utf8(GdSecurityImageGetImageFile($ticketId))
or ReportError(Ts('Image storing failed. (%s)', $!), '500 INTERNAL SERVER ERROR');
print $fh $imgData;
#print $fh $png; ### experimental ###
@@ -187,9 +186,7 @@ sub GdSecurityImageIsValidId {
}
sub GdSecurityImageReadImageFile {
my $file = shift;
utf8::encode($file); # filenames are bytes!
if (open(my $IN, '<:raw', $file)) {
if (open(my $IN, '<:raw', encode_utf8(shift))) {
local $/ = undef; # Read complete files
my $data=<$IN>;
close $IN;
@@ -211,7 +208,7 @@ sub GdSecurityImageDoImage {
print $q->header(-type=>'image/png');
print $data;
unlink(GdSecurityImageGetImageFile($id));
Unlink(GdSecurityImageGetImageFile($id));
}
sub GdSecurityImageCleanup {
@@ -219,10 +216,10 @@ sub GdSecurityImageCleanup {
if (!GdSecurityImageIsValidId($id)) {
return;
}
my @files = (bsd_glob("$GdSecurityImageDir/*.png"), bsd_glob("$GdSecurityImageDir/*.ticket"));
my @files = (Glob("$GdSecurityImageDir/*.png"), Glob("$GdSecurityImageDir/*.ticket"));
foreach my $file (@files) {
if ($Now - (stat $file)[9] > $GdSecurityImageDuration) {
unlink($file);
if ($Now - Modified($file) > $GdSecurityImageDuration) {
Unlink($file);
}
}
}
@@ -240,9 +237,9 @@ sub GdSecurityImageCheck {
if ($answer ne '' && GdSecurityImageIsValidId($id)) {
my ($status, $data) = ReadFile(GdSecurityImageGetTicketFile($id));
if ($status) {
my %page = ParseData($data);
if ($page{generation_time} + $GdSecurityImageDuration > $Now) {
if ($answer eq $page{string}) {
my $page = ParseData($data);
if ($page->{generation_time} + $GdSecurityImageDuration > $Now) {
if ($answer eq $page->{string}) {
$GdSecurityImageId = '';
if (!$GdSecurityImageRememberAnswer) {
SetParam('gd_security_image_id', '');
@@ -255,7 +252,7 @@ sub GdSecurityImageCheck {
}
if (GdSecurityImageIsValidId($id)) {
unlink(GdSecurityImageGetTicketFile($id));
Unlink(GdSecurityImageGetTicketFile($id));
}
$GdSecurityImageId = GdSecurityImageGenerate();

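The systematic replacement of `-f`, `-d`, `unlink`, `bsd_glob`, and `(stat ...)[9]` with `IsFile`, `IsDir`, `Unlink`, `Glob`, and `Modified` throughout these files suggests wrappers that encode Perl's character strings to UTF-8 bytes before every filesystem call (matching the "filenames are bytes!" comment removed above). Hypothetical minimal versions of such wrappers — the real Oddmuse helpers may differ:

```perl
use strict;
use warnings;
use Encode qw(encode_utf8 decode_utf8);
use File::Glob qw(bsd_glob);

# Filenames are bytes: encode on the way out, decode on the way in.
sub IsFile   { -f encode_utf8($_[0]) }
sub IsDir    { -d encode_utf8($_[0]) }
sub Unlink   { unlink map { encode_utf8($_) } @_ }
sub Modified { (stat encode_utf8($_[0]))[9] }
sub Glob     { map { decode_utf8($_) } bsd_glob(encode_utf8($_[0])) }

# A non-ASCII filename exercises the encoding path.
my $name = "wrapper-demo-caf\x{e9}.tmp";
open my $fh, '>', encode_utf8($name) or die "cannot create: $!";
close $fh;
my $existed = IsFile($name);
Unlink($name);
print $existed ? "seen and removed\n" : "not seen\n";
```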

@@ -30,18 +30,18 @@ $GitMail = 'unknown@oddmuse.org';
sub GitCommit {
my ($message, $author) = @_;
my $oldDir = cwd;
chdir("$DataDir/page");
ChangeDir("$DataDir/page");
capture {
system($GitBinary, qw(add -A));
system($GitBinary, qw(commit -q -m), $message, "--author=$author <$GitMail>");
};
chdir($oldDir);
ChangeDir($oldDir);
}
sub GitInitRepository {
return if -d "$DataDir/page/.git";
return if IsDir("$DataDir/page/.git");
capture {
system($GitBinary, qw(init -q --), "$DataDir/page");
system($GitBinary, qw(init -q --), encode_utf8("$DataDir/page"));
};
GitCommit('Initial import', 'Oddmuse');
}


@@ -80,7 +80,7 @@ sub GitRun {
my $exitStatus;
# warn join(' ', $GitBinary, @_) . "\n";
chdir($GitRepo);
ChangeDir($GitRepo);
if ($GitDebug) {
# TODO use ToString here
# capture the output of the git comand in a temporary file
@@ -99,7 +99,7 @@ sub GitRun {
} else {
$exitStatus = system($GitBinary, @_);
}
chdir($oldDir);
ChangeDir($oldDir);
return $exitStatus;
}
@@ -108,7 +108,7 @@ sub GitInitVariables {
}
sub GitInitRepository {
return if -d "$GitRepo/.git";
return if IsDir("$GitRepo/.git");
my $exception = shift;
CreateDir($GitRepo);
GitRun(qw(init --quiet));
@@ -187,17 +187,16 @@ sub DoGitCleanup {
}
sub GitCleanup {
if (-d $GitRepo) {
if (IsDir($GitRepo)) {
print $q->p('Git cleanup starting');
AllPagesList();
# delete all the files including all the files starting with a dot
opendir(DIR, $GitRepo) or ReportError("cannot open directory $GitRepo: $!");
opendir(DIR, encode_utf8($GitRepo)) or ReportError("cannot open directory $GitRepo: $!");
foreach my $file (readdir(DIR)) {
my $name = $file;
utf8::decode($name); # filenames are bytes
my $name = decode_utf8($file);
next if $file eq '.git' or $file eq '.' or $file eq '..' or $IndexHash{$name};
print $q->p("Deleting left over file $name");
unlink "$GitRepo/$file" or ReportError("cannot delete $GitRepo/$name: $!");
Unlink("$GitRepo/$file") or ReportError("cannot delete $GitRepo/$name: $!");
}
closedir DIR;
# write all the files again, just to be sure

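The GitCleanup hunk applies the same byte/character rule to directory handles: the name passed to `opendir` is encoded, and each name coming back from `readdir` is decoded before being compared against decoded page names in `%IndexHash`. A standalone sketch of that pattern:

```perl
use strict;
use warnings;
use Encode qw(encode_utf8 decode_utf8);

# Directory handles follow the same rule as file names: encode the
# name handed to opendir, decode every name readdir returns.
my $dir = ".";
opendir(my $dh, encode_utf8($dir)) or die "cannot open directory $dir: $!";
my @names;
for my $file (readdir($dh)) {
    my $name = decode_utf8($file);          # bytes -> characters
    next if $name eq '.' or $name eq '..';  # skip the usual entries
    push @names, $name;
}
closedir $dh;
print scalar(@names), " entries in $dir\n";
```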

@@ -56,7 +56,7 @@ sub GooglePlusPrintFooter {
return q{
<!-- start of Google+ -->
<script type="text/javascript">
function loadScript(jssource,link_id) {
function loadScript(jssource) {
// add javascript
var jsnode = document.createElement('script');
jsnode.setAttribute('type','text/javascript');
@@ -66,15 +66,24 @@ function loadScript(jssource,link_id) {
var butn = document.createElement('div');
butn.setAttribute('class', 'g-plusone');
butn.setAttribute('id', 'my_plusone');
var link = document.getElementById(link_id);
var link = document.getElementById('plus1');
link.parentNode.insertBefore(butn, link);
// hide the link
link.innerHTML = "";
// when looking at action=plusone
var ul = document.getElementById('plus1s');
var children = ul.children;
for (var i = 0; i < children.length; i++) {
var li = children[i];
butn = document.createElement('g:plusone');
butn.setAttribute('href', li.firstElementChild.getAttribute('href'));
butn.setAttribute('id', 'my_plusone' + i);
li.appendChild(butn);
}
}
var plus1source = "https://apis.google.com/js/plusone.js";
</script>
<p id="plus1">
<a href="javascript:loadScript(plus1source,'plus1')">
<a href="javascript:loadScript('https://apis.google.com/js/plusone.js')">
<img src="/pics/plusone-h24.png" alt="Show Google +1" />
</a>
</p>
@@ -103,11 +112,9 @@ sub DoPlusOne {
push(@pages, $id) if $id =~ /^\d\d\d\d-\d\d-\d\d/;
}
splice(@pages, 0, $#pages - 19); # last 20 items
print "<ul>";
print '<ul id="plus1s">';
foreach my $id (@pages) {
my $url = ScriptUrl(UrlEncode($id));
print $q->li(GetPageLink($id),
qq{ <g:plusone href="$url"></g:plusone>});
print $q->li(GetPageLink($id), ' ');
}
print "</ul>";
print $q->end_div();


@@ -39,7 +39,7 @@ sub GotobarInit {
@UserGotoBarPages = ();
$UserGotoBar = '';
my $count = 0;
while ($Page{text} =~ m/($LinkPattern|\[\[$FreeLinkPattern\]\]|\[\[$FreeLinkPattern\|([^\]]+)\]\]|\[$InterLinkPattern\s+([^\]]+?)\]|\[$FullUrlPattern[|[:space:]]([^\]]+?)\])/og) {
while ($Page{text} =~ m/($LinkPattern|\[\[$FreeLinkPattern\]\]|\[\[$FreeLinkPattern\|([^\]]+)\]\]|\[$InterLinkPattern\s+([^\]]+?)\]|\[$FullUrlPattern[|[:space:]]([^\]]+?)\])/g) {
my $page = $2||$3||$4||$6||$8;
my $text = $5||$7||$9;
$UserGotoBar .= ' ' if $UserGotoBar;


@@ -29,7 +29,7 @@ my $gravatar_regexp = "\\[\\[gravatar:(?:$FullUrlPattern )?([^\n:]+):([0-9a-f]+)
push(@MyRules, \&GravatarRule);
sub GravatarRule {
if ($bol && m!\G$gravatar_regexp!cog) {
if ($bol && m!\G$gravatar_regexp!cg) {
my $url = $1;
my $gravatar = "https://secure.gravatar.com/avatar/$3";
my $name = FreeToNormal($2);
@@ -53,7 +53,7 @@ sub GravatarFormAddition {
return $html unless $type eq 'comment';
my $addition = $q->span({-class=>'mail'},
$q->label({-for=>'mail'}, T('Email: '))
$q->label({-for=>'mail'}, T('Email:') . ' ')
. ' ' . $q->textfield(-name=>'mail', -id=>'mail',
-default=>GetParam('mail', '')));
$html =~ s!(name="homepage".*?)</p>!$1 $addition</p>!i;
@@ -90,6 +90,6 @@ sub AddGravatar {
sub GravatarNewGetSummary {
my $summary = GravatarOldGetSummary(@_);
$summary =~ s/^$gravatar_regexp *//o;
$summary =~ s/^$gravatar_regexp *//;
return $summary;
}


@@ -45,7 +45,7 @@ sub PrintGrep {
foreach my $id (AllPagesList()) {
my $text = GetPageContent($id);
next if (TextIsFile($text)); # skip files
while ($text =~ m{($regexp)}ig) {
while ($text =~ m{($regexp)}gi) {
print $q->li(GetPageLink($id) . ': ' . $1);
}
}


@@ -43,7 +43,7 @@ sub HtmlTemplate {
my $type = shift;
# return header.de.html, or header.html, or error.html, or report an error...
foreach my $f ((map { "$type.$_" } HtmlTemplateLanguage()), $type, "error") {
return "$HtmlTemplateDir/$f.html" if -r "$HtmlTemplateDir/$f.html";
return "$HtmlTemplateDir/$f.html" if IsFile("$HtmlTemplateDir/$f.html");
}
ReportError(Tss('Could not find %1.html template in %2', $type, $HtmlTemplateDir),
'500 INTERNAL SERVER ERROR');


@@ -41,7 +41,7 @@ $RuleOrder{\&HeadersRule} = 95;
sub HeadersRule {
my $oldpos = pos;
if ($bol && (m/\G((.+?)[ \t]*\n(---+|===+)[ \t]*\n)/gc)) {
if ($bol && (m/\G((.+?)[ \t]*\n(---+|===+)[ \t]*\n)/cg)) {
my $html = CloseHtmlEnvironments() . ($PortraitSupportColorDiv ? '</div>' : '');
if (substr($3,0,1) eq '=') {
$html .= $q->h2($2);


@@ -32,7 +32,7 @@ push(@MyRules, \&HeadlinesRule);
$HeadlineNumber = 20;
sub HeadlinesRule {
if (m/\G(\&lt;headlines(:(\d+))?\&gt;)/gci) {
if (m/\G(\&lt;headlines(:(\d+))?\&gt;)/cgi) {
if (($3) and ($3>0)) {$HeadlineNumber = $3;};
Clean(CloseHtmlEnvironments());
Dirty($1);


@@ -1212,7 +1212,7 @@ sub GetHibernalArchiveMonth {
~e;
$html_month =~ s~( {1,2})(\d{1,2})\b~
$1.GetHibernalArchiveMonthDay($post_name_regexp, $year, $month, $2)
~ge;
~eg;
# Float the HTML for each month horizontally past the month preceding it;
# failure to float months in this manner causes these months to stack


@@ -87,7 +87,7 @@ sub GetActionHtmlTemplate {
my $action = GetParam('action', 'browse');
# return browse.de.html, or browse.html, or error.html, or report an error...
foreach my $f ((map { "$action.$_" } HtmlTemplateLanguage()), $action, "error") {
return "$HtmlTemplateDir/$f.html" if -r "$HtmlTemplateDir/$f.html";
return "$HtmlTemplateDir/$f.html" if IsFile("$HtmlTemplateDir/$f.html");
}
ReportError(Tss('Could not find %1.html template in %2', $action, $HtmlTemplateDir),
'500 INTERNAL SERVER ERROR');


@@ -31,7 +31,7 @@ push(@MyRules, \&HtmlLinksRule);
$RuleOrder{\&HtmlLinksRule} = 105;
sub HtmlLinksRule {
if (-f GetLockedPageFile($OpenPageName)) {
if (IsFile(GetLockedPageFile($OpenPageName))) {
$HtmlLinks = 1;
} else {
$HtmlLinks = 0;


@@ -32,7 +32,7 @@ push(@MyRules, \&ImageSupportRule);
sub ImageSupportRule {
my $result = undef;
if (m!\G\[\[image((/[a-z]+)*)( external)?:\s*([^]|]+?)\s*(\|[^]|]+?)?\s*(\|[^]|]*?)?\s*(\|[^]|]*?)?\s*(\|[^]|]*?)?\s*\]\](\{([^}]+)\})?!gc) {
if (m!\G\[\[image((/[a-z]+)*)( external)?:\s*([^]|]+?)\s*(\|[^]|]+?)?\s*(\|[^]|]*?)?\s*(\|[^]|]*?)?\s*(\|[^]|]*?)?\s*\]\](\{([^}]+)\})?!cg) {
my $oldpos = pos;
my $class = 'image' . $1;
my $external = $3;


@@ -32,7 +32,7 @@ sub DivFooRule {
my $str = $1;
CreateDir($ImagifyDir);
my $fileName = sha256_hex($str) . '.' . $ImagifyFormat;
system('convert', %ImagifyParams, "caption:$str", "$ImagifyDir/$fileName") unless -e "$ImagifyDir/$fileName";
system('convert', %ImagifyParams, "caption:$str", "$ImagifyDir/$fileName") unless IsFile("$ImagifyDir/$fileName");
my $src = $ScriptName . "/imagify/" . UrlEncode($fileName);
return CloseHtmlEnvironments() . $q->img({-class => 'imagify', -src => $src, -alt => '(rendered text)'}) . AddHtmlEnvironment('p');
}


@@ -33,7 +33,7 @@ $IrcLinkNick = 0;
# This adds an extra <br> at the beginning. Alternatively, add it to
# the last line, or only add it when required.
sub IrcRule {
if ($bol && m/\G(?:\[?(\d\d?:\d\d(?:am|pm)?)\]?)?\s*&lt;($IrcNickRegexp)&gt; ?/gc) {
if ($bol && m/\G(?:\[?(\d\d?:\d\d(?:am|pm)?)\]?)?\s*&lt;($IrcNickRegexp)&gt; ?/cg) {
my ($time, $nick) = ($1, $2);
my ($error) = ValidId($nick);
# if we're in a dl, close the open dd but not the dl. (if we're


@@ -149,7 +149,7 @@ sub JoinerGetPasswordHash {
sub JoinerRequestLockOrError {
my ($name) = @_;
# 10 tries, 3 second wait, die on error
return RequestLockDir($name, 0, 10, 3, 1);
return RequestLockDir($name, 10, 3, 1);
}
sub JoinerGetEmailFile {
@@ -174,18 +174,17 @@ sub JoinerCreateAccount {
}
my ($email_status, $email_data) = ReadFile(JoinerGetEmailFile($email));
my %email_page = ();
if ($email_status) {
%email_page = ParseData($email_data);
if ($email_page{confirmed}) {
my $email_page = ParseData($email_data);
if ($email_page->{confirmed}) {
return Ts('The email address %s has already been used.', $email);
}
if ($email_page{registration_time} + $JoinerWait > $Now) {
my $min = 1 + int(($email_page{registration_time} + $JoinerWait - $Now) / 60);
if ($email_page->{registration_time} + $JoinerWait > $Now) {
my $min = 1 + int(($email_page->{registration_time} + $JoinerWait - $Now) / 60);
return Ts('Wait %s minutes before try again.', $min);
}
}
%email_page = ();
my %email_page = ();
$email_page{username} = $username;
$email_page{email} = $email;
$email_page{confirmed} = 0;
@@ -215,7 +214,7 @@ sub JoinerSendRegistrationConfirmationEmail {
print $EMAIL "From: $JoinerEmailSenderAddress\n";
print $EMAIL "Subject: $SiteName " . T('Registration Confirmation') . "\n";
print $EMAIL "\n";
print $EMAIL T('Visit the link blow to confirm registration.') . "\n";
print $EMAIL T('Visit the link below to confirm registration.') . "\n";
print $EMAIL "\n";
print $EMAIL "$link\n";
print $EMAIL "\n";
@@ -468,37 +467,37 @@ sub JoinerDoConfirmRegistration {
JoinerShowRegistrationConfirmationFailed();
return;
}
my %page = ParseData($data);
my $page = ParseData($data);
if ($key ne $page{key}) {
if ($key ne $page->{key}) {
$JoinerMessage = T('Invalid key.');
JoinerShowRegistrationConfirmationFailed();
return;
}
if ($page{registration_time} + $JoinerWait < $Now) {
if ($page->{registration_time} + $JoinerWait < $Now) {
$JoinerMessage = T('The key expired.');
JoinerShowRegistrationConfirmationFailed();
return;
}
$page{key} = '';
$page{confirmed} = 1;
$page->{key} = '';
$page->{confirmed} = 1;
JoinerRequestLockOrError('joiner');
CreateDir($JoinerDir);
WriteStringToFile(JoinerGetAccountFile($username), EncodePage(%page));
WriteStringToFile(JoinerGetAccountFile($username), EncodePage(%$page));
ReleaseLockDir('joiner');
my $email = $page{email};
my $email = $page->{email};
JoinerRequestLockOrError('joiner');
my ($email_status, $email_data) = ReadFile(JoinerGetEmailFile($email));
ReleaseLockDir('joiner');
if ($email_status) {
my %email_page = ParseData($email_data);
$email_page{confirmed} = 1;
my $email_page = ParseData($email_data);
$email_page->{confirmed} = 1;
JoinerRequestLockOrError('joiner');
CreateDir($JoinerEmailDir);
WriteStringToFile(JoinerGetEmailFile($email), EncodePage(%email_page));
WriteStringToFile(JoinerGetEmailFile($email), EncodePage(%$email_page));
ReleaseLockDir('joiner');
}
@@ -570,41 +569,41 @@ sub JoinerDoProcessLogin {
JoinerDoLogin();
return;
}
my %page = ParseData($data);
my $page = ParseData($data);
my $hash = JoinerGetPasswordHash($password);
if ($hash eq $page{password}) {
$page{recover} = 0;
if ($hash eq $page->{password}) {
$page->{recover} = 0;
SetParam('joiner_recover', 0);
} elsif ($key ne '' && $key eq $page{recover_key}) {
if ($page{recover_time} + $JoinerWait < $Now) {
} elsif ($key ne '' && $key eq $page->{recover_key}) {
if ($page->{recover_time} + $JoinerWait < $Now) {
$JoinerMessage = T('The key expired.');
JoinerDoLogin();
return;
}
$page{recover} = 1;
$page->{recover} = 1;
SetParam('joiner_recover', 1);
} else {
$JoinerMessage = T('Login failed.');
JoinerDoLogin();
return;
}
if ($page{banned}) {
if ($page->{banned}) {
$JoinerMessage = T('You are banned.');
JoinerDoLogin();
return;
}
if (!$page{confirmed}) {
if (!$page->{confirmed}) {
$JoinerMessage = T('You must confirm email address.');
JoinerDoLogin();
return;
}
my $session = Digest::MD5::md5_hex(rand());
$page{session} = $session;
$page->{session} = $session;
JoinerRequestLockOrError('joiner');
CreateDir($JoinerDir);
WriteStringToFile(JoinerGetAccountFile($username), EncodePage(%page));
WriteStringToFile(JoinerGetAccountFile($username), EncodePage(%$page));
ReleaseLockDir('joiner');
SetParam('username', $username);
@@ -617,7 +616,7 @@ sub JoinerDoProcessLogin {
print Ts('%s has logged in.', $username);
print $q->end_p();
if ($page{recover}) {
if ($page->{recover}) {
print $q->start_p();
print T('You should set new password immediately.');
print $q->end_p();
@@ -735,9 +734,9 @@ sub JoinerDoProcessChangePassword {
JoinerDoChangePassword();
return;
}
my %page = ParseData($data);
my $page = ParseData($data);
my $hash = JoinerGetPasswordHash($current_password);
if (!$page{recover} && $hash ne $page{password}) {
if (!$page->{recover} && $hash ne $page->{password}) {
$JoinerMessage = T('Current Password:') . ' ' . T('Password is wrong.');
JoinerDoChangePassword();
return;
@@ -754,12 +753,12 @@ sub JoinerDoProcessChangePassword {
return;
}
$page{password} = JoinerGetPasswordHash($new_password);
$page{key} = '';
$page{recover} = '';
$page->{password} = JoinerGetPasswordHash($new_password);
$page->{key} = '';
$page->{recover} = '';
JoinerRequestLockOrError('joiner');
CreateDir($JoinerDir);
WriteStringToFile(JoinerGetAccountFile($username), EncodePage(%page));
WriteStringToFile(JoinerGetAccountFile($username), EncodePage(%$page));
ReleaseLockDir('joiner');
SetParam('joiner_recover', 0);
@@ -823,9 +822,9 @@ sub JoinerDoProcessForgotPassword {
JoinerDoForgotPassword();
return;
}
my %email_page = ParseData($email_data);
my $email_page = ParseData($email_data);
my $username = $email_page{username};
my $username = $email_page->{username};
JoinerRequestLockOrError('joiner');
my ($status, $data) = ReadFile(JoinerGetAccountFile($username));
ReleaseLockDir('joiner');
@@ -834,27 +833,27 @@ sub JoinerDoProcessForgotPassword {
JoinerDoForgotPassword();
return;
}
my %page = ParseData($data);
my $page = ParseData($data);
if ($email ne $page{email}) {
if ($email ne $page->{email}) {
$JoinerMessage = T('The mail address is not valid anymore.');
JoinerDoForgotPassword();
return;
}
if ($page{recover_time} + $JoinerWait > $Now) {
my $min = 1 + int(($page{recover_time} + $JoinerWait - $Now) / 60);
if ($page->{recover_time} + $JoinerWait > $Now) {
my $min = 1 + int(($page->{recover_time} + $JoinerWait - $Now) / 60);
$JoinerMessage = Ts('Wait %s minutes before try again.', $min);
JoinerDoForgotPassword();
return;
}
my $key = Digest::MD5::md5_hex($JoinerGeneratorSalt . rand());
$page{recover_time} = $Now;
$page{recover_key} = $key;
$page->{recover_time} = $Now;
$page->{recover_key} = $key;
JoinerRequestLockOrError('joiner');
CreateDir($JoinerDir);
WriteStringToFile(JoinerGetAccountFile($username), EncodePage(%page));
WriteStringToFile(JoinerGetAccountFile($username), EncodePage(%$page));
ReleaseLockDir('joiner');
JoinerSendRecoverAccountEmail($email, $username, $key);
@@ -922,8 +921,8 @@ sub JoinerDoProcessChangeEmail {
my ($email_status, $email_data) = ReadFile(JoinerGetEmailFile($email));
ReleaseLockDir('joiner');
if ($email_status) {
my %email_page = ParseData($email_data);
if ($email_page{confirmed} && $email_page{username} ne $username) {
my $email_page = ParseData($email_data);
if ($email_page->{confirmed} && $email_page->{username} ne $username) {
$JoinerMessage = T('Email:') . ' ' .
Ts('The email address %s has already been used.', $email);
JoinerDoChangeEmail();
@@ -939,29 +938,29 @@ sub JoinerDoProcessChangeEmail {
JoinerDoChangeEmail();
return;
}
my %page = ParseData($data);
my $page = ParseData($data);
if ($page{change_email_time} + $JoinerWait > $Now) {
my $min = 1 + int(($page{change_email_time} + $JoinerWait - $Now) / 60);
if ($page->{change_email_time} + $JoinerWait > $Now) {
my $min = 1 + int(($page->{change_email_time} + $JoinerWait - $Now) / 60);
$JoinerMessage = Ts('Wait %s minutes before try again.', $min);
JoinerDoChangeEmail();
return;
}
my $hash = JoinerGetPasswordHash($password);
if ($hash ne $page{password}) {
if ($hash ne $page->{password}) {
$JoinerMessage = T('Password:') . ' ' . T('Password is wrong.');
JoinerDoChangeEmail();
return;
}
my $key = Digest::MD5::md5_hex(rand());
$page{change_email} = $email;
$page{change_email_key} = $key;
$page{change_email_time} = $Now;
$page->{change_email} = $email;
$page->{change_email_key} = $key;
$page->{change_email_time} = $Now;
JoinerRequestLockOrError('joiner');
CreateDir($JoinerDir);
WriteStringToFile(JoinerGetAccountFile($username), EncodePage(%page));
WriteStringToFile(JoinerGetAccountFile($username), EncodePage(%$page));
ReleaseLockDir('joiner');
JoinerSendChangeEmailEmail($email, $username, $key);
@@ -1012,22 +1011,22 @@ sub JoinerDoConfirmEmail {
JoinerShowConfirmEmailFailed();
return;
}
my %page = ParseData($data);
my $page = ParseData($data);
if ($key ne $page{change_email_key}) {
if ($key ne $page->{change_email_key}) {
$JoinerMessage = T('Invalid key.');
JoinerShowConfirmEmailFailed();
return;
}
my $new_email = $page{change_email};
$page{email} = $new_email;
$page{change_email} = '';
$page{change_email_key} = '';
$page{change_email_time} = '';
my $new_email = $page->{change_email};
$page->{email} = $new_email;
$page->{change_email} = '';
$page->{change_email_key} = '';
$page->{change_email_time} = '';
JoinerRequestLockOrError('joiner');
CreateDir($JoinerDir);
WriteStringToFile(JoinerGetAccountFile($username), EncodePage(%page));
WriteStringToFile(JoinerGetAccountFile($username), EncodePage(%$page));
ReleaseLockDir('joiner');
my %email_page = ();
@@ -1128,30 +1127,30 @@ sub JoinerDoProcessBan {
JoinerDoBan();
return;
}
my %page = ParseData($data);
my $page = ParseData($data);
if ($ban) {
if ($page{banned}) {
if ($page->{banned}) {
$JoinerMessage = Ts('%s is already banned.', $username);
JoinerDoBan();
return;
}
$page{banned} = 1;
$page{session} = '';
$page->{banned} = 1;
$page->{session} = '';
$JoinerMessage = Ts('%s has been banned.', $username);
} else {
if (!$page{banned}) {
if (!$page->{banned}) {
$JoinerMessage = Ts('%s is not banned.', $username);
JoinerDoBan();
return;
}
$page{banned} = 0;
$page->{banned} = 0;
$JoinerMessage = Ts('%s has been unbanned.', $username);
}
JoinerRequestLockOrError('joiner');
CreateDir($JoinerDir);
WriteStringToFile(JoinerGetAccountFile($username), EncodePage(%page));
WriteStringToFile(JoinerGetAccountFile($username), EncodePage(%$page));
ReleaseLockDir('joiner');
JoinerDoBan();
@@ -1178,16 +1177,16 @@ sub JoinerIsLoggedIn {
$JoinerLoggedIn = 0;
return $JoinerLoggedIn;
}
my %page = ParseData($data);
if (!$page{confirmed}) {
my $page = ParseData($data);
if (!$page->{confirmed}) {
$JoinerLoggedIn = 0;
return $JoinerLoggedIn;
}
if ($session ne $page{session}) {
if ($session ne $page->{session}) {
$JoinerLoggedIn = 0;
return $JoinerLoggedIn;
}
if ($page{banned}) {
if ($page->{banned}) {
$JoinerLoggedIn = 0;
return $JoinerLoggedIn;
}
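The hunks above convert every `ParseData` caller from a `%page` hash to a `$page` hash reference, so `$page{key}` becomes `$page->{key}` and `EncodePage(%page)` becomes `EncodePage(%$page)`. A minimal sketch of the caller-side difference, using a deliberately simplified `ParseData` (the real Oddmuse one parses `$FS`-separated data, not `key: value` lines):

```perl
use strict;
use warnings;

# Simplified stand-in for ParseData: the point is the return type, a
# hash *reference*, so callers use arrow syntax.
sub ParseData {
  my $data = shift;
  my %result;
  for my $line (split /\n/, $data) {
    my ($key, $value) = split /: /, $line, 2;
    $result{$key} = $value;
  }
  return \%result;
}

my $page = ParseData("username: alice\nconfirmed: 1");
print "banned\n" if $page->{banned};     # key absent: prints nothing
print "ok\n"     if $page->{confirmed};  # prints ok
```

Accessing a missing key such as `banned` simply yields `undef`, which is why the banned/confirmed checks above keep working after the conversion.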


@@ -33,7 +33,7 @@ our ($q, @HtmlStack, @MyRules, $FullUrl);
push(@MyRules, \&LangRule);
sub LangRule {
if (m/\G\[([a-z][a-z])\]/gc) {
if (m/\G\[([a-z][a-z])\]/cg) {
my $html;
$html .= "</" . shift(@HtmlStack) . ">" if $HtmlStack[0] eq 'span';
return $html . AddHtmlEnvironment('span', "lang=\"$1\"") . "[$1]";
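The `/gc` to `/cg` change here (and in the modules below) is purely cosmetic: Perl regex modifiers are unordered, so both spellings mean the same thing. What matters for these `\G`-anchored wiki rules is the combination itself: `/g` advances `pos()` on a match, and `/c` preserves `pos()` on a failed match so the next rule can try from the same spot. A small sketch in the style of `LangRule`:

```perl
use strict;
use warnings;

my $text = "[en]rest";
my @tokens;
if ($text =~ m/\G\[([a-z][a-z])\]/cg) {  # identical to /gc
  push @tokens, "lang=$1";               # matches "[en]", pos() now 4
}
if ($text =~ m/\G\d+/cg) {               # fails, but /c keeps pos() at 4
  push @tokens, "number";
}
if ($text =~ m/\Grest/cg) {              # still anchored right after "[en]"
  push @tokens, "rest";
}
print join(',', @tokens), "\n";          # lang=en,rest
```

Without `/c`, the failed digit match would reset `pos()` to the start and the third rule would no longer match.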


@@ -98,7 +98,7 @@ EOT
push(@MyRules, \&LatexRule);
sub LatexRule {
if (m/\G\\\[(\(.*?\))?((.*\n)*?.*?)\\\]/gc) {
if (m/\G\\\[(\(.*?\))?((.*\n)*?.*?)\\\]/cg) {
my $label = $1;
my $latex = $2;
$label =~ s#\(?\)?##g;# Remove the ()'s from the label and convert case
@@ -106,13 +106,13 @@ sub LatexRule {
$eqCounter++;
$eqHash{$label} = $eqCounter;
return &MakeLaTeX("\\begin{displaymath} $latex \\end{displaymath}", "display math",$label);
} elsif (m/\G\$\$((.*\n)*?.*?)\$\$/gc) {
} elsif (m/\G\$\$((.*\n)*?.*?)\$\$/cg) {
return &MakeLaTeX("\$\$ $1 \$\$", $LatexSingleDollars ? "display math" : "inline math");
} elsif ($LatexSingleDollars and m/\G\$((.*\n)*?.*?)\$/gc) {
} elsif ($LatexSingleDollars and m/\G\$((.*\n)*?.*?)\$/cg) {
return &MakeLaTeX("\$ $1 \$", "inline math");
} elsif ($allowPlainLaTeX && m/\G\$\[((.*\n)*?.*?)\]\$/gc) { #Pick up plain LaTeX commands
} elsif ($allowPlainLaTeX && m/\G\$\[((.*\n)*?.*?)\]\$/cg) { #Pick up plain LaTeX commands
return &MakeLaTeX(" $1 ", "LaTeX");
} elsif (m/\GEQ\((.*?)\)/gc) { # Handle references to equations
} elsif (m/\GEQ\((.*?)\)/cg) { # Handle references to equations
my $label = $1;
$label =~ tr/A-Z/a-z/;
if ($eqHash{$label}) {
@@ -131,8 +131,8 @@ sub MakeLaTeX {
# Select which binary to use for conversion of dvi to images
my $useConvert = 0;
if (not -e $dvipngPath) {
if (not -e $convertPath) {
if (not IsFile($dvipngPath)) {
if (not IsFile($convertPath)) {
return "[Error: dvipng binary and convert binary not found at $dvipngPath or $convertPath ]";
}
else {
@@ -155,26 +155,26 @@ sub MakeLaTeX {
}
# check cache
if (not -f "$LatexDir/$hash.png" or -z "$LatexDir/$hash.png") { #If file doesn't exist or is zero bytes
if (not IsFile("$LatexDir/$hash.png") or ZeroSize("$LatexDir/$hash.png")) {
# Then create the image
# read template and replace <math>
mkdir($LatexDir) unless -d $LatexDir;
if (not -f $LatexDefaultTemplateName) {
open (my $F, '>', $LatexDefaultTemplateName) or return '[Unable to write template]';
CreateDir($LatexDir);
if (not IsFile($LatexDefaultTemplateName)) {
open (my $F, '>', encode_utf8($LatexDefaultTemplateName)) or return '[Unable to write template]';
print $F $LatexDefaultTemplate;
close $F;
}
my $template = ReadFileOrDie($LatexDefaultTemplateName);
$template =~ s/<math>/$latex/ig;
$template =~ s/<math>/$latex/gi;
#setup rendering directory
my $dir = "$LatexDir/$hash";
if (-d $dir) {
unlink (bsd_glob('$dir/*'));
if (IsDir($dir)) {
Unlink((Glob("$dir/*")));
} else {
mkdir($dir) or return "[Unable to create $dir]";
CreateDir($dir);
}
chdir ($dir) or return "[Unable to switch to $dir]";
ChangeDir($dir) or return "[Unable to switch to $dir]";
WriteStringToFile ("srender.tex", $template);
my $errorText = qx(latex srender.tex);
@@ -197,16 +197,16 @@ sub MakeLaTeX {
$error = "[dvipng error $? ($output)]" if $?;
}
if (not $error and -f 'srender1.png' and not -z 'srender1.png') {
if (not $error and IsFile('srender1.png') and not ZeroSize('srender1.png')) {
my $png = ReadFileOrDie("srender1.png");
WriteStringToFile ("$LatexDir/$hash.png", $png);
} else {
$error = "[Error retrieving image for $latex]";
}
}
unlink (glob('*'));
chdir ($LatexDir);
rmdir ($dir);
Unlink(glob('*'));
ChangeDir($LatexDir);
RemoveDir($dir);
return $error if $error;
}
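Two patterns recur in the latex.pl hunk above: filenames passed to `open` are wrapped in `encode_utf8`, and raw file tests like `-f`/`-e` are replaced by helpers such as `IsFile`. Perl hands filenames to the OS as raw bytes, so a name held as a decoded character string must be encoded first. A self-contained sketch (the `IsFile` helper here is my assumption of what the Oddmuse wrapper does; the real one may differ):

```perl
use strict;
use warnings;
use utf8;                      # string literals below are decoded characters
use Encode qw(encode_utf8);
use File::Temp qw(tempdir);

# Hypothetical thin wrapper in the spirit of Oddmuse's IsFile: encode the
# decoded name before the builtin file test.
sub IsFile { my $f = shift; return -f encode_utf8($f) }

my $dir  = tempdir(CLEANUP => 1);
my $name = "$dir/Übersicht.tex";   # non-ASCII filename as characters
open(my $out, '>:encoding(UTF-8)', encode_utf8($name)) or die $!;
print $out "\\documentclass{article}\n";
close $out;
print IsFile($name) ? "exists\n" : "missing\n";   # exists
```

Passing the decoded string straight to `open` would create a differently-named file (or fail) depending on the platform, which is exactly the class of bug these commits chase.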


@@ -23,7 +23,7 @@ push(@MyRules, \&LinkAllRule);
$RuleOrder{\&LinkAllRule} = 1000;
sub LinkAllRule {
if (/\G([A-Za-z\x{0080}-\x{fffd}]+)/gc) {
if (/\G([A-Za-z\x{0080}-\x{fffd}]+)/cg) {
my $oldpos = pos;
Dirty($1);
# print the word, or the link to the word


@@ -71,31 +71,31 @@ sub GetLinkList { # for the currently open page
my %links;
foreach my $block (@blocks) {
if (shift(@flags)) { # dirty block and interlinks or normal links
if ($inter and ($BracketText && $block =~ m/^(\[$InterLinkPattern\s+([^\]]+?)\])$/o
or $BracketText && $block =~ m/^(\[\[$FreeInterLinkPattern\|([^\]]+?)\]\])$/o
or $block =~ m/^(\[$InterLinkPattern\])$/o
or $block =~ m/^(\[\[\[$FreeInterLinkPattern\]\]\])$/o
or $block =~ m/^($InterLinkPattern)$/o
or $block =~ m/^(\[\[$FreeInterLinkPattern\]\])$/o)) {
if ($inter and ($BracketText && $block =~ m/^(\[$InterLinkPattern\s+([^\]]+?)\])$/
or $BracketText && $block =~ m/^(\[\[$FreeInterLinkPattern\|([^\]]+?)\]\])$/
or $block =~ m/^(\[$InterLinkPattern\])$/
or $block =~ m/^(\[\[\[$FreeInterLinkPattern\]\]\])$/
or $block =~ m/^($InterLinkPattern)$/
or $block =~ m/^(\[\[$FreeInterLinkPattern\]\])$/)) {
$links{$raw ? $2 : GetInterLink($2, $3)} = 1 if $InterSite{substr($2,0,index($2, ':'))};
} elsif ($link
and (($WikiLinks and $block !~ m/!$LinkPattern/o
and ($BracketWiki && $block =~ m/^(\[$LinkPattern\s+([^\]]+?)\])$/o
or $block =~ m/^(\[$LinkPattern\])$/o
or $block =~ m/^($LinkPattern)$/o))
and (($WikiLinks and $block !~ m/!$LinkPattern/
and ($BracketWiki && $block =~ m/^(\[$LinkPattern\s+([^\]]+?)\])$/
or $block =~ m/^(\[$LinkPattern\])$/
or $block =~ m/^($LinkPattern)$/))
or ($FreeLinks
and ($BracketWiki && $block =~ m/^(\[\[$FreeLinkPattern\|([^\]]+)\]\])$/o
or $block =~ m/^(\[\[\[$FreeLinkPattern\]\]\])$/o
or $block =~ m/^(\[\[$FreeLinkPattern\]\])$/o)))) {
and ($BracketWiki && $block =~ m/^(\[\[$FreeLinkPattern\|([^\]]+)\]\])$/
or $block =~ m/^(\[\[\[$FreeLinkPattern\]\]\])$/
or $block =~ m/^(\[\[$FreeLinkPattern\]\])$/)))) {
$links{$raw ? FreeToNormal($2) : GetPageOrEditLink($2, $3)} = 1;
} elsif ($url and $block =~ m/^\[$FullUrlPattern\]$/og) {
} elsif ($url and $block =~ m/^\[$FullUrlPattern\]$/g) {
$links{$raw ? $1 : GetUrl($1)} = 1;
}
} elsif ($url) { # clean block and url
while ($block =~ m/$UrlPattern/og) {
while ($block =~ m/$UrlPattern/g) {
$links{$raw ? $1 : GetUrl($1)} = 1;
}
while ($block =~ m/\[$FullUrlPattern\s+[^\]]+?\]/og) {
while ($block =~ m/\[$FullUrlPattern\s+[^\]]+?\]/g) {
$links{$raw ? $1 : GetUrl($1)} = 1;
}
}
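The hunk above drops the `/o` modifier from patterns that interpolate variables like `$InterLinkPattern`. With `/o`, the pattern is compiled the first time the match runs and never re-examined, so a later value of the interpolated variable is silently ignored; modern perls cache compiled patterns when the interpolated string is unchanged anyway, so `/o` buys little and risks stale matches. A sketch of the bug:

```perl
use strict;
use warnings;

sub matches_with_o    { my ($p, $id) = @_; return $id =~ /^$p/o ? 1 : 0 }
sub matches_without_o { my ($p, $id) = @_; return $id =~ /^$p/  ? 1 : 0 }

matches_with_o('Comments_', 'Comments_HomePage');  # compiles /^Comments_/ once
my $stale = matches_with_o('Talk_', 'Comments_HomePage');     # still 1 (!)
my $fresh = matches_without_o('Talk_', 'Comments_HomePage');  # 0, as expected
print "stale=$stale fresh=$fresh\n";
```

This matters for Oddmuse because patterns like `$CommentsPrefix` or the link patterns can differ per namespace within one process lifetime.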


@@ -52,7 +52,7 @@ push (@MyRules, \&LinkTagRule, \&LinkDescriptionRule);
sub LinkTagRule { # Process link tags on a page
if ( m/\G$LinkTagMark(.*?)$LinkTagMark/gc) { # find tags
if ( m/\G$LinkTagMark(.*?)$LinkTagMark/cg) { # find tags
my @linktags = split /,\s*/, $1; # push them in array
@linktags = map { # and generate html output:
qq{<a href="$ScriptName?action=linktagsearch;linktag=$_">$_</a>}; # each tag is a link to search all links with that tag
@@ -66,7 +66,7 @@ sub LinkTagRule { # Process link tags on a page
sub LinkDescriptionRule { # Process link descriptions on a page
if ( m/\G$LinkDescMark(.*?)$LinkDescMark/gc) { # find description
if ( m/\G$LinkDescMark(.*?)$LinkDescMark/cg) { # find description
return qq{<span class="$LinkDescClass">$1</span>}; # put it in SPAN block
}
return;
@@ -184,7 +184,7 @@ sub PrintLinkTagMap {
my $tag = $1;
"<li id=\"$tag\">$tag</li>\n<ul>";
}xsge;
}egsx;
$result =~ s/\<\/tag\>/<\/ul>/g;
$result =~ s{
@@ -194,7 +194,7 @@ sub PrintLinkTagMap {
my $name = $2; if ( length $name == 0 ) { $name = $url; } # name (if not present use url instead)
my $description = $3; # and description
"<li><a href=\"$url\">$name</a> <span class=\"$LinkDescClass\">$description</span></li>";
}xsge;
}egsx;
print $result;
}


@@ -39,7 +39,7 @@ sub DoListBannedContent {
print $BannedRegexps . ': ' . scalar(keys(%text_regexps)) . $q->br() . "\n";
PAGE: foreach my $id (@pages) {
OpenPage($id);
my @urls = $Page{text} =~ /$FullUrlPattern/go;
my @urls = $Page{text} =~ /$FullUrlPattern/g;
foreach my $url (@urls) {
foreach my $re (keys %url_regexps) {
if ($url =~ $re) {


@@ -38,7 +38,7 @@ sub DoListLocked {
print $q->start_div({-class=>'content list locked'}), $q->start_p();
}
foreach my $id (AllPagesList()) {
PrintPage($id) if -f GetLockedPageFile($id);
PrintPage($id) if IsFile(GetLockedPageFile($id));
}
if (not $raw) {
print $q->end_p(), $q->end_div();


@@ -31,7 +31,7 @@ $TagListLabel = "tag:";
push(@MyRules, \&ListTagRule);
sub ListTagRule {
if ($bol && /\G(\[\[\!tag\s*(.+)\]\])/gc) {
if ($bol && /\G(\[\[\!tag\s*(.+)\]\])/cg) {
my $tag_text = $2;
my @tags = split /,\s*/, $tag_text;
@tags = map {


@@ -26,7 +26,7 @@ our ($q, $bol, @MyRules, $FreeLinkPattern);
push(@MyRules, \&LiveTemplateRule);
sub LiveTemplateRule {
if ($bol and /\G(&lt;&lt;$FreeLinkPattern\n)/cog) {
if ($bol and /\G(&lt;&lt;$FreeLinkPattern\n)/cg) {
Clean(CloseHtmlEnvironments());
my $str = $1;
my $template = FreeToNormal($2);
@@ -35,12 +35,12 @@ sub LiveTemplateRule {
Dirty($str);
my $oldpos = pos;
my $old_ = $_;
my %hash = ParseData($2);
my $hash = ParseData($2);
my $text = GetPageContent($template);
return $q->p($q->strong(Ts('The template %s is either empty or does not exist.',
$template))) . AddHtmlEnvironment('p') unless $text;
foreach my $key (keys %hash) {
$text =~ s/\$$key\$/$hash{$key}/g;
foreach my $key (keys %$hash) {
$text =~ s/\$$key\$/$hash->{$key}/g;
}
print "<div class=\"template $template\">";
ApplyRules(QuoteHtml($text), 1, 1, undef, 'p');


@@ -18,32 +18,39 @@ use v5.10;
AddModuleDescription('load-lang.pl', 'Language Browser Preferences');
our ($q, %CookieParameters, $ConfigFile, $DataDir, $NamespaceCurrent, @MyInitVariables);
our ($CurrentLanguage, $LoadLanguageDir);
our ($q, %CookieParameters, $ConfigFile, $DataDir, $ModuleDir, $NamespaceCurrent, @MyInitVariables);
our $CurrentLanguage;
our $LoadLanguageDir = "$ModuleDir/translations"; # by default same as in git
$CookieParameters{interface} = '';
my %library= ('bg' => 'bulgarian-utf8.pl',
'de' => 'german-utf8.pl',
'es' => 'spanish-utf8.pl',
'fr' => 'french-utf8.pl',
'fi' => 'finnish-utf8.pl',
'gr' => 'greek-utf8.pl',
'he' => 'hebrew-utf8.pl',
'it' => 'italian-utf8.pl',
'ja' => 'japanese-utf8.pl',
'ko' => 'korean-utf8.pl',
'nl' => 'dutch-utf8.pl',
'pl' => 'polish-utf8.pl',
'pt' => 'portuguese-utf8.pl',
'ro' => 'romanian-utf8.pl',
'ru' => 'russian-utf8.pl',
'se' => 'swedish-utf8.pl',
'sr' => 'serbian-utf8.pl',
'zh' => 'chinese-utf8.pl',
'zh-cn' => 'chinese_cn-utf8.pl',
'zh-tw' => 'chinese-utf8.pl',
);
our %TranslationsLibrary = (
'bg' => 'bulgarian-utf8.pl',
'ca' => 'catalan-utf8.pl',
'de' => 'german-utf8.pl',
'et' => 'estonian-utf8.pl',
'es' => 'spanish-utf8.pl',
'fi' => 'finnish-utf8.pl',
'fr' => 'french-utf8.pl',
'gr' => 'greek-utf8.pl',
'he' => 'hebrew-utf8.pl',
'it' => 'italian-utf8.pl',
'ja' => 'japanese-utf8.pl',
'ko' => 'korean-utf8.pl',
'nl' => 'dutch-utf8.pl',
'pl' => 'polish-utf8.pl',
'pt' => 'portuguese-utf8.pl',
'pt-br' => 'brazilian-portuguese-utf8.pl',
'ro' => 'romanian-utf8.pl',
'ru' => 'russian-utf8.pl',
'se' => 'swedish-utf8.pl',
'sr' => 'serbian-utf8.pl',
'uk' => 'ukrainian-utf8.pl',
'zh' => 'chinese-utf8.pl',
'zh-cn' => 'chinese_cn-utf8.pl',
'zh-tw' => 'chinese-utf8.pl',
);
sub LoadLanguage {
# my $requested_language = "da, en-gb;q=0.8, en;q=0.7";
@@ -65,11 +72,12 @@ sub LoadLanguage {
# . $q->end_html) && exit if GetParam('debug', '');
foreach (@prefs) {
last if $Lang{$_} eq 'en'; # the default
my $file = $library{$Lang{$_}};
my $file = $TranslationsLibrary{$Lang{$_}};
next unless $file; # file is not listed, eg. there is no file for "de-ch"
$file = "$LoadLanguageDir/$file" if defined $LoadLanguageDir;
if (-r $file) {
if (IsFile($file)) {
do $file;
do "$ConfigFile-$Lang{$_}" if -r "$ConfigFile-$Lang{$_}";
do "$ConfigFile-$Lang{$_}" if IsFile("$ConfigFile-$Lang{$_}");
$CurrentLanguage = $Lang{$_};
last;
}
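The `LoadLanguage` comment above hints at the input it handles: an Accept-Language style string such as `"da, en-gb;q=0.8, en;q=0.7"`, turned into an ordered list of candidate keys for `%TranslationsLibrary`. The full parsing code is not shown in this diff, so the following is only a plausible sketch of that step: split on commas, drop q-values, and also try the bare language when given a regional variant:

```perl
use strict;
use warnings;

# Hypothetical preference parser; the real LoadLanguage may differ in
# ordering and q-value handling.
sub ParsePrefs {
  my $header = shift;
  my @prefs;
  for my $part (split /\s*,\s*/, $header) {
    $part =~ s/;.*$//;                          # drop ";q=0.8"
    push @prefs, lc $part;
    push @prefs, $1 if $part =~ /^([a-z]+)-/;   # "en-gb" also tries "en"
  }
  return @prefs;
}

print join(' ', ParsePrefs("da, en-gb;q=0.8, en;q=0.7")), "\n";
```

Each candidate is then looked up in the library hash, so `"de-ch"` falls through to `"de"` even though no `de-ch` file is listed.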


@@ -64,9 +64,9 @@ You can change this expiry time by setting C<$LnCacheHours>.
push (@MyMaintenance, \&LnMaintenance);
sub LnMaintenance {
if (opendir(DIR, $RssDir)) { # cleanup if they should expire anyway
foreach (readdir(DIR)) {
unlink "$RssDir/$_" if $Now - (stat($_))[9] > $LnCacheHours * 3600;
if (opendir(DIR, encode_utf8($RssDir))) { # cleanup if they should expire anyway
foreach my $file (readdir(DIR)) {
Unlink("$RssDir/$file") if $Now - Modified($file) > $LnCacheHours * 3600;
}
closedir DIR;
}
@@ -103,7 +103,7 @@ L<http://ln.taoriver.net/>.
push(@MyRules, \&LocalNamesRule);
sub LocalNamesRule {
if (m/\G\[\[ln:$FullUrlPattern\s*([^\]]*)\]\]/cog) {
if (m/\G\[\[ln:$FullUrlPattern\s*([^\]]*)\]\]/cg) {
# [[ln:url text]], [[ln:url]]
return $q->a({-class=>'url outside ln', -href=>$1}, $2||$1);
}
@@ -144,7 +144,7 @@ sub LocalNamesInit {
$LocalNamesPage = FreeToNormal($LocalNamesPage); # spaces to underscores
$AdminPages{$LocalNamesPage} = 1;
my $data = GetPageContent($LocalNamesPage);
while ($data =~ m/\[$FullUrlPattern\s+([^\]]+?)\]/go) {
while ($data =~ m/\[$FullUrlPattern\s+([^\]]+?)\]/g) {
my ($page, $url) = ($2, $1);
my $id = FreeToNormal($page);
$LocalNames{$id} = $url;
@@ -152,12 +152,12 @@ sub LocalNamesInit {
# Now read data from ln links, checking cache if possible. For all
# URLs not in the cache or with invalid cache, fetch the file again,
# and save it in the cache.
my @ln = $data =~ m/\[\[ln:$FullUrlPattern[^\]]*?\]\]/go;
my @ln = $data =~ m/\[\[ln:$FullUrlPattern[^\]]*?\]\]/g;
my %todo = map {$_, GetLnFile($_)} @ln;
my %data = ();
if (GetParam('cache', $UseCache) > 0) {
foreach my $uri (keys %todo) { # read cached rss files if possible
if ($Now - (stat($todo{$uri}))[9] < $LnCacheHours * 3600) {
if ($Now - Modified($todo{$uri}) < $LnCacheHours * 3600) {
$data{$uri} = ReadFile($todo{$uri});
delete($todo{$uri}); # no need to fetch them below
}
@@ -347,13 +347,13 @@ sub DoLocalNames {
if (GetParam('expand', 0)) {
print "# Local names defined on $LocalNamesPage:\n";
my $data = GetPageContent($LocalNamesPage);
while ($data =~ m/\[$FullUrlPattern\s+([^\]]+?)\]/go) {
while ($data =~ m/\[$FullUrlPattern\s+([^\]]+?)\]/g) {
my ($title, $url) = ($2, $1);
my $id = FreeToNormal($title);
print qq{LN "$title" "$url"\n};
}
print "# Namespace delegations defined on $LocalNamesPage:\n";
while ($data =~ m/\[\[ln:$FullUrlPattern([^\]]*)?\]\]/go) {
while ($data =~ m/\[\[ln:$FullUrlPattern([^\]]*)?\]\]/g) {
my ($title, $url) = ($2, $1);
my $id = FreeToNormal($title);
print qq{NS "$title" "$url"\n};
@@ -396,10 +396,10 @@ sub DoDefine {
$q->start_div({-class=>'content define'}),
GetFormStart(undef, 'get', 'def');
my $go = T('Go!');
print $q->p($q->label({-for=>"defined"}, T('Name: ')),
print $q->p($q->label({-for=>"defined"}, T('Name:') . ' '),
$q->textfield(-name=>"name", -id=>"defined",
-tabindex=>"1", -size=>20));
print $q->p($q->label({-for=>"definition"}, T('URL: ')),
print $q->p($q->label({-for=>"definition"}, T('URL:') . ' '),
$q->textfield(-name=>"link", -id=>"definition",
-tabindex=>"2", -size=>20));
print $q->p($q->submit(-label=>$go, -tabindex=>"3"),
@@ -430,7 +430,7 @@ sub GetWantedPages {
# skip comment pages
if ($CommentsPrefix) {
foreach my $id (keys %WantedPages) {
delete $WantedPages{$id} if $id =~ /^$CommentsPrefix/o; # TODO use $CommentsPattern ?
delete $WantedPages{$id} if $id =~ /^$CommentsPrefix/; # TODO use $CommentsPattern ?
}
}
# now something more complicated: if near-links.pl was loaded, then
@@ -446,7 +446,7 @@ sub GetWantedPages {
# if any wanted pages remain, print them
if (@wanted) {
return $q->div({-class=>'definition'},
$q->p(T('Define external redirect: '),
$q->p(T('Define external redirect:'), ' ',
map { my $page = NormalToFree($_);
ScriptLink('action=define;name='
. UrlEncode($page),


@@ -22,7 +22,7 @@ use v5.10;
AddModuleDescription('login.pl', 'Login Module');
our ($q, %Action, $SiteName, @MyAdminCode, $IndexFile, $DataDir, $FullUrl);
our ($RegistrationForm, $MinimumPasswordLength, $RegistrationsMustBeApproved, $LoginForm, $PasswordFile, $PasswordFileToUse, $PendingPasswordFile, $RequireLoginToEdit, $ConfirmEmailAddress, $UncomfirmedPasswordFile, $EmailSenderAddress, $EmailCommand, $EmailRegExp, $NotifyPendingRegistrations, $EmailConfirmationMessage, $ResetPasswordMessage, $LogoutForm, $ResetForm, $ChangePassForm, $RequireCamelUserName, $UsernameRegExp);
our ($RegistrationForm, $MinimumPasswordLength, $RegistrationsMustBeApproved, $LoginForm, $PasswordFile, $PasswordFileToUse, $PendingPasswordFile, $RequireLoginToEdit, $ConfirmEmailAddress, $UnconfirmedPasswordFile, $EmailSenderAddress, $EmailCommand, $EmailRegExp, $NotifyPendingRegistrations, $EmailConfirmationMessage, $ResetPasswordMessage, $LogoutForm, $ResetForm, $ChangePassForm, $RequireCamelUserName, $UsernameRegExp);
my $EncryptedPassword = "";
@@ -40,7 +40,7 @@ $RegistrationsMustBeApproved = 1 unless defined $RegistrationsMustBeApproved;
$PendingPasswordFile = "$DataDir/pending" unless defined $PendingPasswordFile;
$ConfirmEmailAddress = 1 unless defined $ConfirmEmailAddress;
$UncomfirmedPasswordFile = "$DataDir/uncomfirmed" unless defined $UncomfirmedPasswordFile;
$UnconfirmedPasswordFile = "$DataDir/uncomfirmed" unless defined $UnconfirmedPasswordFile;
$EmailSenderAddress = "fletcher\@freeshell.org" unless defined $EmailSenderAddress;
$EmailCommand = "/usr/sbin/sendmail -oi -t" unless defined $EmailCommand;
@@ -71,7 +71,7 @@ $PasswordFileToUse = $RegistrationsMustBeApproved
? $PendingPasswordFile : $PasswordFile;
$PasswordFileToUse = $ConfirmEmailAddress
? $UncomfirmedPasswordFile : $PasswordFileToUse;
? $UnconfirmedPasswordFile : $PasswordFileToUse;
$RegistrationForm = <<'EOT' unless defined $RegistrationForm;
<p>Your Username should be a CamelCase form of your real name, e.g. JohnDoe.</p>
@@ -199,9 +199,9 @@ sub DoRegister {
my $id = shift;
print GetHeader('', Ts('Register for %s', $SiteName), '');
print '<div class="content">';
$RegistrationForm =~ s/\%([a-z]+)\%/GetParam($1)/ige;
$RegistrationForm =~ s/\%([a-z]+)\%/GetParam($1)/egi;
$RegistrationForm =~ s/\$([a-z]+)\$/$q->span({-class=>'param'}, GetParam($1))
. $q->input({-type=>'hidden', -name=>$1, -value=>GetParam($1)})/ge;
. $q->input({-type=>'hidden', -name=>$1, -value=>GetParam($1)})/eg;
print $RegistrationForm;
print '</div>';
PrintFooter();
@@ -271,9 +271,9 @@ sub DoLogin {
my $id = shift;
print GetHeader('', Ts('Login to %s', $SiteName), '');
print '<div class="content">';
$LoginForm =~ s/\%([a-z]+)\%/GetParam($1)/ge;
$LoginForm =~ s/\%([a-z]+)\%/GetParam($1)/eg;
$LoginForm =~ s/\$([a-z]+)\$/$q->span({-class=>'param'}, GetParam($1))
. $q->input({-type=>'hidden', -name=>$1, -value=>GetParam($1)})/ge;
. $q->input({-type=>'hidden', -name=>$1, -value=>GetParam($1)})/eg;
print $LoginForm;
print '</div>';
PrintFooter();
@@ -290,7 +290,7 @@ sub DoProcessLogin {
ReportError(T('Username and/or password are incorrect.'))
unless (AuthenticateUser($username,$pwd));
unlink($IndexFile);
Unlink($IndexFile);
print GetHeader('', Ts('Register for %s', $SiteName), '');
print '<div class="content">';
print Ts('Logged in as %s.', $username);
@@ -305,9 +305,9 @@ sub DoLogout {
print GetHeader('', Ts('Logout of %s', $SiteName), '');
print '<div class="content">';
print '<p>' . Ts('Logout of %s?',$SiteName) . '</p>';
$LogoutForm =~ s/\%([a-z]+)\%/GetParam($1)/ge;
$LogoutForm =~ s/\%([a-z]+)\%/GetParam($1)/eg;
$LogoutForm =~ s/\$([a-z]+)\$/$q->span({-class=>'param'}, GetParam($1))
. $q->input({-type=>'hidden', -name=>$1, -value=>GetParam($1)})/ge;
. $q->input({-type=>'hidden', -name=>$1, -value=>GetParam($1)})/eg;
print $LogoutForm;
print '</div>';
PrintFooter();
@@ -318,7 +318,7 @@ $Action{process_logout} = \&DoProcessLogout;
sub DoProcessLogout {
SetParam('pwd','');
SetParam('username','');
unlink($IndexFile); # I shouldn't have to do this...
Unlink($IndexFile); # I shouldn't have to do this...
print GetHeader('', Ts('Logged out of %s', $SiteName), '');
print '<div class="content">';
print T('You are now logged out.');
@@ -328,7 +328,7 @@ sub DoProcessLogout {
sub UserExists {
my $username = shift;
if (open (my $PASSWD, '<', $PasswordFile)) {
if (open (my $PASSWD, '<', encode_utf8($PasswordFile))) {
while ( <$PASSWD> ) {
if ($_ =~ /^$username:/) {
return 1;
@@ -338,7 +338,7 @@ sub UserExists {
}
if ($RegistrationsMustBeApproved) {
if (open (my $PASSWD, '<', $PendingPasswordFile)) {
if (open (my $PASSWD, '<', encode_utf8($PendingPasswordFile))) {
while ( <$PASSWD> ) {
if ($_ =~ /^$username:/) {
return 1;
@@ -349,7 +349,7 @@ sub UserExists {
}
if ($ConfirmEmailAddress) {
if (open (my $PASSWD, '<', $UncomfirmedPasswordFile)) {
if (open (my $PASSWD, '<', encode_utf8($UnconfirmedPasswordFile))) {
while ( <$PASSWD> ) {
if ($_ =~ /^$username:/) {
return 1;
@@ -490,14 +490,13 @@ sub ConfirmUser {
my ($username, $key) = @_;
my $FileToUse = $RegistrationsMustBeApproved
? $PendingPasswordFile : $PasswordFileToUse;
if (open(my $PASSWD, '<', $UncomfirmedPasswordFile)) {
if (open(my $PASSWD, '<', encode_utf8($UnconfirmedPasswordFile))) {
while (<$PASSWD>) {
if ($_ =~ /^$username:(.*):(.*)/) {
if (crypt($1,$key) eq $key) {
AddUser($username,$1,$2,$FileToUse);
close $PASSWD;
RemoveUser($username,$UncomfirmedPasswordFile);
RemoveUser($username,$UnconfirmedPasswordFile);
if ($RegistrationsMustBeApproved) {
SendNotification($username);
}
@@ -515,8 +514,7 @@ sub RemoveUser {
my %passwords = ();
my %emails = ();
if (open (my $PASSWD, '<', $FileToUse)) {
if (open (my $PASSWD, '<', encode_utf8($FileToUse))) {
while ( <$PASSWD> ) {
if ($_ =~ /^(.*):(.*):(.*)$/) {
next if ($1 eq $username);
@@ -599,8 +597,7 @@ sub ChangePassword {
my %passwords = ();
my %emails = ();
if (open (my $PASSWD, '<', $PasswordFile)) {
if (open (my $PASSWD, '<', encode_utf8($PasswordFile))) {
while ( <$PASSWD> ) {
if ($_ =~ /^(.*):(.*):(.*)$/) {
$passwords{$1}=$2;
@@ -612,7 +609,7 @@ sub ChangePassword {
$passwords{$user} = $hash;
open (my $PASSWD, '>', $PasswordFile);
open (my $PASSWD, '>', encode_utf8($PasswordFile));
foreach my $key ( sort keys(%passwords)) {
print $PASSWD "$key:$passwords{$key}:$emails{$key}\n";
}
@@ -628,9 +625,9 @@ sub DoReset {
print GetHeader('', Ts('Reset Password for %s', $SiteName), '');
print '<div class="content">';
print '<p>' . T('Reset Password?') . '</p>';
$ResetForm =~ s/\%([a-z]+)\%/GetParam($1)/ge;
$ResetForm =~ s/\%([a-z]+)\%/GetParam($1)/eg;
$ResetForm =~ s/\$([a-z]+)\$/$q->span({-class=>'param'}, GetParam($1))
. $q->input({-type=>'hidden', -name=>$1, -value=>GetParam($1)})/ge;
. $q->input({-type=>'hidden', -name=>$1, -value=>GetParam($1)})/eg;
print $ResetForm;
print '</div>';
PrintFooter();
@@ -652,9 +649,9 @@ sub DoChangePassword {
print GetHeader('', Ts('Change Password for %s', $SiteName), '');
print '<div class="content">';
print '<p>' . T('Change Password?') . '</p>';
$ChangePassForm =~ s/\%([a-z]+)\%/GetParam($1)/ge;
$ChangePassForm =~ s/\%([a-z]+)\%/GetParam($1)/eg;
$ChangePassForm =~ s/\$([a-z]+)\$/$q->span({-class=>'param'}, GetParam($1))
. $q->input({-type=>'hidden', -name=>$1, -value=>GetParam($1)})/ge;
. $q->input({-type=>'hidden', -name=>$1, -value=>GetParam($1)})/eg;
print $ChangePassForm;
print '</div>';
PrintFooter();
@@ -719,7 +716,7 @@ sub DoApprovePending {
}
} else {
print '<ul>';
if (open(my $PASSWD, '<', $PendingPasswordFile)) {
if (open(my $PASSWD, '<', encode_utf8($PendingPasswordFile))) {
while (<$PASSWD>) {
if ($_ =~ /^(.*):(.*):(.*)$/) {
print '<li>' . ScriptLink("action=approve_pending;user=$1;",$1) . ' - ' . $3 . '</li>';
@@ -740,8 +737,7 @@ sub DoApprovePending {
sub ApproveUser {
my ($username) = @_;
if (open(my $PASSWD, '<', $PendingPasswordFile)) {
if (open(my $PASSWD, '<', encode_utf8($PendingPasswordFile))) {
while (<$PASSWD>) {
if ($_ =~ /^$username:(.*):(.*)/) {
AddUser($username,$1,$2,$PasswordFile);


@@ -118,7 +118,7 @@ sub DoLogout {
print
GetHeader('', Ts('Logged out of %s', $SiteName), '') .
$q->div({-class=> 'content'}, $q->p(T('You are now logged out.'), $id ? $q->p(ScriptLink('action=browse;id=' . UrlEncode($id) . '&time=' . time, T('Return to ' . NormalToFree($id)))) : ''));
$q->div({-class=> 'content'}, $q->p(T('You are now logged out.'), $id ? $q->p(ScriptLink('action=browse;id=' . UrlEncode($id) . '&time=' . time, Ts('Return to %s', NormalToFree($id)))) : ''));
PrintFooter();
}
@@ -201,10 +201,10 @@ sub CookieUsernameFix {
# Only valid usernames get stored in the new cookie.
my $name = GetParam('username', '');
if (!$name) { }
elsif (!$FreeLinks && !($name =~ /^$LinkPattern$/o)) {
elsif (!$FreeLinks && !($name =~ /^$LinkPattern$/)) {
CookieUsernameFixDelete(Ts('Invalid UserName %s: not saved.', $name));
}
elsif ($FreeLinks && (!($name =~ /^$FreeLinkPattern$/o))) {
elsif ($FreeLinks && (!($name =~ /^$FreeLinkPattern$/))) {
CookieUsernameFixDelete(Ts('Invalid UserName %s: not saved.', $name));
}
elsif (length($name) > 50) { # Too long


@@ -56,14 +56,12 @@ sub MacFixEncoding {
return unless %Namespaces;
my %hash = ();
for my $key (keys %Namespaces) {
utf8::decode($key);
$key = NFC($key);
$hash{$key} = $NamespaceRoot . '/' . $key . '/';
}
%Namespaces = %hash;
%hash = ();
for my $key (keys %InterSite) {
utf8::decode($key);
$key = NFC($key);
$hash{$key} = $Namespaces{$key} if $Namespaces{$key};
}


@@ -84,7 +84,7 @@ sub MailNewInitCookie {
$q->delete('mail');
if (!$mail) {
# do nothing
} elsif (!($mail =~ /$MailPattern/o)) {
} elsif (!($mail =~ /$MailPattern/)) {
$Message .= $q->p(Ts('Invalid Mail %s: not saved.', $mail));
} else {
SetParam('mail', $mail);
@@ -106,7 +106,7 @@ sub MailFormAddition {
. ScriptLink("action=subscribe;pages=$id", T('subscribe'), 'subscribe');
}
$addition = $q->span({-class=>'mail'},
$q->label({-for=>'mail'}, T('Email: '))
$q->label({-for=>'mail'}, T('Email:') . ' ')
. ' ' . $q->textfield(-name=>'mail', -id=>'mail',
-default=>GetParam('mail', ''))
. $addition);
@@ -120,7 +120,7 @@ sub MailIsSubscribed {
return 0 unless $mail;
# open the DB file
require DB_File;
tie my %h, "DB_File", $MailFile;
tie my %h, "DB_File", encode_utf8($MailFile);
my %subscribers = map {$_=>1} split(/$FS/, UrlDecode($h{UrlEncode($id)}));
untie %h;
return $subscribers{$mail};
@@ -197,7 +197,7 @@ sub NewMailDeletePage {
sub MailDeletePage {
my $id = shift;
require DB_File;
tie my %h, "DB_File", $MailFile;
tie my %h, "DB_File", encode_utf8($MailFile);
foreach my $mail (split(/$FS/, UrlDecode(delete $h{UrlEncode($id)}))) {
my %subscriptions = map {$_=>1} split(/$FS/, UrlDecode($h{UrlEncode($mail)}));
delete $subscriptions{$id};
@@ -274,7 +274,7 @@ sub MailSubscription {
my $mail = shift;
return unless $mail;
require DB_File;
tie my %h, "DB_File", $MailFile;
tie my %h, "DB_File", encode_utf8($MailFile);
my @result = split(/$FS/, UrlDecode($h{UrlEncode($mail)}));
untie %h;
@result = sort @result;
@@ -303,16 +303,15 @@ sub DoMailSubscriptionList {
'<ul>';
}
require DB_File;
tie my %h, "DB_File", $MailFile;
tie my %h, "DB_File", encode_utf8($MailFile);
foreach my $encodedkey (sort keys %h) {
my @values = sort split(/$FS/, UrlDecode($h{$encodedkey}));
my $key = UrlDecode($encodedkey);
if ($raw) {
print join(' ', $key, @values) . "\n";
} else {
print $q->li(Ts('%s: ', MailLink($key, @values)),
join(' ', map { MailLink($_, $key) } @values));
print $q->li(Ts('%s:', MailLink($key, @values)) . ' '
. join(' ', map { MailLink($_, $key) } @values));
}
}
print '</ul></div>' unless $raw;
@@ -383,7 +382,7 @@ sub MailSubscribe {
return unless $mail and @pages;
# open the DB file
require DB_File;
tie my %h, "DB_File", $MailFile;
tie my %h, "DB_File", encode_utf8($MailFile);
# add to the mail entry
my %subscriptions = map {$_=>1} split(/$FS/, UrlDecode($h{UrlEncode($mail)}));
for my $id (@pages) {
@@ -442,7 +441,7 @@ sub MailUnsubscribe {
my ($mail, @pages) = @_;
return unless $mail and @pages;
require DB_File;
tie my %h, "DB_File", $MailFile;
tie my %h, "DB_File", encode_utf8($MailFile);
my %subscriptions = map {$_=>1} split(/$FS/, UrlDecode($h{UrlEncode($mail)}));
foreach my $id (@pages) {
delete $subscriptions{$id};
@@ -481,8 +480,7 @@ sub DoMailMigration {
$q->start_div({-class=>'content mailmigrate'});
require DB_File;
tie my %h, "DB_File", $MailFile;
tie my %h, "DB_File", encode_utf8($MailFile);
my $found = 0;
foreach my $key (keys %h) {
if (index($key, '@') != -1) {

View File

@@ -45,14 +45,14 @@ sub MarkdownRule {
. AddHtmlEnvironment("p");
}
# setext headers
elsif ($bol and m/\G((\s*\n)*(.+?)[ \t]*\n(-+|=+)[ \t]*\n)/gc) {
elsif ($bol and m/\G((\s*\n)*(.+?)[ \t]*\n(-+|=+)[ \t]*\n)/cg) {
return CloseHtmlEnvironments()
. (substr($4,0,1) eq '=' ? $q->h2($3) : $q->h3($3))
. AddHtmlEnvironment('p');
}
# > blockquote
# with continuation
elsif ($bol and m/\G&gt;/gc) {
elsif ($bol and m/\G&gt;/cg) {
return CloseHtmlEnvironments()
. AddHtmlEnvironment('blockquote');
}
@@ -117,20 +117,20 @@ sub MarkdownRule {
. AddHtmlEnvironment('td');
}
# whitespace indentation = code
elsif ($bol and m/\G(\s*\n)*( .+)\n?/gc) {
elsif ($bol and m/\G(\s*\n)*( .+)\n?/cg) {
my $str = substr($2, 4);
while (m/\G( .*)\n?/gc) {
while (m/\G( .*)\n?/cg) {
$str .= "\n" . substr($1, 4);
}
return OpenHtmlEnvironment('pre',1) . $str; # always level 1
}
# ``` = code
elsif ($bol and m/\G```[ \t]*\n(.*?)\n```[ \t]*(\n|$)/gcs) {
elsif ($bol and m/\G```[ \t]*\n(.*?)\n```[ \t]*(\n|$)/cgs) {
return CloseHtmlEnvironments() . $q->pre($1)
. AddHtmlEnvironment("p");
}
# [an example](http://example.com/ "Title")
elsif (m/\G\[(.+?)\]\($FullUrlPattern(\s+"(.+?)")?\)/goc) {
elsif (m/\G\[(.+?)\]\($FullUrlPattern(\s+"(.+?)")?\)/cg) {
my ($text, $url, $title) = ($1, $2, $4);
$url =~ /^($UrlProtocols)/;
my %params;

View File

@@ -142,18 +142,18 @@ sub MarkupTag {
}
sub MarkupRule {
if ($bol and %MarkupLines and m/$markup_lines_re/gc) {
if ($bol and %MarkupLines and m/$markup_lines_re/cg) {
my ($tag, $str) = ($1, $2);
$str = $q->span($tag) . $str;
while (m/$markup_lines_re/gc) {
while (m/$markup_lines_re/cg) {
$str .= $q->span($1) . $2;
}
return CloseHtmlEnvironments()
. MarkupTag($MarkupLines{UnquoteHtml($tag)}, $str)
. AddHtmlEnvironment('p');
} elsif (%MarkupSingles and m/$markup_singles_re/gc) {
} elsif (%MarkupSingles and m/$markup_singles_re/cg) {
return $MarkupSingles{UnquoteHtml($1)};
} elsif (%MarkupForcedPairs and m/$markup_forced_pairs_re/gc) {
} elsif (%MarkupForcedPairs and m/$markup_forced_pairs_re/cg) {
my $tag = $1;
my $start = $tag;
my $end = $tag;
@@ -168,20 +168,20 @@ sub MarkupRule {
$endre .= '[ \t]*\n?' if $block_element{$start}; # skip trailing whitespace if block
# may match the empty string, or multiple lines, but may not span
# paragraphs.
if ($endre and m/\G$endre/gc) {
if ($endre and m/\G$endre/cg) {
return $tag . $end;
} elsif ($tag eq $end && m/\G((:?.+?\n)*?.+?)$endre/gc) { # may not span paragraphs
} elsif ($tag eq $end && m/\G((:?.+?\n)*?.+?)$endre/cg) { # may not span paragraphs
return MarkupTag($data, $1);
} elsif ($tag ne $end && m/\G((:?.|\n)+?)$endre/gc) {
} elsif ($tag ne $end && m/\G((:?.|\n)+?)$endre/cg) {
return MarkupTag($data, $1);
} else {
return $tag;
}
} elsif (%MarkupPairs and m/$markup_pairs_re/gc) {
} elsif (%MarkupPairs and m/$markup_pairs_re/cg) {
return MarkupTag($MarkupPairs{UnquoteHtml($1)}, $2);
} elsif ($MarkupPairs{'/'} and m|\G~/|gc) {
} elsif ($MarkupPairs{'/'} and m|\G~/|cg) {
return '~/'; # fix ~/elisp/ example
} elsif ($MarkupPairs{'/'} and m|\G(/[-A-Za-z0-9\x{0080}-\x{fffd}/]+/$words/)|gc) {
} elsif ($MarkupPairs{'/'} and m|\G(/[-A-Za-z0-9\x{0080}-\x{fffd}/]+/$words/)|cg) {
return $1; # fix /usr/share/lib/! example
}
# "foo

View File

@@ -54,10 +54,10 @@ sub BisectAction {
sub BisectInitialScreen {
print GetFormStart(undef, 'get', 'bisect');
print GetHiddenValue('action', 'bisect');
my @disabledFiles = bsd_glob("$ModuleDir/*.p[ml].disabled");
my @disabledFiles = Glob("$ModuleDir/*.p[ml].disabled");
if (@disabledFiles == 0) {
print T('Test / Always enabled / Always disabled'), $q->br();
my @files = bsd_glob("$ModuleDir/*.p[ml]");
my @files = Glob("$ModuleDir/*.p[ml]");
for (my $i = 0; $i < @files; $i++) {
my $moduleName = fileparse($files[$i]);
my @disabled = ($moduleName eq 'module-bisect.pl' ? (-disabled=>'disabled') : ());
@@ -68,7 +68,7 @@ sub BisectInitialScreen {
}
print $q->submit(-name=>'bad', -value=>T('Start'));
} else {
print T('Biscecting proccess is already active.'), $q->br();
print T('Bisecting process is already active.'), $q->br();
print $q->submit(-name=>'stop', -value=>T('Stop'));
}
print $q->end_form();
@@ -78,7 +78,7 @@ sub BisectProcess {
my ($isGood) = @_;
my $parameterHandover = '';
BisectEnableAll();
my @files = bsd_glob("$ModuleDir/*.p[ml]");
my @files = Glob("$ModuleDir/*.p[ml]");
for (my $i = @files - 1; $i >= 0; $i--) { # handle user choices
if (GetParam("m$i") eq 'on') {
$parameterHandover .= GetHiddenValue("m$i", GetParam("m$i"));
@@ -102,7 +102,7 @@ sub BisectProcess {
print $q->end_form();
return;
}
print T('Module count (only testable modules): '), $q->strong(scalar @files), $q->br();
print T('Module count (only testable modules):'), ' ', $q->strong(scalar @files), $q->br();
print $q->br(), T('Current module statuses:'), $q->br();
my $halfsize = ($end - $start + 1) / 2.0; # + 1 because it is count
$end -= int($halfsize) unless $isGood;
@@ -131,7 +131,7 @@ sub BisectProcess {
}
sub BisectEnableAll {
for (bsd_glob("$ModuleDir/*.p[ml].disabled")) { # reenable all modules
for (Glob("$ModuleDir/*.p[ml].disabled")) { # reenable all modules
my $oldName = $_;
s/\.disabled$//;
print Ts('Enabling %s', (fileparse($_))[0]), '...', $q->br() if $_[0];

View File

@@ -40,8 +40,8 @@ sub ModuleUpdaterAction {
if (GetParam('ok')) {
ModuleUpdaterApply();
} else {
unlink bsd_glob("$TempDir/*.p[ml]"); # XXX is it correct to use $TempDir for such stuff? What if something else puts .pm files there?
for (bsd_glob("$ModuleDir/*.p[ml]")) {
Unlink(Glob("$TempDir/*.p[ml]")); # XXX is it correct to use $TempDir for such stuff? What if something else puts .pm files there?
for (Glob("$ModuleDir/*.p[ml]")) {
my $curModule = fileparse($_);
ProcessModule($curModule);
}
@@ -58,7 +58,7 @@ sub ModuleUpdaterAction {
}
sub ModuleUpdaterApply {
for (bsd_glob("$TempDir/*.p[ml]")) {
for (Glob("$TempDir/*.p[ml]")) {
my $moduleName = fileparse($_);
if (move($_, "$ModuleDir/$moduleName")) {
print $q->strong("Module $moduleName updated successfully!"), $q->br();
@@ -66,7 +66,7 @@ sub ModuleUpdaterApply {
print $q->strong("Unable to replace module $moduleName: $!"), $q->br();
}
}
unlink bsd_glob("$TempDir/*.p[ml]"); # XXX same as above
Unlink(Glob("$TempDir/*.p[ml]")); # XXX same as above
print $q->br(), $q->strong('Done!');
}
@@ -81,15 +81,14 @@ sub ProcessModule {
. ' If this is your own module, please contribute it to Oddmuse!'), $q->br();
return;
}
open my $fh, ">", "$TempDir/$module" or die("Could not open file. $!");
utf8::encode($moduleData);
open my $fh, ">:utf8", encode_utf8("$TempDir/$module") or die("Could not open file $TempDir/$module: $!");
print $fh $moduleData;
close $fh;
my $diff = DoModuleDiff("$ModuleDir/$module", "$TempDir/$module");
if (not $diff) {
print $q->strong('This module is up to date, there is no need to update it.'), $q->br();
unlink "$TempDir/$module";
Unlink("$TempDir/$module");
return;
}
print $q->strong('There is a newer version of this module. Here is a diff:'), $q->br();
@@ -109,7 +108,5 @@ sub ProcessModule {
}
sub DoModuleDiff {
my $diff = `diff -U 3 -- \Q$_[0]\E \Q$_[1]\E`;
utf8::decode($diff); # needs decoding
return $diff;
decode_utf8(`diff -U 3 -- \Q$_[0]\E \Q$_[1]\E`);
}

View File

@@ -64,13 +64,13 @@ sub MoinListLevel {
sub MoinRule {
# ["free link"]
if (m/\G(\["(.*?)"\])/gcs) {
if (m/\G(\["(.*?)"\])/cgs) {
Dirty($1);
print GetPageOrEditLink($2);
return '';
}
# [[BR]]
elsif (m/\G\[\[BR\]\]/gc) {
elsif (m/\G\[\[BR\]\]/cg) {
return $q->br();
}
# {{{

View File

@@ -46,13 +46,13 @@ sub NewMultiUrlBannedContent {
sub MultiUrlBannedContent {
my $str = shift;
my @urls = $str =~ /$FullUrlPattern/go;
my @urls = $str =~ /$FullUrlPattern/g;
my %domains;
my %whitelist;
my $max = 0;
my $label = '[a-z]([a-z0-9-]*[a-z0-9])?'; # RFC 1034
foreach (split(/\n/, GetPageContent($MultiUrlWhiteList))) {
next unless m/^\s*($label\.$label)/io;
next unless m/^\s*($label\.$label)/i;
$whitelist{$1} = 1;
}
foreach my $url (@urls) {

View File

@@ -85,9 +85,8 @@ sub NamespacesInitVariables {
# Do this before changing the $DataDir and $ScriptName
if ($UsePathInfo) {
$Namespaces{$NamespacesMain} = $ScriptName . '/';
foreach my $name (bsd_glob("$DataDir/*")) {
utf8::decode($name);
if (-d $name
foreach my $name (Glob("$DataDir/*")) {
if (IsDir($name)
and $name =~ m|/($InterSitePattern)$|
and $name ne $NamespacesMain
and $name ne $NamespacesSelf) {
@@ -99,8 +98,7 @@ sub NamespacesInitVariables {
$NamespaceCurrent = '';
my $ns = GetParam('ns', '');
if (not $ns and $UsePathInfo) {
my $path_info = $q->path_info();
utf8::decode($path_info);
my $path_info = decode_utf8($q->path_info());
# make sure ordinary page names are not matched!
if ($path_info =~ m|^/($InterSitePattern)(/.*)?|
and ($2 or $q->keywords or NamespaceRequiredByParameter())) {
@@ -137,13 +135,8 @@ sub NamespacesInitVariables {
$StaticUrl .= UrlEncode($NamespaceCurrent) . '/'
if substr($StaticUrl,-1) eq '/'; # from static-copy.pl
$WikiDescription .= "<p>Current namespace: $NamespaceCurrent</p>";
# override LastUpdate
my ($dev,$ino,$mode,$nlink,$uid,$gid,$rdev,$size, $atime,$mtime,$ctime,$blksize,$blocks)
= stat($IndexFile);
$LastUpdate = $mtime;
CreateDir($DataDir); # Create directory if it doesn't exist
ReportError(Ts('Cannot create %s', $DataDir) . ": $!", '500 INTERNAL SERVER ERROR')
unless -d $DataDir;
$LastUpdate = Modified($IndexFile);
CreateDir($DataDir);
}
$Namespaces{$NamespacesSelf} = $ScriptName . '?';
# reinitialize
@@ -224,19 +217,19 @@ sub NewNamespaceGetRcLines { # starttime, hash of seen pages to use as a second
# opening a rcfile, compare the first timestamp with the
# starttime. If any rcfile exists with no timestamp before the
# starttime, we need to open its rcoldfile.
foreach my $file (@rcfiles) {
open(my $F, '<:encoding(UTF-8)', $file);
foreach my $rcfile (@rcfiles) {
open(my $F, '<:encoding(UTF-8)', encode_utf8($rcfile));
my $line = <$F>;
my ($ts) = split(/$FS/o, $line); # the first timestamp in the regular rc file
my ($ts) = split(/$FS/, $line); # the first timestamp in the regular rc file
my @new;
if (not $ts or $ts > $starttime) { # we need to read the old rc file, too
push(@new, GetRcLinesFor($rcoldfiles{$file}, $starttime,\%match, \%following));
push(@new, GetRcLinesFor($rcoldfiles{$rcfile}, $starttime,\%match, \%following));
}
push(@new, GetRcLinesFor($file, $starttime, \%match, \%following));
push(@new, GetRcLinesFor($rcfile, $starttime, \%match, \%following));
# strip rollbacks in each namespace separately
@new = StripRollbacks(@new);
# prepend the namespace to both pagename and author
my $ns = $namespaces{$file};
my $ns = $namespaces{$rcfile};
if ($ns) {
for (my $i = $#new; $i >= 0; $i--) {
# page id
@@ -381,7 +374,8 @@ sub NewNamespaceBrowsePage {
#REDIRECT into different namespaces
my ($id, $raw, $comment, $status) = @_;
OpenPage($id);
my ($text, $revision) = GetTextRevision(GetParam('revision', ''), 1);
my ($revisionPage, $revision) = GetTextRevision(GetParam('revision', ''), 1);
my $text = $revisionPage->{text};
my $oldId = GetParam('oldid', '');
if (not $oldId and not $revision and (substr($text, 0, 10) eq '#REDIRECT ')
and (($WikiLinks and $text =~ /^\#REDIRECT\s+(($InterSitePattern:)?$InterLinkPattern)/)
@@ -440,8 +434,6 @@ sub NamespacesNewGetId {
# In this case GetId() will have set the parameter Test to 1.
# http://example.org/cgi-bin/wiki.pl/Test?rollback-1234=foo
# This doesn't set the Test parameter.
if ($UsePathInfo and $id eq $NamespaceCurrent and not GetParam($id) and not GetParam('ns')) {
$id = undef;
}
return if $id and $UsePathInfo and $id eq $NamespaceCurrent and not GetParam($id) and not GetParam('ns');
return $id;
}

View File

@@ -18,7 +18,7 @@ use v5.10;
AddModuleDescription('near-links.pl', 'Near Links');
our ($q, %AdminPages, %InterSite, $CommentsPrefix, $DataDir, $UseCache, @MyFooters, @MyMaintenance, @MyInitVariables, @Debugging, $InterSitePattern, @UserGotoBarPages, @IndexOptions);
our ($q, $Now, %AdminPages, %InterSite, $CommentsPrefix, $DataDir, $UseCache, @MyFooters, @MyMaintenance, @MyInitVariables, @Debugging, $InterSitePattern, @UserGotoBarPages, @IndexOptions);
=head1 Near Links
@@ -128,7 +128,8 @@ sub NearLinksMaintenance {
# skip if less than 12h old and caching allowed (the default)
foreach my $site (keys %NearSite) {
next if GetParam('cache', $UseCache) > 0
and -f "$NearDir/$site" and -M "$NearDir/$site" < 0.5;
and IsFile("$NearDir/$site")
and $Now - Modified("$NearDir/$site") < 0.5;
print $q->p(Ts('Getting page index file for %s.', $site));
my $data = GetRaw($NearSite{$site});
print $q->p($q->strong(Ts('%s returned no data, or LWP::UserAgent is not available.',
@@ -185,7 +186,7 @@ sub NewNearLinksResolveId {
my $id = shift;
my @result = OldNearLinksResolveId($id, @_);
my %forbidden = map { $_ => 1 } @UserGotoBarPages, %AdminPages;
$forbidden{$id} = 1 if $CommentsPrefix and $id =~ /^$CommentsPrefix/o;
$forbidden{$id} = 1 if $CommentsPrefix and $id =~ /^$CommentsPrefix/;
if (not $result[1] and $NearSource{$id} and not $forbidden{$id}) {
$NearLinksUsed{$id} = 1;
my $site = $NearSource{$id}[0];
@@ -264,18 +265,18 @@ sub SearchNearPages {
if (%NearSearch and GetParam('near', 1) > 1 and GetParam('context',1)) {
foreach my $site (keys %NearSearch) {
my $url = $NearSearch{$site};
$url =~ s/\%s/UrlEncode($string)/ge or $url .= UrlEncode($string);
$url =~ s/\%s/UrlEncode($string)/eg or $url .= UrlEncode($string);
print $q->hr(), $q->p(Ts('Fetching results from %s:', $q->a({-href=>$url}, $site)))
unless GetParam('raw', 0);
my $data = GetRaw($url);
my @entries = split(/\n\n+/, $data);
shift @entries; # skip head
foreach my $entry (@entries) {
my %entry = ParseData($entry); # need to pass reference
my $name = $entry{title};
my $entryPage = ParseData($entry); # need to pass reference
my $name = $entryPage->{title};
next if $found{$name}; # do not duplicate local pages
$found{$name} = 1;
PrintSearchResultEntry(\%entry, $regex); # with context and full search!
PrintSearchResultEntry($entryPage, $regex); # with context and full search!
}
}
}

View File

@@ -24,8 +24,8 @@ our ($q, @MyRules, $FullUrlPattern, $UrlProtocols, $BracketText);
push(@MyRules, \&NewWindowLink);
sub NewWindowLink {
# compare sub LinkRules in oddmuse.pl
if ($BracketText && m/\G(\[new:$FullUrlPattern\s+([^\]]+?)\])/cog
or m/\G(\[new:$FullUrlPattern\])/cog) {
if ($BracketText && m/\G(\[new:$FullUrlPattern\s+([^\]]+?)\])/cg
or m/\G(\[new:$FullUrlPattern\])/cg) {
my ($url, $text) = ($2, $3);
$url =~ /^($UrlProtocols)/;
my $class = "url $1"; # get protocol (http, ftp, ...)

View File

@@ -25,7 +25,7 @@ sub NewGetSearchLink {
my ($text, $class, $name, $title) = @_;
$name = UrlEncode($name);
$text =~ s/_/ /g;
return $q->span({-class=>$class }, $text);
return $q->span({-class=>$class}, $text);
}
push(@MyAdminCode, \&BacklinksMenu);
@@ -34,8 +34,8 @@ sub BacklinksMenu {
if ($id) {
my $text = T('Backlinks');
my $class = 'backlinks';
my $name = "backlinks";
my $title = T("Click to search for references to this page");
my $name = 'backlinks';
my $title = T('Click to search for references to this page');
my $link = ScriptLink('search=' . $id, $text, $class, $name, $title);
push(@$menuref, $link);
}

View File

@@ -35,7 +35,7 @@ $Action{clearcache} = \&DoClearCache;
sub DoClearCache {
print GetHeader('', QuoteHtml(T('Clearing Cache')), '');
unlink(bsd_glob("$NotFoundHandlerDir/*"));
Unlink(Glob("$NotFoundHandlerDir/*"));
print $q->p(T('Done.'));
PrintFooter();
}
@@ -45,7 +45,7 @@ sub DoClearCache {
sub ReadLinkDb {
return if $LinkDbInit;
$LinkDbInit = 1;
return if not -f $LinkFile;
return if not IsFile($LinkFile);
my $data = ReadFileOrDie($LinkFile);
map { my ($id, @links) = split; $LinkDb{$id} = \@links } split(/\n/, $data);
}
@@ -101,13 +101,13 @@ sub NewNotFoundHandlerSave {
my $id = $args[0];
OldNotFoundHandlerSave(@args);
RefreshLinkDb(); # for the open page
if (not -d $NotFoundHandlerDir) {
mkdir($NotFoundHandlerDir);
if (not IsDir($NotFoundHandlerDir)) {
CreateDir($NotFoundHandlerDir);
} elsif ($Page{revision} == 1) {
NotFoundHandlerCacheUpdate($id);
} else {
# unlink PageName, PageName.en, PageName.de, etc.
unlink("$NotFoundHandlerDir/$id", bsd_glob("$NotFoundHandlerDir/$id.[a-z][a-z]"));
Unlink("$NotFoundHandlerDir/$id", Glob("$NotFoundHandlerDir/$id.[a-z][a-z]"));
}
}
@@ -132,7 +132,7 @@ sub NotFoundHandlerCacheUpdate {
foreach my $source (keys %LinkDb) {
warn "Examining $source\n";
if (grep(/$target/, @{$LinkDb{$source}})) {
unlink("$NotFoundHandlerDir/$source", bsd_glob("$NotFoundHandlerDir/$source.[a-z][a-z]"));
Unlink("$NotFoundHandlerDir/$source", Glob("$NotFoundHandlerDir/$source.[a-z][a-z]"));
warn "Unlinking $source\n";
}
}

View File

@@ -25,8 +25,8 @@ push(@MyRules, \&NumberedListRule);
sub NumberedListRule {
# numbered lists using # copied from usemod.pl but allow leading
# whitespace
if ($bol && m/\G(\s*\n)*\s*(\#+)[ \t]/cog
or InElement('li') && m/\G(\s*\n)+\s*(\#+)[ \t]/cog) {
if ($bol && m/\G(\s*\n)*\s*(\#+)[ \t]/cg
or InElement('li') && m/\G(\s*\n)+\s*(\#+)[ \t]/cg) {
return CloseHtmlEnvironmentUntil('li')
. OpenHtmlEnvironment('ol',length($2))
. AddHtmlEnvironment('li');

View File

@@ -78,7 +78,7 @@ sub DoManifest {
# print ScriptUrl($id) . "\n" unless $IndexHash{$id};
# }
# External CSS
print $StyleSheet . "\n" if $StyleSheet;
print ref $StyleSheet ? join("\n", @$StyleSheet) . "\n" : "$StyleSheet\n" if $StyleSheet;
# FIXME: $StyleSheetPage
# FIXME: external images, stuff in $HtmlHeaders
# Error message all the stuff that's not available offline.

View File

@@ -99,9 +99,9 @@ sub LocalMapWorkHorse {
my $retval_children = '';
if ($depth > 0) {
my %data = ParseData(ReadFileOrDie(GetPageFile($id)));
my @flags = split(/$FS/, $data{'flags'});
my @blocks = split(/$FS/, $data{'blocks'});
my $data = ParseData(ReadFileOrDie(GetPageFile($id)));
my @flags = split(/$FS/, $data->{'flags'});
my @blocks = split(/$FS/, $data->{'blocks'});
my @subpages;
# Iterate over blocks, operate only on "dirty" ones
@@ -111,14 +111,14 @@ sub LocalMapWorkHorse {
local $_ = $blocks[$i];
if ($WikiLinks
&& ($BracketWiki && m/\G(\[$LinkPattern\s+([^\]]+?)\])/cog
or m/\G(\[$LinkPattern\])/cog or m/\G($LinkPattern)/cog)) {
&& ($BracketWiki && m/\G(\[$LinkPattern\s+([^\]]+?)\])/cg
or m/\G(\[$LinkPattern\])/cg or m/\G($LinkPattern)/cg)) {
$sub_id = $1;
} elsif ($FreeLinks
&& (($BracketWiki
&& m/\G(\[\[($FreeLinkPattern)\|([^\]]+)\]\])/cog)
or m/\G(\[\[\[($FreeLinkPattern)\]\]\])/cog
or m/\G(\[\[($FreeLinkPattern)\]\])/cog)) {
&& m/\G(\[\[($FreeLinkPattern)\|([^\]]+)\]\])/cg)
or m/\G(\[\[\[($FreeLinkPattern)\]\]\])/cg
or m/\G(\[\[($FreeLinkPattern)\]\])/cg)) {
$sub_id = $2;
}

View File

@@ -27,7 +27,7 @@ my $org_emph_re = qr!\G([ \t('\"])*(([*/_=+])([^ \t\r\n,*/_=+].*?(?:\n.*?){0,1}[
my %org_emphasis_alist = qw!* b / i _ u = code + del!;
sub OrgModeRule {
if (/$org_emph_re/cgo) {
if (/$org_emph_re/cg) {
my $tag = $org_emphasis_alist{$3};
return "$1<$tag>$4</$tag>$5";
}

View File

@@ -54,7 +54,7 @@ sub UpdatePageTrail {
sub NewPageTrailGetGotoBar {
my $bar = OldPageTrailGetGotoBar(@_);
$bar .= $q->span({-class=>'trail'}, $q->br(), T('Trail: '),
$bar .= $q->span({-class=>'trail'}, $q->br(), T('Trail:') . ' ',
map { GetPageLink($_) } reverse(@PageTrail))
if @PageTrail;
return $bar;

View File

@@ -28,7 +28,7 @@ push(@MyRules, \&ParagraphLinkRule);
$RuleOrder{\&ParagraphLinkRule} = 100;
sub ParagraphLinkRule {
if ($bol && m/\G(\[(-)?$FreeLinkPattern\])/cog) {
if ($bol && m/\G(\[(-)?$FreeLinkPattern\])/cg) {
Dirty($1);
my $invisible = $2;
my $orig = $3;

View File

@@ -27,7 +27,7 @@ our ($q, %Page, @MyRules, $CommentsPrefix);
push(@MyRules, \&PartialCutRule);
sub PartialCutRule {
if (m/\G(?<=\n)\s*--\s*cut\s*--\s*(?=\n)/gc) {
if (m/\G(?<=\n)\s*--\s*cut\s*--\s*(?=\n)/cg) {
return CloseHtmlEnvironments() . '<hr class="cut" />' . AddHtmlEnvironment('p');
}
return;

Some files were not shown because too many files have changed in this diff.