Compare commits


149 Commits

Author SHA1 Message Date
Aleks-Daniel Jakimenko-Aleksejev
ac69cc6612 Unicode icons
$UnicodeIcons option and a couple of initial icons.
2015-11-01 06:57:39 +02:00
Alex Schroeder
c64095fd95 scripts: renamed two scripts 2015-10-31 15:15:36 +01:00
Aleks-Daniel Jakimenko-Aleksejev
542f552002 nosearch.pl: code style
It seems like oddtrans does not pick up the strings with double quotes?
It's weird, but it does not matter anyway, because we will switch to gettext
sooner or later.
2015-10-31 02:58:00 +02:00
Aleks-Daniel Jakimenko-Aleksejev
22017a24f2 Updates to Russian translation (70% → 85%) 2015-10-31 02:56:39 +02:00
Alex Schroeder
6ac7093e9f word-count.pl: new 2015-10-30 23:39:52 +01:00
Aleks-Daniel Jakimenko-Aleksejev
89a23a6ac5 Full support for arrayref in $StyleSheet 2015-10-26 01:03:42 +02:00
Aleks-Daniel Jakimenko-Aleksejev
ed17476aeb Fix the number of tests in css.t 2015-10-26 00:34:57 +02:00
Aleks-Daniel Jakimenko-Aleksejev
1b951c66f1 Allow multiple stylesheet files in $StyleSheet
Since $StyleSheet is a scalar, you can't pass multiple values, but you can
now set it to a single array ref. For example:

$StyleSheet = ['http://example.org/test.css', 'http://example.org/another.css'];
2015-10-26 00:27:14 +02:00
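A minimal sketch of how code consuming $StyleSheet might handle both forms (an illustration only, not the actual wiki.pl implementation; the emitted `<link>` markup is assumed):

```perl
use strict;
use warnings;

# Hypothetical sketch: accept either a plain scalar or an array ref in
# $StyleSheet and emit one <link> element per stylesheet.
my $StyleSheet = ['http://example.org/test.css', 'http://example.org/another.css'];

my @sheets = ref $StyleSheet eq 'ARRAY' ? @$StyleSheet : ($StyleSheet);
print qq{<link type="text/css" rel="stylesheet" href="$_" />\n} for @sheets;
```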
Aleks-Daniel Jakimenko-Aleksejev
878d99a84c New script translations-stats
Basically copied from the README file. Now you can use it instead of pasting
very long lines into your terminal.
2015-10-23 00:55:21 +03:00
Aleks-Daniel Jakimenko-Aleksejev
c5ec3d782c oddtrans: print # on the last line
Otherwise we will get an odd number of elements if the last string has no
translation (it seems like Perl trims trailing empty lines).

Also: strictures and formatting
2015-10-23 00:25:38 +03:00
Aleks-Daniel Jakimenko-Aleksejev
a28276b868 More fixes for Spanish translation 2015-10-21 16:57:27 +03:00
Aleks-Daniel Jakimenko-Aleksejev
a8920bfec1 Some fixes for Spanish translation 2015-10-21 16:06:13 +03:00
Aleks-Daniel Jakimenko-Aleksejev
0e5f338b40 Prioritize slow tests
By using 「--state=slow,save」 we can probably crunch all tests faster (better
wallclock time). Some tests are taking a lot of time simply because of the
delays (sleeping), so it makes sense to start these tests earlier.
2015-10-21 13:15:34 +03:00
Aleks-Daniel Jakimenko-Aleksejev
4a7e50e83e Fix tests ($ShowAll)
$ShowAll was not added to 「our」.
2015-10-21 12:15:25 +03:00
Aleks-Daniel Jakimenko-Aleksejev
608440553b Alexine repo is different now 2015-10-21 06:00:57 +03:00
Aleks-Daniel Jakimenko-Aleksejev
e493652e96 Alexine updates according to the new directory structure
Also, use 8 threads for 「make test」
2015-10-21 05:39:05 +03:00
Aleks-Daniel Jakimenko-Aleksejev
d286267d52 translations/README deleted
This file has to be regenerated periodically, Alexine will do that.
Also, these long one-liners have to be separated into scripts.
I am deleting this file because I'm not willing to update it.
2015-10-21 05:19:56 +03:00
Alex Schroeder
27c5c5fa79 Translation: fix 'non existent' for Spanish, too 2015-10-20 17:31:42 +02:00
Alex Schroeder
9f7cd0bfc7 Translation: 'non existant' to 'nonexisting' 2015-10-20 17:19:02 +02:00
Matias A. Fonzo
6a45d51189 Updates to Spanish translation
This is a general revision of the current file and a massive upgrade
since the last update (2002). Some sentences have been formalized and
clarified, corrections have been made, and the translation is now
complete.
2015-10-20 16:51:28 +02:00
Aleks-Daniel Jakimenko-Aleksejev
5d99cb5874 Typo in German translation (liefert repeated twice) 2015-10-19 23:21:44 +03:00
Aleks-Daniel Jakimenko-Aleksejev
7f0f8164bd Biscecting -> Bisection (typo) 2015-10-19 23:17:39 +03:00
Aleks-Daniel Jakimenko-Aleksejev
236e6a4c85 Updates to Russian translation 2015-10-19 23:12:58 +03:00
Aleks-Daniel Jakimenko-Aleksejev
669043e7a9 More fixes for $ShowAll
Some modules had to be fixed too!
'showedit' and 'rollback' are not used in modules, so there's nothing to fix.
2015-10-18 03:14:57 +03:00
Aleks-Daniel Jakimenko-Aleksejev
e57372692e “link blow” → “link below” 2015-10-18 02:48:06 +03:00
Aleks-Daniel Jakimenko-Aleksejev
512afd75a0 Fixed $ShowAll and $ShowEdits, added $ShowRollbacks
It seems like the $ShowEdits feature was half broken (not all occurrences
actually defaulted to its value). In the last commit I made the same mistake
with $ShowAll. This is now fixed.
Also, for completeness, I decided to add $ShowRollbacks as well.
2015-10-18 01:21:07 +03:00
Aleks-Daniel Jakimenko-Aleksejev
b5c51d19ba New $ShowAll variable
Sometimes you might want to “List all changes” and “Include minor
changes” by default. We can already change the default value of the
latter by using the $ShowEdits variable, but the former could not be set
from the config file. Now we have the $ShowAll variable.
2015-10-18 00:49:57 +03:00
Alex Schroeder
0955dcbc97 make translations
This adds the subheaders to all the translation files.

for f in modules/translations/*-utf8.pl; do
  perl -e "sub AddModuleDescription { print shift, ' ' };
           do '$f';
	   \$i = 0;
	   map { \$_ || \$i++} values %Translate;
	   printf(qq{%d/%d translations missing\n}, \$i, scalar keys %Translate);";
done

brazilian-portuguese-utf8.pl 237/672 translations missing
bulgarian-utf8.pl 496/672 translations missing
catalan-utf8.pl 259/672 translations missing
chinese-utf8.pl 254/672 translations missing
chinese_cn-utf8.pl 199/672 translations missing
dutch-utf8.pl 459/672 translations missing
finnish-utf8.pl 436/672 translations missing
french-utf8.pl 177/672 translations missing
german-utf8.pl 13/672 translations missing
greek-utf8.pl 242/672 translations missing
hebrew-utf8.pl 555/672 translations missing
italian-utf8.pl 425/672 translations missing
japanese-utf8.pl 5/237 translations missing
korean-utf8.pl 393/672 translations missing
fixme-utf8.pl 672/672 translations missing
polish-utf8.pl 239/672 translations missing
portuguese-utf8.pl 389/672 translations missing
romanian-utf8.pl 514/672 translations missing
russian-utf8.pl 297/672 translations missing
serbian-utf8.pl 526/672 translations missing
spanish-utf8.pl 246/672 translations missing
swedish-utf8.pl 372/672 translations missing
ukrainian-utf8.pl 347/672 translations missing
2015-10-17 23:00:26 +02:00
Alex Schroeder
5ee3adf13f Switched to Palatino
On a phone with little RAM, downloading Noticia takes too much time.
2015-10-16 08:44:17 +02:00
Alex Schroeder
5a237d05f7 Translation: "为 %s 建立锁定 。\t"
Removed trailing tab.
2015-10-15 20:12:19 +02:00
Alex Schroeder
bd5d419472 Translation: " . . . . "
Removed trailing whitespace.
2015-10-15 20:05:23 +02:00
Alex Schroeder
cdb66e1ed4 Translation: " ... " is no longer translatable 2015-10-15 19:35:08 +02:00
Alex Schroeder
d5cd6cbd65 Translations: removed trailing whitespace
Various translation files still had translated strings with trailing
whitespace where the English original did not have any. This has been
removed.
2015-10-15 19:32:12 +02:00
Alex Schroeder
e85ddcc9b9 Translation: "Comments on " and "Comment on"
Removed trailing whitespace.
2015-10-15 19:29:35 +02:00
Alex Schroeder
ac4948ca5d Translations: removed trailing whitespace
Various translation files still had translated strings with trailing
whitespace where the English original did not have any. This has been
removed.
2015-10-15 19:23:42 +02:00
Alex Schroeder
6bffdc8149 Trans.: "Consider banning the IP number as well: "
Remove trailing whitespace. In the German translation, I also replaced a
few instances of "sie" with "Sie".
2015-10-15 19:18:55 +02:00
Alex Schroeder
db814c627a Transl.: "Module count (only testable modules): "
Remove trailing whitespace.
2015-10-15 19:15:18 +02:00
Alex Schroeder
b156e08d85 Translation: "Summary of your changes: "
Remove trailing whitespace.
2015-10-15 19:13:13 +02:00
Alex Schroeder
cad08ee17c Translation: "Define external redirect: "
Removed trailing whitespace.
2015-10-15 19:10:59 +02:00
Alex Schroeder
7c04ee83e5 Translation: "Internal Page: "
This string was not used correctly. T('Internal Page: ' . $resolved)
concatenates the string with $resolved and only then translates the
result, so the translation will never be found. The
correct usage is as follows: Ts('Internal Page: %s', $resolved). The
translation string has therefore been changed to 'Internal Page: %s' and
the translation files have been fixed accordingly.
2015-10-15 19:07:51 +02:00
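The difference can be sketched with a simplified lookup table (the real T and Ts in wiki.pl are more involved; the %Translate entries here are stand-ins):

```perl
use strict;
use warnings;

# Simplified stand-ins for Oddmuse's translation helpers.
my %Translate = ('Internal Page: %s' => 'Interne Seite: %s');

sub T  { my ($s) = @_; $Translate{$s} || $s }
sub Ts { my ($s, @args) = @_; sprintf(T($s), @args) }

my $resolved = 'HomePage';
# Wrong: 'Internal Page: HomePage' is never a key in %Translate,
# so the lookup silently falls back to the English string.
print T('Internal Page: ' . $resolved), "\n";
# Right: look up the '%s' template first, then interpolate.
print Ts('Internal Page: %s', $resolved), "\n";
```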
Alex Schroeder
844d984526 Translation: ", see "
Removed trailing whitespace.
2015-10-15 15:01:51 +02:00
Alex Schroeder
e91797fcba Translation: "Name: " and "URL: "
Removed trailing whitespace.
2015-10-15 14:55:54 +02:00
Alex Schroeder
8ad1c60817 Translation: remove trailing whitespace
The German translation contained a stray trailing whitespace.
2015-10-15 14:47:49 +02:00
Alex Schroeder
e24f853bef Translation: "Email: "
Removed trailing whitespace.
2015-10-15 14:41:55 +02:00
Alex Schroeder
cfceb84cc6 Translation: "Trail: "
Removed trailing whitespace.
2015-10-15 14:36:51 +02:00
Alex Schroeder
e81234d81f Translation: "Translated page: "
Removed trailing whitespace.
2015-10-15 14:34:28 +02:00
Alex Schroeder
5a647e6042 Translation: "This page is a translation of %s. "
Removed trailing whitespace.
2015-10-15 14:27:17 +02:00
Alex Schroeder
a53f3e390f Translation: "Title: "
Removed trailing whitespace.
2015-10-15 14:24:08 +02:00
Alex Schroeder
d9d213b6b3 Translation: "Trags: "
Removed trailing whitespace.
2015-10-15 14:19:16 +02:00
Alex Schroeder
09c5351a11 Translation: fix Return to
This used to be a string with trailing whitespace, but the actual use was
wrong: T('Return to ' . NormalToFree($id)) – this concatenates the page
name and then attempts to translate the result, which never works. The
correct usage is Ts('Return to %s', NormalToFree($id)).
2015-10-14 12:39:55 +02:00
Alex Schroeder
f0fc2f2f29 Translation: File to upload: without trailing SPC 2015-10-14 12:35:23 +02:00
Alex Schroeder
7d6138107f Translation: not deleted: without trailing space 2015-10-14 12:32:25 +02:00
Alex Schroeder
446f587a49 Translation: Cookie: no longer required
With or without trailing space, this text is no longer required.
2015-10-14 12:28:22 +02:00
Alex Schroeder
0872ee501e Translation: %s: without trailing space
We can still translate %s: to %s : for the French.
2015-10-14 12:24:31 +02:00
Alex Schroeder
e4f7500340 month-names, national-days: delete cruft
An error in the Makefile treated all *.pl files in the translations
directory as translation files -- including these other files which are
not regular translation files.
2015-10-14 12:23:18 +02:00
Alex Schroeder
a429bb6a4b alex-2015.css: switched fonts
Removed the @font-face rules that downloaded Noticia Text and Symbola
from the net. This was slowing down access from old mobile phones.
Instead, I'm now using a font-family of "Palatino Linotype", "Book
Antiqua", Palatino, serif.
2015-10-14 10:29:45 +02:00
Aleks-Daniel Jakimenko
ce2a39d8f1 Allow custom setting for --jobs in make test 2015-10-13 03:21:11 +03:00
Aleks-Daniel Jakimenko
81a4dbcdcd Alexine: more comfortable default paths 2015-10-13 02:26:00 +03:00
Alex Schroeder
33a3f515a3 big-brother.t: more robust under heavy load
The previous fix was no good. Now attempting a different fix.
2015-10-12 15:45:09 +02:00
Alex Schroeder
6372907c4b Parallelize tests
Using random numbers to generate new test-directories for every test
file. Use t/setup.pl to reset.
2015-10-12 15:13:22 +02:00
Alex Schroeder
a8b7b67efe Use warnings
search.t: Using braces without escaping them in regular expressions
triggers a warning. wiki.pl will now quotemeta the replacement string
when highlighting changes.

upgrade-files.t: Only remove the old UseMod directory if it actually
exists in order to fix some warnings.

wiki.pl: only reading the log file when open actually succeeded in order
to fix some warnings.
2015-10-12 15:12:20 +02:00
Alex Schroeder
b0f9722857 recaptcha.t: use Captcha::reCAPTCHA always
This module is no longer optional. The test will not skip.
2015-10-12 15:09:15 +02:00
Alex Schroeder
d5fda299b0 Some tests now more robust under heavy load
When running tests with four jobs on a laptop with just two cores, load
is heavy and some tests may fail. Trying to make them more robust...

- big-brother.t
- captcha.t
2015-10-12 15:08:08 +02:00
Alex Schroeder
725e121731 meta.t: perl -c only takes one file 2015-10-12 14:48:51 +02:00
Alex Schroeder
ad299b6b1d server.pl: instructions for the plugin to use
I've submitted some patches for Mojolicious::Plugin::CGI and the
comments now point to a fork.
2015-10-11 20:02:37 +02:00
Aleks-Daniel Jakimenko
8439566b01 aawrapperdiv.pl deleted 2015-10-11 12:19:29 +03:00
Alex Schroeder
ef4263cf03 Adding wiki.log to .gitignore 2015-10-09 00:48:07 +02:00
Alex Schroeder
464a6e9af1 server.pl is a Mojolicious server for Oddmuse 2015-10-09 00:47:31 +02:00
Aleks-Daniel Jakimenko
7152fa0a54 WikiConfigFile and WikiModuleDir ENV variables
Currently the config file and modules are supposed to be in $DataDir,
which does not make any sense from a security point of view. Files with
code should not be in directories that are writable by www-data.

Previously you had to use a wrapper script to work around that. Now we
provide special variables.

Please note that Oddmuse will sometimes cache data using Storable. Such
a cache is saved to disk and then read back when required. This,
however, is insecure, given the risk that the file is maliciously
manipulated by www-data.
2015-10-06 04:53:21 +03:00
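The shape of the change can be sketched as follows (the environment variable names come from the commit; the fallback paths are assumptions for illustration):

```perl
use strict;
use warnings;

# Sketch: let the web server point Oddmuse at code that lives outside
# the www-data-writable data directory. When the environment variables
# are not set, both fall back to $DataDir as before.
my $DataDir    = '/var/lib/oddmuse';                        # assumed path
my $ConfigFile = $ENV{WikiConfigFile} || "$DataDir/config";
my $ModuleDir  = $ENV{WikiModuleDir}  || "$DataDir/modules";
print "config: $ConfigFile\nmodules: $ModuleDir\n";
```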
Alex Schroeder
da5c5a8275 New make target: new-utf8.pl
Changes to oddtrans make sure that lines starting with # are comments
and will be stripped when the translation file is read. When writing new
translation files, comments are added to indicate which files are being
processed right now. This will help translators figure out where the
texts originated from. Note that every key appears only once, so
translations will be missing in the section for later files if they
appeared in earlier sections.

Recreated new-utf8.pl in order to illustrate the new format.
2015-09-30 17:35:02 +02:00
Aleks-Daniel Jakimenko
ee52a25ebf ban.t: fix number of tests 2015-09-30 16:27:40 +03:00
Aleks-Daniel Jakimenko
e831c10cd3 ban.t: removing more strange-spam tests 2015-09-30 16:00:30 +03:00
Aleks-Daniel Jakimenko
9a6da39aaf strange-spam.pl module was deleted, deleting tests as well 2015-09-28 11:24:58 +03:00
Aleks-Daniel Jakimenko
99b819dd68 strange-spam.pl: Module deleted
The wiki says that this module is obsolete. If it is, then there is no need to
keep it in our repo.
2015-09-28 11:21:18 +03:00
Aleks-Daniel Jakimenko
0b42ed0508 private-wiki.pl: missing our 2015-09-23 21:30:54 +03:00
Aleks-Daniel Jakimenko
3b6d891dc7 Stop leaving locks behind
Previously, if a user cancelled his request (simply by pressing the Stop
button in his browser), the script would receive a TERM signal or the like.

This means that some locks could be left behind, which required someone
to unlock the wiki manually (by using Unlock Wiki action).

Now we remove these locks automatically!

However, some tasks might want to handle such situations gracefully. That's
why the %LockCleaners hash was added. Use the lock name as the key and a
coderef as the value. If SIGTERM (or one of several other signals) is
received, this code will run and, presumably, clean everything up. The
Private Wiki Extension was changed accordingly, so you can see it in action.

Also, tests added!
2015-09-23 21:07:02 +03:00
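A sketch of the %LockCleaners mechanism described above (simplified; the real wiki.pl also releases the lock directories themselves and handles more signals, and the helper and lock names here are hypothetical):

```perl
use strict;
use warnings;

our %LockCleaners;

# Install one handler for the signals a cancelled CGI request can cause.
sub InstallCleanupHandler {
  my $handler = sub {
    # Run every registered cleaner before exiting.
    $_->() for values %LockCleaners;
    exit 1;
  };
  $SIG{$_} = $handler for qw(TERM INT HUP);
}

# A module registers a cleaner under its lock name:
$LockCleaners{'private wiki'} = sub { print "cleaning up\n" };
InstallCleanupHandler();
```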
Alex Schroeder
e77abbc09f big-brother.pl: Make sure restricted URL is used 2015-09-21 17:56:02 +02:00
Alex Schroeder
61a2238d9a Rollback fixes for $KeepDays = 0
RollbackPossible needs to handle the situation where $KeepDays == 0.
DoRollback used to examine all the pages where rollback was
possible (using $KeepDays) but in order to avoid the special case where
$KeepDays == 0, we can also examine all the pages changed after the
target timestamp $to.
2015-09-21 09:58:38 +02:00
Alex Schroeder
0322deaf82 maintain.t: Fix for $KeepDays = 0 2015-09-21 09:42:30 +02:00
Alex Schroeder
044a6ad835 preview.pl: remove trailing whitespace 2015-09-21 09:41:00 +02:00
Alex Schroeder
86fe0193b7 preview.pl: fix typo
'$id' is not interpolated...
2015-09-21 09:39:40 +02:00
Alex Schroeder
2517928c1e conflict.t: typo 2015-09-21 09:34:00 +02:00
Alex Schroeder
971f4b1579 history.t: Fix for $KeepDays = 0 2015-09-21 09:27:48 +02:00
Alex Schroeder
c0f0b970a6 preview.pl: Add admin menu 2015-09-21 09:24:53 +02:00
Alex Schroeder
fa493d7360 preview.pl: New module to report changes in HTML 2015-09-21 09:12:32 +02:00
Alex Schroeder
831de74800 Print cache even if just have a single clean block
Once again, bitten by Perl. We used to print a cache "if ($Page{blocks}
and $Page{flags} and GetParam('cache', $UseCache) > 0)" -- but if we
have exactly one clean block, then flags will be "0" which is false. So
now we're testing for defined $Page{flags}.
2015-09-21 09:09:14 +02:00
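The pitfall in isolation (a sketch, not the actual caching code):

```perl
use strict;
use warnings;

# With exactly one clean block, flags is the string "0": defined, but false.
my %Page = (blocks => '<p>cached html</p>', flags => '0');

print "old test wrongly skips the cache\n" if !($Page{blocks} and $Page{flags});
print "new test prints the cache\n" if $Page{blocks} and defined $Page{flags};
```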
Alex Schroeder
755010f619 rollback.t: Fix tests given $KeepDays default
As $KeepDays now defaults to 0, more changes are required to make these
tests work again.
2015-09-20 14:03:18 +02:00
Alex Schroeder
44c7102dd5 rollback.t: fix many tests
This is done by appending "$KeepDays = 14" to the config file whenever
it gets written. This restores the old behaviour and thus "fixes" the
tests.
2015-09-20 11:44:37 +02:00
Aleks-Daniel Jakimenko
e731c16214 Fix crossbar.t after 858ff72
It seems like the test needs an empty page (not exactly HomePage,
which now returns "Welcome!").
2015-09-19 23:02:29 +03:00
Aleks-Daniel Jakimenko
b3b6eeb2bd Explanation for $KeepDays = 0 (0 means forever) 2015-09-19 22:40:12 +03:00
Aleks-Daniel Jakimenko
fc8c0e66a7 Let's have a working page history (no more ForgiveAndForget by default)
There was a huge discussion with a lot of tension:
https://oddmuse.org/wiki/Revolutionary_Changes#ForgiveAndForget
And also the comments:
https://oddmuse.org/wiki/Comments_on_Revolutionary_Changes#ForgiveAndForget

But in the end, it is safer to have a history which is not broken.
Don't get it wrong, ForgiveAndForget is still a good thing, it's just not what
we should do *by default*.

If your wiki does benefit from ForgiveAndForget, then add this to your config:
$KeepDays = 14;

Although this change solves a couple of important problems, it does not
address the new ones that arise without ForgiveAndForget. Namely, it does not
resolve the problem of deleting stuff when you *really* have to do it. For
example, a [[DMCA Extension]] (or a similarly named extension with the same
purpose) should be developed. These problems have existed for a long time,
because people were using “$KeepDays = 0” a lot. It is just that now we take
wikis with no ForgiveAndForget more seriously.

In other words, this commit is just part of the bigger change.

Why don't we set it to 5 years? Because then it will be a time bomb that will
be triggered unexpectedly. We should have a more predictable default value.
2015-09-19 22:18:15 +03:00
Alex Schroeder
6207434f19 Make sure the "Welcome!" message is shown 2015-09-19 18:30:51 +02:00
Aleks-Daniel Jakimenko
9ecfe306cb load-lang.pl: missing translations, meta test 2015-09-19 08:28:05 +03:00
Aleks-Daniel Jakimenko
a4dd2b8b0a load-lang.pl: Use $ModuleDir/translations by default
Modules are not loaded recursively, so we are free to use any directory inside
$ModuleDir. It is also where translations are located in the git repo.

Also, %library was renamed to %TranslationsLibrary (which is now "our"). This
is required for tests and for custom configuration.
2015-09-19 07:57:33 +03:00
Aleks-Daniel Jakimenko
0868f3a98e Workaround for utf8::decode bug (sometimes utf8 chars were not decoded)
Remember the problem with toc.pl when the whole page was *sometimes* not
utf8-decoded? There were some thoughts that it might be associated with
memory files, and it is correct. Although I was not able to narrow it down
last time, now I did (simply because this problem appeared elsewhere).

If you look at $output variable after utf8::decode with Devel::Peek, you
will see two variants of flags. This one looks good:
   FLAGS = (PADMY,POK,pPOK,UTF8)
And this one is wrong:
   FLAGS = (PADMY,POK,pPOK)
This problem is weird because it works inconsistently. Most of the time
you will get correct output, but sometimes it will be broken.

Someone has to golf it down to something short in order to submit a perl
bug report. This, however, does not look like a simple task.

The current workaround is as stupid as it looks, but it works.
Somehow assigning the value to another variable solves the problem (which,
by the way, is similar to how other perl string-related problems get solved).
2015-09-16 04:13:02 +03:00
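The workaround has this shape (a sketch of the idiom only; the helper name is hypothetical, and since the underlying bug is intermittent this does not reproduce it):

```perl
use strict;
use warnings;

# Copy the buffer into a fresh scalar before decoding; per the commit,
# the extra assignment is what sidesteps the inconsistent behaviour.
sub DecodeOutput {
  my ($output) = @_;
  my $copy = $output;
  utf8::decode($copy);
  return $copy;
}
```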
Aleks-Daniel Jakimenko
316471b145 server.pl: just whitespace 2015-09-15 04:13:24 +03:00
Alex Schroeder
586972c71d atom.t: Use our own server for this test
We no longer require an existing webserver running a wiki at
http://localhost/cgi-bin/wiki.pl. Instead, we're running our own
stuff/server.pl on a random port and use it for testing -- and we kill
it when we're done.
2015-09-14 13:01:22 +02:00
Alex Schroeder
f6954c4a2e server.pl simplified and with license 2015-09-14 12:40:09 +02:00
Alex Schroeder
3b0d8c9bd6 server.pl: a stand-alone Oddmuse wiki server 2015-09-13 22:40:05 +02:00
Alex Schroeder
3e60aa8e1b atom.pl: use PUTDATA 2015-09-12 19:06:59 +02:00
Alex Schroeder
b3f865a4ab Make Atom tests mandatory
You must have a wiki running at http://localhost/cgi-bin/wiki.pl and it
must have the Atom Extension installed.
2015-09-12 19:06:09 +02:00
Alex Schroeder
64568025c9 Don't skip the tests if XML::Atom is missing 2015-09-12 18:42:27 +02:00
Alex Schroeder
9472a279ea Lots of tests for preview pagination 2015-09-12 00:05:30 +02:00
Alex Schroeder
e0fdeffc94 Link license 2015-09-11 22:18:38 +02:00
Aleks-Daniel Jakimenko
d1d70be583 No more trailing whitespace (again!), meta test added 2015-09-11 18:07:52 +03:00
Alex Schroeder
6260033669 Add pagination to search and replace preview 2015-09-11 15:15:00 +02:00
Alex Schroeder
57a4132512 Fix test for malformed regular expression search 2015-09-11 15:14:34 +02:00
Alex Schroeder
3e16b45dbb Fix ReplaceAndDiff calling convention
ReplaceAndDiff calls Replace, which loops over all pages. That's why we
don't need to call it from SearchTitleAndBody -- that would make our code
run way too often.
2015-09-11 14:34:01 +02:00
Alex Schroeder
21392f2f1b Fix issue with malformed regular expressions
If the regular expression cannot be compiled using eval { qr/$re/ } we
just use quotemeta($re) instead.
2015-09-11 14:12:29 +02:00
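The fallback can be sketched like this (the helper name is hypothetical; wiki.pl applies the same idea inline):

```perl
use strict;
use warnings;

# Try to compile the user's pattern; fall back to a literal match
# (quotemeta) if the pattern is malformed.
sub CompileSearch {
  my ($re) = @_;
  my $compiled = eval { qr/$re/ };
  return defined $compiled ? $compiled : qr/\Q$re\E/;
}

print "match\n" if 'a[b' =~ CompileSearch('[');   # literal fallback
```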
Alex Schroeder
0fd86ee60d Preview button for search and replace
Also, more use of $func->() instead of &$func() syntax.
2015-09-11 14:05:35 +02:00
Aleks-Daniel Jakimenko
fd42ebf9c3 With /x, # has a special meaning (escape it!) 2015-09-11 02:55:18 +03:00
Aleks-Daniel Jakimenko
3180e5b02a smarttitles.pl: allow patterns in #SUBURL
(with a colon) to get interlinks interpreted, but now any link pattern will
be parsed in regular #SUBURL.
2015-09-11 00:41:15 +03:00
Alex Schroeder
26d3852f30 campaignwiki.org uses HTTPS 2015-09-10 08:36:29 +02:00
Alex Schroeder
ad54fda317 light.css: validator says some stuff is invalid 2015-09-07 10:42:05 +02:00
Aleks-Daniel Jakimenko
dca0c75e34 AGPL for Alexine scripts 2015-09-07 05:00:20 +03:00
Aleks-Daniel Jakimenko
ca0f12697b askpage.pl: forgot to add some variables to our (...) 2015-09-07 03:47:23 +03:00
Aleks-Daniel Jakimenko
b8ae7e0817 askpage.pl: changed according to recent oddmuse changes 2015-09-07 03:42:32 +03:00
Aleks-Daniel Jakimenko
3e91bdc75e Do not tell people to write a comment if they're doing it
“There are no comments, yet. Be the first to leave a comment!” – that's what
you will see when you preview your comment on an empty page.

Since the user is already doing it, there is no need to say so. Also, it
may look like the message is part of the preview.

We no longer do that (after this commit). In other words, the preview should
look exactly like the resulting page.
2015-09-07 03:32:55 +03:00
Alex Schroeder
e25a621e6e Big changes to how diffs are generated
The original issue was that, when looking at all changes (action=rc all=1),
the resulting diff didn't always make sense if you clicked on the diff link:
it showed the difference between that revision and the current revision.
The PrintHtmlDiff sub was changed significantly to make it easier to
understand and to help fix this issue.

The drawback is that it now requires a new key in page and keep files:
lastmajorsummary. It goes with lastmajor and diff-major and records the
summary for that particular edit. As new changes will start recording
this new key, the change will slowly propagate in existing wikis.
Whenever you look at minor diffs, however, the existing summary key is
chosen. Plus, whenever you want to look at differences between
particular revisions, this is equivalent to looking at minor diffs. So
the only situation that is problematic is an edit history like the
following:

A - major change
B - major change (major diff, major summary, last major revision)
C - minor change

When looking at this page with diff=2, we want to show major diff, major
summary, last major revision. If B happened before this commit was
installed, the summary will be missing.
2015-09-06 14:32:36 +02:00
Alex Schroeder
de6a3f1d0c Add comment label back
Commit:8d4c15e removed our “Write your comment here:” label. This
commit adds it back.
2015-09-06 08:27:43 +02:00
Alex Schroeder
bf00a9ea04 Merge remote-tracking branch 'origin/return-objects' 2015-09-06 08:10:46 +02:00
Aleks-Daniel Jakimenko
f8ac7a2818 aawrapperdiv.pl: wrap PrintFooter correctly 2015-09-06 02:55:36 +03:00
Aleks-Daniel Jakimenko
1cd33b691c Fix for issue #1 on github
Changing everything to return objects is a worthy goal, but for now we have
taken enough destructive steps towards it. Therefore, this commit fixes the
problem in a backwards-compatible way (by adding one more parameter to the
signatures).

Note that this additional parameter is NOT a timestamp, it is a whole page
object. Which means that we are still moving towards our goal of using page
objects everywhere, this commit is just doing it in a backwards-compatible
way.
2015-09-06 01:10:29 +03:00
Aleks-Daniel Jakimenko
9d7e5b43c0 Test for Issue #1 on github 2015-09-05 23:54:43 +03:00
Alex Schroeder
ceca41d85c google-plus-one.pl: fix plusone action
Privacy Badger is acting up and I think we're better off creating the
buttons dynamically.
2015-09-04 14:00:44 +02:00
Aleks-Daniel Jakimenko
1c4e082755 Return objects where it begs for it
sub ParseData is fully backwards compatible. If some module runs it in list
context, then it will get a listified hash, as before. New code should
always run it in scalar context, though (everything in our code base was
changed accordingly).

sub GetTextRevision is not backwards compatible (don't let the “wantarray”
usage confuse you). Most modules do not touch that subroutine, so we are
probably fine (modules from our git repo that do use it were changed
accordingly).

“EncodePage(%$page)” looks wrong. It seems like we should change it to
accept a hash ref.
2015-09-04 04:55:48 +03:00
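The backwards-compatible return can be sketched as follows (simplified; the real ParseData also parses the raw file data, and the sample fields here are stand-ins):

```perl
use strict;
use warnings;

# Return a "page object" (hash ref) in scalar context, and a listified
# hash in list context, so older modules keep working unchanged.
sub ParseData {
  my %data = (text => 'hello', ts => 12345);   # stand-in for parsed data
  return wantarray ? %data : \%data;
}

my %page = ParseData();   # old style: listified hash
my $page = ParseData();   # new style: hash ref ("page object")
print $page->{text}, "\n";
```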
Alex Schroeder
aec340b401 rollback.t: add another 1s sleep
Trying to solve an issue: sometimes the test fails on Alex Daniel's
test server but never on Alex Schroeder's laptop. The output of Recent
Changes being tested has no rollback button for one of the page links.
Actually, the last six edits have no rollback button:

12:34 UTC (diff) MinorPage . . . . 127.0.0.1 – Rollback to 2015-09-01 12:34 UTC (minor)
12:34 UTC (diff) AnotherEvilPage . . . . 127.0.0.1 – Rollback to 2015-09-01 12:34 UTC (minor)
12:34 UTC (diff) OtherPage . . . . 127.0.0.1 – Rollback to 2015-09-01 12:34 UTC
12:34 UTC (diff) NicePage . . . . 127.0.0.1 – Rollback to 2015-09-01 12:34 UTC
12:34 UTC (diff) EvilPage . . . . 127.0.0.1 – Rollback to 2015-09-01 12:34 UTC
12:34 UTC (diff) MinorPage . . . . 127.0.0.1 – testerror (minor)

Note that this includes the "testerror" minor edit which is about to
be rolled back. Perhaps that's because this should hold in
RollbackPossible and it does not: $ts != $LastUpdate. $ts would be the
timestamp of the testerror edit and $LastUpdate would be the timestamp
of the rollback. I've added another 1s sleep between these two.
2015-09-02 13:41:12 +02:00
Alex Schroeder
3ea87c007d The parameter days must be numeric 2015-08-31 11:04:22 +02:00
Alex Schroeder
4d8b028e2d test for wiping comments with "0" and fix 2015-08-29 11:57:29 +02:00
Aleks-Daniel Jakimenko
31c02d6e95 oddmuse-quickstart: some progress 2015-08-26 07:05:44 +03:00
Aleks-Daniel Jakimenko
26bf8a3043 oddmuse-quickstart: progress (still not ready) 2015-08-25 07:09:37 +03:00
Aleks-Daniel Jakimenko
ac21a8e6a4 Group pages with comment pages in page index 2015-08-25 04:14:16 +03:00
Aleks-Daniel Jakimenko
a000937768 https links in README 2015-08-24 03:00:56 +03:00
Aleks-Daniel Jakimenko
4eef4d2d76 No more /o, modifiers sorted alphabetically 2015-08-23 21:22:12 +03:00
Alex Schroeder
92410a1f5c add-link.pl: Fix footer 2015-08-23 13:32:41 +02:00
Aleks-Daniel Jakimenko
aa89d08e08 atom.pl: use XML::Atom explicitly
So that it is easier to find the required dependency
2015-08-20 14:59:00 +03:00
Aleks-Daniel Jakimenko
244ddb5157 run-tests: fixed wrong git path 2015-08-20 07:19:06 +03:00
Aleks-Daniel Jakimenko
9c3456c963 run-tests: do push as well 2015-08-20 06:54:41 +03:00
Aleks-Daniel Jakimenko
ad9afbf5ba GPL license for Alexine scripts 2015-08-20 06:47:53 +03:00
Aleks-Daniel Jakimenko
bc079133f7 New script new-release (autoupdate source links) 2015-08-20 06:45:32 +03:00
Aleks-Daniel Jakimenko
69a0f3ed23 Alexine image 2015-08-19 11:28:30 +03:00
Aleks-Daniel Jakimenko
1fc3600329 run-tests: print only 7 characters of a commit 2015-08-19 11:19:31 +03:00
Aleks-Daniel Jakimenko
c1141cd610 run-tests: another repository link
This repository will not only hold test data, but it
will also have some other files associated with Alexine bot.
2015-08-19 11:12:36 +03:00
Aleks-Daniel Jakimenko
300d86b2cd run-tests: fixed newlines 2015-08-19 11:10:48 +03:00
Aleks-Daniel Jakimenko
d609a857c0 run-tests: fixed typo, OK status edits are now minor 2015-08-19 10:57:31 +03:00
Aleks-Daniel Jakimenko
8e98298777 run-tests: fix wikiput path 2015-08-19 10:47:49 +03:00
Aleks-Daniel Jakimenko
0642fad8f8 Afterfix for 5462b21 (disallow minor comments)
Test added as well
2015-08-19 10:17:49 +03:00
Aleks-Daniel Jakimenko
d10d76c475 run-tests: secret key specified 2015-08-19 10:05:45 +03:00
Aleks-Daniel Jakimenko
8aa2f04995 New run-tests script (part of Alexine) 2015-08-19 08:53:21 +03:00
179 changed files with 10873 additions and 8887 deletions

.gitignore

@@ -1,8 +1,10 @@
*~
/build/
\#*\#
/test-data
/test-data*
/Mac/pkg/
*.dmg
*.pkg
.DS_Store
wiki.log
.prove


@@ -3,7 +3,7 @@
# subdirectory.
VERSION_NO=$(shell git describe --tags)
TRANSLATIONS=$(wildcard modules/translations/[a-z]*.pl$)
TRANSLATIONS=$(wildcard modules/translations/[a-z]*-utf8.pl$)
MODULES=$(wildcard modules/*.pl)
BUILD=build/wiki.pl $(foreach file, $(notdir $(MODULES)) $(notdir $(TRANSLATIONS)), build/$(file))
@@ -38,11 +38,21 @@ build/month-names-%.pl: modules/translations/month-names-%.pl
build/%.pl: modules/%.pl
perl -lne "s/(AddModuleDescription\('[^']+', '[^']+')\)/\$$1, undef, '$(VERSION_NO)')/; print" < $< > $@
modules/translations/new-utf8.pl: wiki.pl $(MODULES)
cp $@ $@-old
perl stuff/oddtrans -l $@-old wiki.pl $(MODULES) > $@
rm -f $@-old
translations: $(TRANSLATIONS)
for f in $^; do \
echo updating $$f...; \
perl oddtrans -l $$f wiki.pl $(MODULES) > $$f-new && mv $$f-new $$f; \
perl stuff/oddtrans -l $$f wiki.pl $(MODULES) > $$f-new && mv $$f-new $$f; \
done
# Running four jobs in parallel, but clean up data directories without
# race conditions!
jobs ?= 4
test:
prove t
prove t/setup.pl
prove --jobs=$(jobs) --state=slow,save t


@@ -1,5 +1,5 @@
This is the README file distributed together with the
[[http://oddmuse.org/|Oddmuse]] script.
[[https://oddmuse.org/|Oddmuse]] script.
== Installing Oddmuse on a Debian System running Apache
@@ -82,7 +82,7 @@ putting their names in {{{[[double square brackets]]}}}.
Enjoy your wiki experience.
Visit http://www.oddmuse.org/ to learn more about the translation
Visit https://www.oddmuse.org/ to learn more about the translation
files and modules that are part of this package.
== Apache
@@ -136,7 +136,7 @@ simply restart it all:
sudo service apache2 graceful
}}}
----------------------------------------------------------------------
== License
Permission is granted to copy, distribute and/or modify this document
under the terms of the GNU Free Documentation License, Version 1.1 or
@@ -153,5 +153,7 @@ MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
General Public License for more details.
Both the GNU Free Documentation License, and the GNU General Public
License are distributed together with this script. See the files FDL
and GPL, respectively.
License are distributed together with this script. See the files
[[https://github.com/kensanata/oddmuse/blob/master/FDL|FDL]] and
[[https://github.com/kensanata/oddmuse/blob/master/GPL|GPL]],
respectively.


@@ -1,4 +1,4 @@
The files in this directory are used to run http://campaignwiki.org/
The files in this directory are used to run https://campaignwiki.org/
add-link.pl
===========
@@ -8,7 +8,7 @@ bookmark site: A few pages make up a big unordered list of links in
wiki format. add-link is a tool to help users contribute new links to
the list.
http://campaignwiki.org/wiki/LinksToWisdom/HomePage
https://campaignwiki.org/wiki/LinksToWisdom/HomePage
copy.pl
=======
@@ -17,7 +17,7 @@ This is used to copy the text from a web page to a wiki page. The idea
was to keep archive copies of cool pages somewhere. The Blog Archive
never got used, though.
http://campaignwiki.org/wiki/BlogArchive/HomePage
https://campaignwiki.org/wiki/BlogArchive/HomePage
monster-tag.pl
==============
@@ -25,7 +25,7 @@ monster-tag.pl
This is used to quickly tag many pages in the Monsters wiki. The
Monsters wiki hasn't been used in a long time, though.
http://campaignwiki.org/wiki/Monsters/HomePage
https://campaignwiki.org/wiki/Monsters/HomePage
submit.pl
=========
@@ -34,4 +34,4 @@ This used to be used to add sites to the Old School RPG Planet. The
aggregator was configured via a wiki page on the Planet wiki. It's now
abandoned.
http://campaignwiki.org/wiki/Planet/HomePage
https://campaignwiki.org/wiki/Planet/HomePage


@@ -292,6 +292,7 @@ sub main {
Init(); # read config file (no modules!)
$ScriptName = $site; # undo setting in the config file
$FullUrl = $site; #
InitPageVariables(); # call again: $ScriptName was wrong
binmode(STDOUT,':utf8');
$q->charset('utf8');
if ($q->path_info eq '/source') {


@@ -1,110 +1,7 @@
/* This file is in the public domain. */
/* @import url(https://fonts.googleapis.com/css?family=Noticia+Text:400,400italic,700italic,700&subset=latin,latin-ext); */
/* vietnamese */
@font-face {
font-family: 'Noticia Text';
font-style: normal;
font-weight: 400;
src: local('Noticia Text'), local('NoticiaText-Regular)'), url('/fonts/NoticiaText-Regular.woff') format('woff');
unicode-range: U+0102-0103, U+1EA0-1EF1, U+20AB;
}
/* latin-ext */
@font-face {
font-family: 'Noticia Text';
font-style: normal;
font-weight: 400;
src: local('Noticia Text'), local('NoticiaText-Regular)'), url('/fonts/NoticiaText-Regular.woff') format('woff');
unicode-range: U+0100-024F, U+1E00-1EFF, U+20A0-20AB, U+20AD-20CF, U+2C60-2C7F, U+A720-A7FF;
}
/* latin */
@font-face {
font-family: 'Noticia Text';
font-style: normal;
font-weight: 400;
src: local('Noticia Text'), local('NoticiaText-Regular)'), url('/fonts/NoticiaText-Regular.woff') format('woff');
unicode-range: U+0000-00FF, U+0131, U+0152-0153, U+02C6, U+02DA, U+02DC, U+2000-206F, U+2074, U+20AC, U+2212, U+2215, U+E0FF, U+EFFD, U+F000;
}
/* vietnamese */
@font-face {
font-family: 'Noticia Text';
font-style: normal;
font-weight: 700;
src: local('Noticia Text Bold'), local('NoticiaText-Bold)'), url('/fonts/NoticiaText-Bold.woff') format('woff');
unicode-range: U+0102-0103, U+1EA0-1EF1, U+20AB;
}
/* latin-ext */
@font-face {
font-family: 'Noticia Text';
font-style: normal;
font-weight: 700;
src: local('Noticia Text Bold'), local('NoticiaText-Bold)'), url('/fonts/NoticiaText-Bold.woff') format('woff');
unicode-range: U+0100-024F, U+1E00-1EFF, U+20A0-20AB, U+20AD-20CF, U+2C60-2C7F, U+A720-A7FF;
}
/* latin */
@font-face {
font-family: 'Noticia Text';
font-style: normal;
font-weight: 700;
src: local('Noticia Text Bold'), local('NoticiaText-Bold)'), url('/fonts/NoticiaText-Bold.woff') format('woff');
unicode-range: U+0000-00FF, U+0131, U+0152-0153, U+02C6, U+02DA, U+02DC, U+2000-206F, U+2074, U+20AC, U+2212, U+2215, U+E0FF, U+EFFD, U+F000;
}
/* vietnamese */
@font-face {
font-family: 'Noticia Text';
font-style: italic;
font-weight: 400;
src: local('Noticia Text Italic'), local('NoticiaText-Italic)'), url('/fonts/NoticiaText-Italic.woff') format('woff');
unicode-range: U+0102-0103, U+1EA0-1EF1, U+20AB;
}
/* latin-ext */
@font-face {
font-family: 'Noticia Text';
font-style: italic;
font-weight: 400;
src: local('Noticia Text Italic'), local('NoticiaText-Italic)'), url('/fonts/NoticiaText-Italic.woff') format('woff');
unicode-range: U+0100-024F, U+1E00-1EFF, U+20A0-20AB, U+20AD-20CF, U+2C60-2C7F, U+A720-A7FF;
}
/* latin */
@font-face {
font-family: 'Noticia Text';
font-style: italic;
font-weight: 400;
src: local('Noticia Text Italic'), local('NoticiaText-Italic)'), url('/fonts/NoticiaText-Italic.woff') format('woff');
unicode-range: U+0000-00FF, U+0131, U+0152-0153, U+02C6, U+02DA, U+02DC, U+2000-206F, U+2074, U+20AC, U+2212, U+2215, U+E0FF, U+EFFD, U+F000;
}
/* vietnamese */
@font-face {
font-family: 'Noticia Text';
font-style: italic;
font-weight: 700;
src: local('Noticia Text Bold Italic'), local('NoticiaText-BoldItalic)'), url('/fonts/NoticiaText-BoldItalic.woff') format('woff');
unicode-range: U+0102-0103, U+1EA0-1EF1, U+20AB;
}
/* latin-ext */
@font-face {
font-family: 'Noticia Text';
font-style: italic;
font-weight: 700;
src: local('Noticia Text Bold Italic'), local('NoticiaText-BoldItalic)'), url('/fonts/NoticiaText-BoldItalic.woff') format('woff');
unicode-range: U+0100-024F, U+1E00-1EFF, U+20A0-20AB, U+20AD-20CF, U+2C60-2C7F, U+A720-A7FF;
}
/* latin */
@font-face {
font-family: 'Noticia Text';
font-style: italic;
font-weight: 700;
src: local('Noticia Text Bold Italic'), local('NoticiaText-BoldItalic)'), url('/fonts/NoticiaText-BoldItalic.woff') format('woff');
unicode-range: U+0000-00FF, U+0131, U+0152-0153, U+02C6, U+02DA, U+02DC, U+2000-206F, U+2074, U+20AC, U+2212, U+2215, U+E0FF, U+EFFD, U+F000;
}
@font-face {
font-family: 'Symbola';
src: local('Symbola'), url('/fonts/Symbola.woff') format('woff'), url('/fonts/Symbola.ttf') format('truetype');
}
body, rss {
font-family: "Noticia Text", Symbola, serif;
font-family: "Palatino Linotype", "Book Antiqua", Palatino, serif;
font-style: normal;
font-size: 14pt;
margin: 1em 3em;


@@ -8,110 +8,8 @@
@import url(https://fonts.googleapis.com/css?family=Noticia+Text:400,400italic,700italic,700&subset=latin,latin-ext); */
/* vietnamese */
@font-face {
font-family: 'Noticia Text';
font-style: normal;
font-weight: 400;
src: local('Noticia Text'), local('NoticiaText-Regular)'), url('/fonts/NoticiaText-Regular.woff') format('woff');
unicode-range: U+0102-0103, U+1EA0-1EF1, U+20AB;
}
/* latin-ext */
@font-face {
font-family: 'Noticia Text';
font-style: normal;
font-weight: 400;
src: local('Noticia Text'), local('NoticiaText-Regular)'), url('/fonts/NoticiaText-Regular.woff') format('woff');
unicode-range: U+0100-024F, U+1E00-1EFF, U+20A0-20AB, U+20AD-20CF, U+2C60-2C7F, U+A720-A7FF;
}
/* latin */
@font-face {
font-family: 'Noticia Text';
font-style: normal;
font-weight: 400;
src: local('Noticia Text'), local('NoticiaText-Regular)'), url('/fonts/NoticiaText-Regular.woff') format('woff');
unicode-range: U+0000-00FF, U+0131, U+0152-0153, U+02C6, U+02DA, U+02DC, U+2000-206F, U+2074, U+20AC, U+2212, U+2215, U+E0FF, U+EFFD, U+F000;
}
/* vietnamese */
@font-face {
font-family: 'Noticia Text';
font-style: normal;
font-weight: 700;
src: local('Noticia Text Bold'), local('NoticiaText-Bold)'), url('/fonts/NoticiaText-Bold.woff') format('woff');
unicode-range: U+0102-0103, U+1EA0-1EF1, U+20AB;
}
/* latin-ext */
@font-face {
font-family: 'Noticia Text';
font-style: normal;
font-weight: 700;
src: local('Noticia Text Bold'), local('NoticiaText-Bold)'), url('/fonts/NoticiaText-Bold.woff') format('woff');
unicode-range: U+0100-024F, U+1E00-1EFF, U+20A0-20AB, U+20AD-20CF, U+2C60-2C7F, U+A720-A7FF;
}
/* latin */
@font-face {
font-family: 'Noticia Text';
font-style: normal;
font-weight: 700;
src: local('Noticia Text Bold'), local('NoticiaText-Bold)'), url('/fonts/NoticiaText-Bold.woff') format('woff');
unicode-range: U+0000-00FF, U+0131, U+0152-0153, U+02C6, U+02DA, U+02DC, U+2000-206F, U+2074, U+20AC, U+2212, U+2215, U+E0FF, U+EFFD, U+F000;
}
/* vietnamese */
@font-face {
font-family: 'Noticia Text';
font-style: italic;
font-weight: 400;
src: local('Noticia Text Italic'), local('NoticiaText-Italic)'), url('/fonts/NoticiaText-Italic.woff') format('woff');
unicode-range: U+0102-0103, U+1EA0-1EF1, U+20AB;
}
/* latin-ext */
@font-face {
font-family: 'Noticia Text';
font-style: italic;
font-weight: 400;
src: local('Noticia Text Italic'), local('NoticiaText-Italic)'), url('/fonts/NoticiaText-Italic.woff') format('woff');
unicode-range: U+0100-024F, U+1E00-1EFF, U+20A0-20AB, U+20AD-20CF, U+2C60-2C7F, U+A720-A7FF;
}
/* latin */
@font-face {
font-family: 'Noticia Text';
font-style: italic;
font-weight: 400;
src: local('Noticia Text Italic'), local('NoticiaText-Italic)'), url('/fonts/NoticiaText-Italic.woff') format('woff');
unicode-range: U+0000-00FF, U+0131, U+0152-0153, U+02C6, U+02DA, U+02DC, U+2000-206F, U+2074, U+20AC, U+2212, U+2215, U+E0FF, U+EFFD, U+F000;
}
/* vietnamese */
@font-face {
font-family: 'Noticia Text';
font-style: italic;
font-weight: 700;
src: local('Noticia Text Bold Italic'), local('NoticiaText-BoldItalic)'), url('/fonts/NoticiaText-BoldItalic.woff') format('woff');
unicode-range: U+0102-0103, U+1EA0-1EF1, U+20AB;
}
/* latin-ext */
@font-face {
font-family: 'Noticia Text';
font-style: italic;
font-weight: 700;
src: local('Noticia Text Bold Italic'), local('NoticiaText-BoldItalic)'), url('/fonts/NoticiaText-BoldItalic.woff') format('woff');
unicode-range: U+0100-024F, U+1E00-1EFF, U+20A0-20AB, U+20AD-20CF, U+2C60-2C7F, U+A720-A7FF;
}
/* latin */
@font-face {
font-family: 'Noticia Text';
font-style: italic;
font-weight: 700;
src: local('Noticia Text Bold Italic'), local('NoticiaText-BoldItalic)'), url('/fonts/NoticiaText-BoldItalic.woff') format('woff');
unicode-range: U+0000-00FF, U+0131, U+0152-0153, U+02C6, U+02DA, U+02DC, U+2000-206F, U+2074, U+20AC, U+2212, U+2215, U+E0FF, U+EFFD, U+F000;
}
@font-face {
font-family: 'Symbola';
src: local('Symbola'), url('/fonts/Symbola.woff') format('woff');
}
body {
font-family: "Noticia Text", Symbola, serif;
font-family: "Palatino Linotype", "Book Antiqua", Palatino, serif;
font-size: 14pt;
color: #000;
background-color: #eed;
@@ -119,13 +17,13 @@ body {
}
textarea, pre, code, tt {
font-family: "Andale Mono", Monaco, "Courier New", Courier, monospace, Symbola;
font-size: 80%;
font-family: "Andale Mono", Monaco, "Courier New", Courier, monospace, Symbola;
font-size: 80%;
}
@media print {
body {
background-color: white;
background-color: white;
font-family: Times, serif;
font-size:10pt;
}
@@ -175,9 +73,9 @@ input#mail, input#homepage, input#username {
/* titles */
h1 {
font-weight: bold;
font-size: 150%;
padding: 1em 0;
font-weight: bold;
font-size: 150%;
padding: 1em 0;
}
h1 a:link, h1 a:visited {
color: inherit;
@@ -217,7 +115,7 @@ a:active {
border: 1px solid #9d8;
border-radius: 5px;
box-shadow: 0px 1px 3px white inset,
0px 1px 3px black;
0px 1px 3px black;
}
.button a {
text-decoration: none;
@@ -231,10 +129,6 @@ a:active {
font-weight: normal;
}
a.edit, div.footer, form, span.gotobar, a.number span { display:none; }
a[class="url number"]:after, a[class="inter number"]:after {
content:"[" attr(href) "]";
}
a[class="local number"]:after { content:"[" attr(title) "]"; }
img[smiley] { line-height: inherit; }
}
@@ -243,15 +137,15 @@ a.pencil { display: none; }
/* table of contents */
.toc {
font-size: smaller;
border-left: 1em solid #886;
font-size: smaller;
border-left: 1em solid #886;
}
.toc ol {
list-style-type: none;
padding-left: 1em;
list-style-type: none;
padding-left: 1em;
}
.toc a {
font-weight: normal;
font-weight: normal;
}
/* images with links, captions, etc */
@@ -314,19 +208,19 @@ div.rc li { padding-bottom: 0.5em; }
/* Tables */
table.user {
margin: 1em 0;
padding: 0 1em;
border-top: 1px solid black;
border-bottom: 1px solid black;
margin: 1em 0;
padding: 0 1em;
border-top: 1px solid black;
border-bottom: 1px solid black;
}
div.aside table.user {
margin: 1em 0;
padding: 0;
margin: 1em 0;
padding: 0;
}
table.user td, table.user th {
border-style: none;
padding:5px 10px;
vertical-align: top;
border-style: none;
padding:5px 10px;
vertical-align: top;
}
table.user th { font-weight:bold; }
table.user td.r { text-align:right; }
@@ -337,7 +231,7 @@ table.user td.mark { background-color:yellow; }
tr:empty { display: block; height: 0.5em; }
@media print {
table {
font-size: 9pt;
font-size: 9pt;
margin: 0;
}
table.user td, table.user th {


@@ -1,43 +0,0 @@
# Copyright (C) 2004, 2005 Fletcher T. Penney <fletcher@freeshell.org>
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the
# Free Software Foundation, Inc.
# 59 Temple Place, Suite 330
# Boston, MA 02111-1307 USA
use strict;
use v5.10;
AddModuleDescription('aawrapperdiv.pl', 'WrapperDiv Module');
our ($q);
*OldGetHeader = \&GetHeader;
*GetHeader = \&WrapperGetHeader;
sub WrapperGetHeader {
my ($id, $title, $oldId, $nocache, $status) = @_;
my $result = OldGetHeader ($id, $title, $oldId, $nocache, $status);
$result .= $q->start_div({-class=>'wrapper'});
}
*OldPrintFooter = \&PrintFooter;
*PrintFooter = \&WrapperPrintFooter;
sub WrapperPrintFooter {
my ($id, $rev, $comment) = @_;
print $q->start_div({-class=>'wrapper close'});
print $q->end_div(), $q->end_div();
OldPrintFooter($id, $rev, $comment);
}


@@ -22,7 +22,7 @@ our (@MyRules, $FreeLinkPattern);
push(@MyRules, \&LinksWithAccessKeys);
sub LinksWithAccessKeys {
if (m/\G(\[\[$FreeLinkPattern\{(.)\}\]\])/cog) {
if (m/\G(\[\[$FreeLinkPattern\{(.)\}\]\])/cg) {
my ($id, $key) = ($2, $3);
Dirty($1);
$id = FreeToNormal($id);
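Many hunks in this compare drop the /o modifier from patterns built from variables such as $FreeLinkPattern (the /gc to /cg reorderings are purely cosmetic). A minimal sketch of why /o is unsafe with interpolated patterns — the pattern is compiled once and then never recompiled, so a changed variable is silently ignored:

```perl
use strict;
use warnings;

# With /o, the first interpolation of $p is compiled and cached for the
# lifetime of this regexp op; later values of $p are ignored.
sub match_cached { my ($s, $p) = @_; return $s =~ /$p/o ? 1 : 0 }

# Without /o, the pattern is recompiled whenever $p changes.
sub match_fresh  { my ($s, $p) = @_; return $s =~ /$p/  ? 1 : 0 }

match_cached('abc', 'abc');             # compiles and caches 'abc'
print match_cached('abc', 'xyz'), "\n"; # 1: still matches cached 'abc'
print match_fresh('abc', 'xyz'), "\n";  # 0: recompiled, no match
```

This matters in Oddmuse because patterns like $FreeLinkPattern or $CommentsPattern can differ per wiki under mod_perl, where one compiled script serves several configurations.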


@@ -36,7 +36,7 @@ sub AdminPowerDelete {
OpenPage($id);
my $status = DeletePage($id);
if ($status) {
print $q->p(GetPageLink($id) . ' ' . T('not deleted: ')) . $status;
print $q->p(GetPageLink($id) . ' ' . T('not deleted:') . ' ' . $status);
} else {
print $q->p(GetPageLink($id) . ' ' . T('deleted'));
WriteRcLog($id, Ts('Deleted %s', $id), 0, $Page{revision},


@@ -26,7 +26,7 @@ our ($q, $bol, %Action, %Page, $OpenPageName, $UseDiff, $UsePathInfo, $RssStyleS
push(@MyRules, \&AggregateRule);
sub AggregateRule {
if ($bol && m/\G(&lt;aggregate\s+((("[^\"&]+",?\s*)+)|(sort\s+)?search\s+(.+?))&gt;)/gc) {
if ($bol && m/\G(&lt;aggregate\s+((("[^\"&]+",?\s*)+)|(sort\s+)?search\s+(.+?))&gt;)/cg) {
Clean(CloseHtmlEnvironments());
Dirty($1);
my ($oldpos, $old_, $str, $sort, $search) = ((pos), $_, $3, $5, $6);
@@ -126,8 +126,8 @@ sub DoAggregate {
}
}
foreach my $id (@pages) {
my %data = ParseData(ReadFileOrDie(GetPageFile(FreeToNormal($id))));
my $page = $data{text};
my $data = ParseData(ReadFileOrDie(GetPageFile(FreeToNormal($id))));
my $page = $data->{text};
my $size = length($page);
my $i = index($page, "\n=");
my $j = index($page, "\n----");
@@ -136,13 +136,13 @@ sub DoAggregate {
$page =~ s/^=.*\n//; # if it starts with a header
my $name = $id;
$name =~ s/_/ /g;
my $date = TimeToRFC822($data{ts});
my $host = $data{host};
my $username = $data{username};
my $date = TimeToRFC822($data->{ts});
my $host = $data->{host};
my $username = $data->{username};
$username = QuoteHtml($username);
$username = $host unless $username;
my $minor = $data{minor};
my $revision = $data{revision};
my $minor = $data->{minor};
my $revision = $data->{revision};
my $cluster = GetCluster($page);
my $description = ToString(sub { ApplyRules(QuoteHtml($page), 1, 0, undef, 'p') });
$description .= $q->p(GetPageLink($id, T('Learn more...')))
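The aggregate.pl hunk above tracks a core API change: ParseData now returns a hash reference instead of a flat hash, so callers switch from $data{text} to $data->{text}. A minimal sketch of the calling convention — a hypothetical stand-in, not Oddmuse's actual parser:

```perl
use strict;
use warnings;

# Hypothetical stand-in for Oddmuse's ParseData: parse "key: value"
# lines and return a hash reference, so callers dereference with ->{}
# instead of copying everything into a plain %data hash.
sub ParseData {
  my ($raw) = @_;
  my %result;
  for my $line (split /\n/, $raw) {
    $result{$1} = $2 if $line =~ /^([a-z]+): (.*)$/;
  }
  return \%result;    # hashref, matching the new calling convention
}

my $data = ParseData("text: Hello\nusername: alex\nrevision: 7");
print $data->{text}, "\n";       # Hello
print $data->{revision}, "\n";   # 7
```

Returning a reference avoids copying the whole page (including its text) on every call, which matters when DoAggregate loops over many pages.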


@@ -26,11 +26,11 @@ AddModuleDescription('agree-disagree.pl', 'AgreeDisagreePlugin');
push(@MyRules, \&AgreeDisagreeSupportRule);
push(@MyMacros, sub{ s/\[\+\]/"[+:" . GetParam('username', T('Anonymous'))
. ':' . TimeToText($Now) . "]"/ge });
push(@MyMacros, sub{ s/\[\+(:[^]:]+)\]/"[+$1:" . TimeToText($Now) . "]"/ge });
. ':' . TimeToText($Now) . "]"/eg });
push(@MyMacros, sub{ s/\[\+(:[^]:]+)\]/"[+$1:" . TimeToText($Now) . "]"/eg });
push(@MyMacros, sub{ s/\[\-\]/"[-:" . GetParam('username', T('Anonymous'))
. ':' . TimeToText($Now) . "]"/ge });
push(@MyMacros, sub{ s/\[\-(:[^]:]+)\]/"[-$1:" . TimeToText($Now) . "]"/ge });
. ':' . TimeToText($Now) . "]"/eg });
push(@MyMacros, sub{ s/\[\-(:[^]:]+)\]/"[-$1:" . TimeToText($Now) . "]"/eg });
$DefaultStyleSheet .= <<'EOT' unless $DefaultStyleSheet =~ /div\.agree/; # mod_perl?
@@ -78,17 +78,17 @@ EOT
sub AgreeDisagreeSupportRule {
if ($bol) {
if ($bol && m/(\G(\s*\[\+(.*?)\]|\s*\[-(.*?)\])+)/gcs) {
if ($bol && m/(\G(\s*\[\+(.*?)\]|\s*\[-(.*?)\])+)/cgs) {
my $votes = $1;
my @ayes = ();
my @nayes = ();
while ($votes =~ m/\G.*?\[\+(.*?)\]/gcs) {
while ($votes =~ m/\G.*?\[\+(.*?)\]/cgs) {
my ($ignore, $name, $time) = split(/:/, $1, 3);
push(@ayes, $name);
}
my $votes2 = $votes;
while ($votes2 =~ m/\G.*?\[-(.*?)\]/gcs) {
while ($votes2 =~ m/\G.*?\[-(.*?)\]/cgs) {
my ($ignore, $name, $time) = split(/:/, $1, 3);
push(@nayes, $name);
}


@@ -21,13 +21,13 @@ our ($q, %Page, $FootnoteNumber, $FreeLinkPattern, @MyRules, $BracketWiki);
push(@MyRules, \&AnchorsRule);
sub AnchorsRule {
if (m/\G\[\[\#$FreeLinkPattern\]\]/gc) {
if (m/\G\[\[\#$FreeLinkPattern\]\]/cg) {
return $q->a({-href=>'#' . FreeToNormal($1), -class=>'local anchor'}, $1);
} elsif ($BracketWiki && m/\G\[\[\#$FreeLinkPattern\|([^\]]+)\]\]/gc) {
} elsif ($BracketWiki && m/\G\[\[\#$FreeLinkPattern\|([^\]]+)\]\]/cg) {
return $q->a({-href=>'#' . FreeToNormal($1), -class=>'local anchor'}, $2);
} elsif ($BracketWiki && m/\G(\[\[$FreeLinkPattern\#$FreeLinkPattern\|([^\]]+)\]\])/cog
or m/\G(\[\[\[$FreeLinkPattern\#$FreeLinkPattern\]\]\])/cog
or m/\G(\[\[$FreeLinkPattern\#$FreeLinkPattern\]\])/cog) {
} elsif ($BracketWiki && m/\G(\[\[$FreeLinkPattern\#$FreeLinkPattern\|([^\]]+)\]\])/cg
or m/\G(\[\[\[$FreeLinkPattern\#$FreeLinkPattern\]\]\])/cg
or m/\G(\[\[$FreeLinkPattern\#$FreeLinkPattern\]\])/cg) {
# This one is not a dirty rule because the output is always a page
# link, never an edit link (unlike normal free links).
my $bracket = (substr($1, 0, 3) eq '[[[');
@@ -47,7 +47,7 @@ sub AnchorsRule {
$text = $id unless $text;
$text =~ s/_/ /g;
return ScriptLink(UrlEncode($id), $text, $class, undef, $title);
} elsif (m/\G\[\:$FreeLinkPattern\]/gc) {
} elsif (m/\G\[\:$FreeLinkPattern\]/cg) {
return $q->a({-name=>FreeToNormal($1), -class=>'anchor'}, '');
}
return;


@@ -34,7 +34,7 @@ push(@MyRules, \&MaskEmailRule);
sub MaskEmailRule {
# Allow [email@foo.bar Email Me] links
if (m/\G\[($EmailRegExp(\s\w+)*\s*)\]/igc) {
if (m/\G\[($EmailRegExp(\s\w+)*\s*)\]/cgi) {
my $chunk = $1;
$chunk =~ s/($EmailRegExp)//i;
my $email = $1;
@@ -51,7 +51,7 @@ sub MaskEmailRule {
return "<a href=\"mailto:$email\">$chunk</a>";
}
if (m/\G($EmailRegExp)/igc) {
if (m/\G($EmailRegExp)/cgi) {
my $email = $1;
if ($DoMaskEmail) {
my $masked="";


@@ -20,7 +20,7 @@ AddModuleDescription('askpage.pl', 'Ask Page Extension');
use Fcntl qw(:DEFAULT :flock);
our ($DataDir);
our ($DataDir, %Translate, @MyFooters);
our ($AskPage, $QuestionPage, $NewQuestion);
# Don't forget to set your $CommentsPattern to include both $AskPage and $QuestionPage
$AskPage = 'Ask';
@@ -39,8 +39,8 @@ sub IncrementInFile {
return $num;
}
*OldAskPageDoPost=\&DoPost;
*DoPost=\&NewAskPageDoPost;
*OldAskPageDoPost = \&DoPost;
*DoPost = \&NewAskPageDoPost;
sub NewAskPageDoPost {
my $id = FreeToNormal(shift);
if ($id eq $AskPage and not GetParam('text', undef)) { # comment, not a regular edit
@@ -51,18 +51,18 @@ sub NewAskPageDoPost {
OldAskPageDoPost($id, @_); # keep original functionality for regular edits
}
*OldAskPageGetTextArea=\&GetTextArea;
*GetTextArea=\&NewAskPageGetTextArea;
sub NewAskPageGetTextArea {
my ($name, $text, @rest) = @_;
if ($name eq 'aftertext' and not $text and GetId() eq $AskPage) {
$text = $NewQuestion;
}
OldAskPageGetTextArea($name, $text, @rest);
*OldAskPageGetCommentForm = \&GetCommentForm;
*GetCommentForm = \&NewAskPageGetCommentForm;
@MyFooters = map { $_ == \&OldAskPageGetCommentForm ? \&NewAskPageGetCommentForm : $_ } @MyFooters;
sub NewAskPageGetCommentForm {
my ($id) = @_;
$Translate{'Add your comment here:'} = $NewQuestion if $id eq $AskPage;
OldAskPageGetCommentForm(@_);
}
*OldAskPageJournalSort=\&JournalSort;
*JournalSort=\&NewAskPageJournalSort;
*OldAskPageJournalSort = \&JournalSort;
*JournalSort = \&NewAskPageJournalSort;
sub NewAskPageJournalSort {
return OldAskPageJournalSort() unless $a =~ m/^$QuestionPage\d+$/ and $b =~ m/^$QuestionPage\d+$/;
($b =~ m/$QuestionPage(\d+)/)[0] <=> ($a =~ m/$QuestionPage(\d+)/)[0];


@@ -16,6 +16,7 @@
use strict;
use v5.10;
use XML::Atom;
use XML::Atom::Entry;
use XML::Atom::Link;
use XML::Atom::Person;
@@ -149,7 +150,7 @@ sub GetRcAtom {
# Based on DoPost
sub DoAtomSave {
my ($type, $oldid) = @_;
my $entry = AtomEntry();
my $entry = AtomEntry($type);
my $title = $entry->title();
my $author = $entry->author();
SetParam('username', $author->name) if $author; # Used in Save()
@@ -230,15 +231,8 @@ sub DoAtomGet {
}
sub AtomEntry {
my $data = $q->param('POSTDATA');
if (not $data) {
# CGI provides POSTDATA for POST requests, not for PUT requests.
# The following code is based on the CGI->init code.
my $content_length = defined($ENV{'CONTENT_LENGTH'}) ? $ENV{'CONTENT_LENGTH'} : 0;
if ($content_length > 0 and $content_length < $MaxPost) {
$q->read_from_client(\$data, $content_length, 0);
}
}
my $type = shift || 'POST';
my $data = $q->param($type . 'DATA'); # PUTDATA or POSTDATA
my $entry = XML::Atom::Entry->new(\$data);
return $entry;
}


@@ -175,7 +175,7 @@ sub UserCanEditAutoLockFix {
return 0 if $LockOnCreation{$id} and not -f GetPageFile($id); # new page
return 0 if !$EditAllowed or -f $NoEditFile;
return 0 if $editing and UserIsBanned(); # this call is more expensive
return 0 if $EditAllowed >= 2 and (not $CommentsPrefix or $id !~ /^$CommentsPrefix/o);
return 0 if $EditAllowed >= 2 and (not $CommentsPrefix or $id !~ /^$CommentsPrefix/);
return 1 if $EditAllowed >= 3 and ($comment or (GetParam('aftertext', '') and not GetParam('text', '')));
return 0 if $EditAllowed >= 3;
return 1;


@@ -117,7 +117,7 @@ sub GetBackLink {
foreach my $backlink (@backlinks) {
my ($class, $resolved, $title, $exists) = ResolveId($backlink);
if (($resolved ne $id) && ($resolved !~ /^($BacklinkBanned)$/)) {
push(@unpopped, ScriptLink(UrlEncode($resolved), $resolved, $class . ' backlink', undef, T('Internal Page: ' . $resolved)));
push(@unpopped, ScriptLink(UrlEncode($resolved), $resolved, $class . ' backlink', undef, Ts('Internal Page: %s', $resolved)));
}
}


@@ -124,10 +124,10 @@ sub NewBanContributorsWriteRcLog {
and $OpenPageName eq $id and UserIsAdmin()) {
# we currently have the clean page loaded, so we need to reload
# the spammed revision (there is a possible race condition here)
my ($old) = GetTextRevision($Page{revision}-1, 1);
my %urls = map {$_ => 1 } $old =~ /$UrlPattern/og;
my $old = GetTextRevision($Page{revision} - 1, 1)->{text};
my %urls = map {$_ => 1 } $old =~ /$UrlPattern/g;
# we open the file again to force a load of the despammed page
foreach my $url ($Page{text} =~ /$UrlPattern/og) {
foreach my $url ($Page{text} =~ /$UrlPattern/g) {
delete($urls{$url});
}
# we also remove any candidates that are already banned
@@ -153,7 +153,7 @@ sub NewBanContributorsWriteRcLog {
$q->submit(T('Ban!'))),
$q->end_form();
};
print $q->p(T("Consider banning the IP number as well: "),
print $q->p(T("Consider banning the IP number as well:"), ' ',
ScriptLink('action=ban;id=' . UrlEncode($id), T('Ban contributors')));
};
return OldBanContributorsWriteRcLog(@_);


@@ -61,12 +61,12 @@ sub bbCodeRule {
return AddHtmlEnvironment('strong', qq{class="highlight"}); }
elsif ($tag eq 'url') {
if ($option) {
$option =~ /^($UrlProtocols)/o;
$option =~ /^($UrlProtocols)/;
my $class = "url $1";
return AddHtmlEnvironment('a', qq{href="$option" class="$class"}); }
elsif (/\G$FullUrlPattern\s*\[\/url\]/cogi) {
elsif (/\G$FullUrlPattern\s*\[\/url\]/cgi) {
return GetUrl($1); }}
elsif ($tag eq 'img' and /\G$FullUrlPattern\s*\[\/img\]/cogi) {
elsif ($tag eq 'img' and /\G$FullUrlPattern\s*\[\/img\]/cgi) {
return GetUrl($1, undef, undef, 1); } # force image
elsif ($tag eq 'quote') {
my $html = CloseHtmlEnvironments();


@@ -60,7 +60,6 @@ sub AddRecentVisitor {
my $url = ScriptUrl(join(';', "action=$action;id=" . UrlEncode($id),
map { $_ . '=' . UrlEncode(GetParam($_)) }
keys %params));
my $url = $q->url(-path_info=>1,-query=>1);
my $download = GetParam('action', 'browse') eq 'download'
|| GetParam('download', 0)
|| $q->path_info() =~ m/\/download\//;


@@ -28,10 +28,10 @@ push(@MyRules, \&BlockQuoteRule);
sub BlockQuoteRule {
# indented text using : with the option of spanning multiple text
# paragraphs (but not lists etc).
if (InElement('blockquote') && m/\G(\s*\n)+:[ \t]*/cog) {
if (InElement('blockquote') && m/\G(\s*\n)+:[ \t]*/cg) {
return CloseHtmlEnvironmentUntil('blockquote')
. AddHtmlEnvironment('p');
} elsif ($bol && m/\G(\s*\n)*:[ \t]*/cog) {
} elsif ($bol && m/\G(\s*\n)*:[ \t]*/cg) {
return CloseHtmlEnvironments()
. AddHtmlEnvironment('blockquote')
. AddHtmlEnvironment('p');


@@ -74,7 +74,7 @@ sub Cal {
$link .= ScriptLink('action=collect;match=' . UrlEncode($re), $day, 'local collection' . $class);
}
$link;
}}ge;
}}eg;
$cal =~ s{(\S+) (\d\d\d\d)}{{
my ($month_text, $year_text) = ($1, $2);
my $date = sprintf("%d-%02d", $year, $mon);
@@ -118,22 +118,22 @@ sub DoCollect {
push(@MyRules, \&CalendarRule);
sub CalendarRule {
if (/\G(calendar:(\d\d\d\d))/gc) {
if (/\G(calendar:(\d\d\d\d))/cg) {
my $oldpos = pos;
Clean(CloseHtmlEnvironments());
Dirty($1);
PrintYearCalendar($2);
pos = $oldpos;
return AddHtmlEnvironment('p');
} elsif (/\G(month:(\d\d\d\d)-(\d\d))/gc) {
} elsif (/\G(month:(\d\d\d\d)-(\d\d))/cg) {
my $oldpos = pos;
Clean(CloseHtmlEnvironments());
Dirty($1);
print Cal($2, $3);
pos = $oldpos;
return AddHtmlEnvironment('p');
} elsif (/\G(month:([+-]\d\d?))/gc
or /\G(\[\[month:([+-]\d\d?) $FreeLinkPattern\]\])/gc) {
} elsif (/\G(month:([+-]\d\d?))/cg
or /\G(\[\[month:([+-]\d\d?) $FreeLinkPattern\]\])/cg) {
my $oldpos = pos;
Clean(CloseHtmlEnvironments());
Dirty($1);


@@ -77,7 +77,7 @@ sub DoCheckBox{
$summary{$3} = 0 if $2 eq 'x' or $2 eq 'X';
"${1}[[ :${3}]]";
}
}eig;
}egi;
SetParam('text', $text);
SetParam('summary', join(', ', map {
if ($summary{$_}) {


@@ -62,7 +62,7 @@ foreach (@ClusterMapAdminPages){
}
sub ClusterMapRule {
if (/\G^([\n\r]*\<\s*clustermap\s*\>\s*)$/mgc) {
if (/\G^([\n\r]*\<\s*clustermap\s*\>\s*)$/cgm) {
Dirty($1);
my $oldpos = pos;
my $oldstr = $_;


@@ -30,7 +30,7 @@ sub CommentDivWrapper {
return $q->start_div({-class=>'userComment'});
}
}
if ($OpenPageName =~ /$CommentsPattern/o) {
if ($OpenPageName =~ /$CommentsPattern/) {
if ($bol and m/\G(\s*\n)*----+[ \t]*\n?/cg) {
my $html = CloseHtmlEnvironments()
. ($CommentDiv++ > 0 ? $q->end_div() : $q->h2({-class=>'commentsHeading'}, T('Comments:'))) . $q->start_div({-class=>'userComment'})


@@ -52,15 +52,15 @@ sub NewCommentcountScriptLink {
if ($CommentsPrefix && $action =~ /^$CommentsPrefix(.*)/) { # TODO use $CommentsPattern ?
# Add the number of comments here
my $id = $action;
$id =~ s/%([0-9a-f][0-9a-f])/chr(hex($1))/ge; # undo urlencode
$id =~ s/%([0-9a-f][0-9a-f])/chr(hex($1))/eg; # undo urlencode
my $comments = GetPageContent($id);
my $num = 0;
if($comments =~ /=== (\d+) Comments?\. ===/) {
$num = $1;
}
# Fix plurals
my $plural = T('Comments on ');
my $singular = T('Comment on ');
my $plural = T('Comments on');
my $singular = T('Comment on');
$text =~ s/$plural/$singular/ if($num == 1);
$text = $num . ' ' . $text;
}


@@ -218,7 +218,7 @@ sub CreoleRule {
}
# escape next char (and prevent // in URLs from enabling italics)
# ~
elsif (m/\G(~($FullUrlPattern|\S))/cgo) {
elsif (m/\G(~($FullUrlPattern|\S))/cg) {
return
($CreoleTildeAlternative and
index( 'ABCDEFGHIJKLMNOPQRSTUVWXYZ'
@@ -234,12 +234,12 @@ sub CreoleRule {
# {{{preformatted code}}}
elsif (m/\G\{\{\{(.*?}*)\}\}\}/cg) { return $q->code($1); }
# download: {{pic}} and {{pic|text}}
elsif (m/\G(\{\{$FreeLinkPattern$CreoleLinkTextPattern\}\})/cgos) {
elsif (m/\G(\{\{$FreeLinkPattern$CreoleLinkTextPattern\}\})/cgs) {
my $text = $4 || $2;
return GetCreoleLinkHtml($1, GetDownloadLink(FreeToNormal($2), 1, undef, $text), $text);
}
# image link: {{url}} and {{url|text}}
elsif (m/\G\{\{$FullUrlPattern$CreoleLinkTextPattern\}\}/cgos) {
elsif (m/\G\{\{$FullUrlPattern$CreoleLinkTextPattern\}\}/cgs) {
return GetCreoleImageHtml(
$q->a({-href=> UnquoteHtml($1),
-class=> 'image outside'},
@@ -250,7 +250,7 @@ sub CreoleRule {
}
# image link: [[link|{{pic}}]] and [[link|{{pic|text}}]]
elsif (m/\G(\[\[$FreeLinkPattern$CreoleLinkPipePattern
\{\{$FreeLinkPattern$CreoleLinkTextPattern\}\}\]\])/cgosx) {
\{\{$FreeLinkPattern$CreoleLinkTextPattern\}\}\]\])/cgsx) {
my $text = $5 || $2;
return GetCreoleLinkHtml($1, GetCreoleImageHtml(
ScriptLink(UrlEncode(FreeToNormal($2)),
@@ -261,7 +261,7 @@ sub CreoleRule {
}
# image link: [[link|{{url}}]] and [[link|{{url|text}}]]
elsif (m/\G(\[\[$FreeLinkPattern$CreoleLinkPipePattern
\{\{$FullUrlPattern$CreoleLinkTextPattern\}\}\]\])/cgosx) {
\{\{$FullUrlPattern$CreoleLinkTextPattern\}\}\]\])/cgsx) {
my $text = $5 || $2;
return GetCreoleLinkHtml($1, GetCreoleImageHtml(
ScriptLink(UrlEncode(FreeToNormal($2)),
@@ -272,7 +272,7 @@ sub CreoleRule {
}
# image link: [[url|{{pic}}]] and [[url|{{pic|text}}]]
elsif (m/\G(\[\[$FullUrlPattern$CreoleLinkPipePattern
\{\{$FreeLinkPattern$CreoleLinkTextPattern\}\}\]\])/cgosx) {
\{\{$FreeLinkPattern$CreoleLinkTextPattern\}\}\]\])/cgsx) {
my $text = $5 || $2;
return GetCreoleLinkHtml($1, GetCreoleImageHtml(
$q->a({-href=> UnquoteHtml($2), -class=> 'image outside'},
@@ -283,7 +283,7 @@ sub CreoleRule {
}
# image link: [[url|{{url}}]] and [[url|{{url|text}}]]
elsif (m/\G\[\[$FullUrlPattern$CreoleLinkPipePattern
\{\{$FullUrlPattern$CreoleLinkTextPattern\}\}\]\]/cgosx) {
\{\{$FullUrlPattern$CreoleLinkTextPattern\}\}\]\]/cgsx) {
return GetCreoleImageHtml(
$q->a({-href=> UnquoteHtml($1), -class=> 'image outside'},
$q->img({-src=> UnquoteHtml($2),
@@ -292,7 +292,7 @@ sub CreoleRule {
-class=> 'url outside'})));
}
# link: [[url]] and [[url|text]]
elsif (m/\G\[\[$FullUrlPattern$CreoleLinkTextPattern\]\]/cgos) {
elsif (m/\G\[\[$FullUrlPattern$CreoleLinkTextPattern\]\]/cgs) {
# Permit embedding of Creole syntax within link text. (Rather complicated,
# but it does the job remarkably.)
my $link_url = $1;
@@ -305,7 +305,7 @@ sub CreoleRule {
return GetUrl($link_url, $link_text, 1);
}
# link: [[page]] and [[page|text]]
elsif (m/\G(\[\[$FreeLinkPattern$CreoleLinkTextPattern\]\])/cgos) {
elsif (m/\G(\[\[$FreeLinkPattern$CreoleLinkTextPattern\]\])/cgs) {
my $markup = $1;
my $page_name = $2;
my $link_text = $4 ? CreoleRuleRecursive($4, @_) : $page_name;
@@ -315,7 +315,7 @@ sub CreoleRule {
}
# interlink: [[Wiki:page]] and [[Wiki:page|text]]
elsif ($is_interlinking and
m/\G(\[\[$FreeInterLinkPattern$CreoleLinkTextPattern\]\])/cgos) {
m/\G(\[\[$FreeInterLinkPattern$CreoleLinkTextPattern\]\])/cgs) {
my $markup = $1;
my $interlink = $2;
my $interlink_text = $4;

View File
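A recurring change in the hunks above drops Perl's `/o` modifier from match operators whose patterns interpolate variables such as `$FullUrlPattern` or `$FreeLinkPattern`. `/o` compiles the pattern once and caches the first interpolated value, so later changes to those variables are silently ignored. A minimal sketch of the pitfall (illustrative code, not Oddmuse's):

```perl
use strict;
use warnings;

sub count_matches_o {
    my ($re, $text) = @_;
    my $n = 0;
    $n++ while $text =~ m/$re/go;   # /o: $re is frozen at its first value
    return $n;
}

sub count_matches {
    my ($re, $text) = @_;
    my $n = 0;
    $n++ while $text =~ m/$re/g;    # pattern re-checked on every execution
    return $n;
}

# After count_matches_o('a', ...) runs once, the /o variant keeps matching
# against the stale 'a' pattern even when called with 'b'.
```

Modern perls recompile an interpolated pattern only when the variable's value actually changed, so dropping `/o` costs little while restoring correct behavior when the patterns are reconfigured at runtime.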

@@ -28,8 +28,8 @@ $RuleOrder{\&CrumbsRule} = -10; # run before default rules!
sub CrumbsRule {
if (not (pos) # first!
and (($WikiLinks && /\G($LinkPattern\n)/cgo)
or ($FreeLinks && /\G(\[\[$FreeLinkPattern\]\]\n)/cgo))) {
and (($WikiLinks && /\G($LinkPattern\n)/cg)
or ($FreeLinks && /\G(\[\[$FreeLinkPattern\]\]\n)/cg))) {
my $oldpos = pos; # will be trashed below
my $cluster = FreeToNormal($2);
my %seen = ($cluster => 1);

View File

@@ -133,14 +133,14 @@ sub DespamPage {
# from DoHistory()
my @revisions = sort {$b <=> $a} map { m|/([0-9]+).kp$|; $1; } GetKeepFiles($OpenPageName);
foreach my $revision (@revisions) {
my ($text, $rev) = GetTextRevision($revision, 1); # quiet
my ($revisionPage, $rev) = GetTextRevision($revision, 1); # quiet
if (not $rev) {
print ': ' . Ts('Cannot find revision %s.', $revision);
return;
} elsif (not DespamBannedContent($text)) {
} elsif (not DespamBannedContent($revisionPage->{text})) {
my $summary = Tss('Revert to revision %1: %2', $revision, $rule);
print ': ' . $summary;
Save($OpenPageName, $text, $summary) unless GetParam('debug', 0);
Save($OpenPageName, $revisionPage->{text}, $summary) unless GetParam('debug', 0);
return;
}
}

View File

@@ -30,7 +30,7 @@ $DojoTheme = 'tundra';
push (@MyRules, \&WysiwygRule);
sub WysiwygRule {
if (m/\G(&lt;.*?&gt;)/gc) {
if (m/\G(&lt;.*?&gt;)/cg) {
return $1 if substr($1,5,6) eq 'script'
or substr($1,4,6) eq 'script';
return UnquoteHtml($1);

View File
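Many of these hunks only reorder modifier letters (`/gc` → `/cg`, `/mgci` → `/cgim`, `/ogci` → `/cgi`, and so on). Regex modifiers form a set, not a sequence, so the order is meaningless to Perl; these commits normalize style without changing behavior. A quick illustration (illustrative strings, not Oddmuse code):

```perl
use strict;
use warnings;

my $x = 'Foo BAR foo';
(my $y = $x) =~ s/foo/baz/ig;   # flags written one way
(my $z = $x) =~ s/foo/baz/gi;   # same flags, reordered
# $y and $z are both 'baz BAR baz'
```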

@@ -28,8 +28,8 @@ push( @MyRules, \&DownloadSupportRule );
# [[download:page name|alternate title]]
sub DownloadSupportRule {
if (m/\G(\[\[download:$FreeLinkPattern\|([^\]]+)\]\])/cog
or m!\G(\[\[download:$FreeLinkPattern\]\])!cog) {
if (m/\G(\[\[download:$FreeLinkPattern\|([^\]]+)\]\])/cg
or m!\G(\[\[download:$FreeLinkPattern\]\])!cg) {
Dirty($1);
print GetDownloadLink($2, undef, undef, $3);
return '';

View File

@@ -48,10 +48,10 @@ sub DoDraft {
SetParam('msg', T('Draft saved')); # invalidate cache
print GetHttpHeader('', T('Draft saved'), '204 NO CONTENT');
} elsif (-f $draft) {
my %data = ParseData(ReadFileOrDie($draft));
my $data = ParseData(ReadFileOrDie($draft));
unlink ($draft);
$Message .= $q->p(T('Draft recovered'));
DoEdit($data{id}, $data{text}, 1);
DoEdit($data->{id}, $data->{text}, 1);
} else {
ReportError(T('No draft available to recover'), '404 NOT FOUND');
}

View File
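Several hunks above switch callers of `ParseData` from `my %data = ParseData(...)` with `$data{key}` to `my $data = ParseData(...)` with `$data->{key}`: the function now returns a hash reference instead of a flat key/value list. A sketch of the two calling conventions, using a hypothetical stand-in parser (the real `ParseData` splits records on Oddmuse's `$FS` separator, not on `": "`):

```perl
use strict;
use warnings;

# Hypothetical stand-in: "key: value" lines, returned as a hash reference.
sub ParseDataSketch {
    my ($string) = @_;
    my %result = map { split(/: /, $_, 2) } split(/\n/, $string);
    return \%result;            # reference, not the flat hash
}

my $data = ParseDataSketch("id: HomePage\ntext: Hello");
# old style (flat hash):      my %data = ...;  $data{id};   EncodePage(%data)
# new style (hash reference): my $data = ...;  $data->{id}; EncodePage(%$data)
```

Returning a reference avoids copying the whole hash at every call and lets "no data" be a plain `undef`; it also explains the paired `EncodePage(%page)` → `EncodePage(%$page)` changes, which dereference back to a flat list at the call site.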

@@ -18,7 +18,7 @@ use v5.10;
AddModuleDescription('edit-cluster.pl', 'Edit Cluster Extension');
our ($q, $FS, $RcDefault, @RcDays, $RecentTop, $LastUpdate);
our ($q, $FS, $RcDefault, @RcDays, $RecentTop, $LastUpdate, $ShowAll);
our $EditCluster = 'EditCluster';
@@ -34,7 +34,7 @@ sub GetRc {
$changetime{$pagename} = $ts;
}
my $date = '';
my $all = GetParam('all', 0);
my $all = GetParam('all', $ShowAll);
my ($idOnly, $userOnly, $hostOnly, $clusterOnly, $filterOnly, $match, $lang) =
map { GetParam($_, ''); }
('rcidonly', 'rcuseronly', 'rchostonly', 'rcclusteronly',
@@ -128,7 +128,7 @@ sub EditClusterNewRcHeader {
$action = "action=rc$action";
}
my $days = GetParam('days', $RcDefault);
my $all = GetParam('all', 0);
my $all = GetParam('all', $ShowAll);
my @menu;
if ($all) {
push(@menu, ScriptLink("$action;days=$days;all=0",

View File

@@ -21,14 +21,14 @@ push(@MyRules, \&EmailQuoteRule);
sub EmailQuoteRule {
# > on a line of its own should work
if ($bol && m/\G(\s*\n)*((\&gt;))+\n/cog) {
if ($bol && m/\G(\s*\n)*((\&gt;))+\n/cg) {
return $q->p();
}
# > hi, you mentioned that:
# >> I don't like Oddmuse.
# > in last letter.
elsif ($bol && m/\G(\s*\n)*((\&gt;)+)[ \t]/cog
or InElement('dd') && m/\G(\s*\n)+((\&gt;)+)[ \t]/cog) {
elsif ($bol && m/\G(\s*\n)*((\&gt;)+)[ \t]/cg
or InElement('dd') && m/\G(\s*\n)+((\&gt;)+)[ \t]/cg) {
my $leng = length($2) / 4;
return CloseHtmlEnvironmentUntil('dd') . OpenHtmlEnvironment('dl',$leng, 'quote')
. $q->dt() . AddHtmlEnvironment('dd');

View File

@@ -28,7 +28,7 @@ push(@MyRules, \&EnclosureRule);
# [[enclosure:url|size in bytes|mime type]]
sub EnclosureRule {
if (m!\G\[\[enclosure:\s*$FreeLinkPattern(\|([^\]]+))?\]\]!ogci) {
if (m!\G\[\[enclosure:\s*$FreeLinkPattern(\|([^\]]+))?\]\]!cgi) {
my $id = FreeToNormal($1);
# Make sure we don't add duplicates; we will add non-existing
# enclosures as well. We test for existence only when the RSS feed
@@ -56,8 +56,8 @@ sub NewEnclosureRssItem {
my $id = shift;
my $rss = OldEnclosureRssItem($id, @_);
require MIME::Base64;
my %data = ParseData(ReadFileOrDie(GetPageFile($id)));
my @enclosures = split(' ', $data{enclosures});
my $data = ParseData(ReadFileOrDie(GetPageFile($id)));
my @enclosures = split(' ', $data->{enclosures});
my $enclosures = '';
foreach my $enclosure (@enclosures) {
# Don't add the enclosure if the page has been deleted in the mean

View File

@@ -38,10 +38,10 @@ $FaqAnswerText = "Answer: " unless $FaqAnswerText;
push(@MyRules, \&FaqRule);
sub FaqRule {
if ($bol && m/\GQ: (.+)/gc) {
if ($bol && m/\GQ: (.+)/cg) {
return $q->a({name=>'FAQ_' . UrlEncode($1)},'')
. $q->div({class=>'question'}, $FaqQuestionText . $1);
} elsif ($bol && m/\GA:[ \t]*/gc) {
} elsif ($bol && m/\GA:[ \t]*/cg) {
return CloseHtmlEnvironments()
. AddHtmlEnvironment('div', "class='answer'") . $FaqAnswerText;
}

View File

@@ -29,7 +29,7 @@ $FCKeditorHeight = 400; # Pixel
push (@MyRules, \&WysiwygRule);
sub WysiwygRule {
if (m/\G(&lt;.*?&gt;)/gc) {
if (m/\G(&lt;.*?&gt;)/cg) {
return $1 if substr($1,5,6) eq 'script'
or substr($1,4,6) eq 'script';
return UnquoteHtml($1);

View File

@@ -66,7 +66,7 @@ $RuleOrder{\&FlickrGalleryRule} = -10;
sub FlickrGalleryRule {
# This code is used when Markdown is not available
if (/\G^([\n\r]*\&lt;\s*FlickrSet:\s*(\d+)\s*\&gt;\s*)$/mgci) {
if (/\G^([\n\r]*\&lt;\s*FlickrSet:\s*(\d+)\s*\&gt;\s*)$/cgim) {
my $oldpos = pos;
my $oldstr = $_;
@@ -79,7 +79,7 @@ sub FlickrGalleryRule {
return '';
}
if (/\G^([\n\r]*\&lt;\s*FlickrPhoto:\s*(\d+)\s*([a-z0-9]*?)\s*($size)?\s*\&gt;\s*)$/mgci) {
if (/\G^([\n\r]*\&lt;\s*FlickrPhoto:\s*(\d+)\s*([a-z0-9]*?)\s*($size)?\s*\&gt;\s*)$/cgim) {
my $oldpos = pos;
my $oldstr = $_;
@@ -103,13 +103,13 @@ sub MarkdownFlickrGalleryRule {
^&lt;FlickrSet:\s*(\d+)\s*\>
}{
FlickrGallery($1);
}xmgei;
}egimx;
$text =~ s{
^&lt;FlickrPhoto:\s*(\d+)\s*([a-z0-9]*?)\s*($size)?\s*\>
}{
GetFlickrPhoto($1,$2,$3);
}xmgei;
}egimx;
return $text
}
@@ -135,7 +135,7 @@ sub FlickrGallery {
$result = $FlickrHeaderTemplate;
$result =~ s/(\$[a-zA-Z\d]+)/"defined $1 ? $1 : ''"/gee;
$result =~ s/(\$[a-zA-Z\d]+)/"defined $1 ? $1 : ''"/eeg;
# Get list of photos and process them
$url = $FlickrBaseUrl . "?method=flickr.photosets.getPhotos&api_key=" .
@@ -153,7 +153,7 @@ sub FlickrGallery {
my $footer = $FlickrFooterTemplate;
$footer =~ s/(\$[a-zA-Z\d]+)/"defined $1 ? $1 : ''"/gee;
$footer =~ s/(\$[a-zA-Z\d]+)/"defined $1 ? $1 : ''"/eeg;
$result .= $footer;
return $result;
@@ -192,7 +192,7 @@ sub FlickrPhoto {
my $output = $FlickrImageTemplate;
$output =~ s/(\$[a-zA-Z\d]+)/"defined $1 ? $1 : ''"/gee;
$output =~ s/(\$[a-zA-Z\d]+)/"defined $1 ? $1 : ''"/eeg;
return $output
}

View File
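The Flickr hunks reorder `/gee` to `/eeg` on substitutions like `s/(\$[a-zA-Z\d]+)/"defined $1 ? $1 : ''"/eeg` — again a style-only change. The doubled `/e` is the interesting part: the replacement is evaluated twice, first yielding the string `defined $title ? $title : ''`, which the second evaluation runs to produce the template variable's value (or `''` when it is undefined). A sketch with illustrative variable names:

```perl
use strict;
use warnings;

our $title = 'My Gallery';
our $count;                      # deliberately left undefined

my $template = 'Title: $title Count: $count';
$template =~ s/(\$[a-zA-Z\d]+)/"defined $1 ? $1 : ''"/eeg;
# $template is now 'Title: My Gallery Count: '
```

The guard against undefined variables is what makes the double eval worthwhile here: a plain `/ee` replacement of `$1` alone would warn (or insert nothing silently) whenever a template names a variable the gallery code never set.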

@@ -242,13 +242,13 @@ sub FootnotesRule {
# Footnotes and the set of all footnotes must be marked so as to ensure their
# reevaluation, as each of the footnotes might contain Wiki markup requiring
# reevaluation (like, say, free links).
if (m/\G($FootnotePattern)(?=([ \t]*$FootnotePattern)?)/gcos) {
if (m/\G($FootnotePattern)(?=([ \t]*$FootnotePattern)?)/cgs) {
Dirty($1); # do not cache the prefixing "\G"
my $footnote_text = $2;
my $is_adjacent_footnote = defined $3;
# A number range (e.g., "2-5") of references to other footnotes.
if ($footnote_text =~ m/^(\d+)-(\d+)$/o) {
if ($footnote_text =~ m/^(\d+)-(\d+)$/) {
my ($footnote_number_first, $footnote_number_last) = ($1, $2);
# '&#x2013;', below, is the HTML entity for a Unicode en-dash.
print $q->a({-href=> '#footnotes' .$footnote_number_first,
@@ -261,7 +261,7 @@ sub FootnotesRule {
}, $footnote_number_last.($is_adjacent_footnote ? ', ' : ''));
}
# A number (e.g., "5") implying reference to another footnote.
elsif ($footnote_text =~ m/^(\d+)$/o) {
elsif ($footnote_text =~ m/^(\d+)$/) {
my $footnote_number = $1;
print $q->a({-href=> '#footnotes' .$footnote_number,
-title=> 'Footnote #'.$footnote_number,
@@ -285,7 +285,7 @@ sub FootnotesRule {
return '';
}
# The "<footnotes>" list of all footnotes at the foot of a page.
elsif ($bol && m/\G($FootnotesPattern)/gcios) {
elsif ($bol && m/\G($FootnotesPattern)/cgis) {
Clean(CloseHtmlEnvironments());
Dirty($1); # do not cache the prefixing "\G"

View File

@@ -19,9 +19,9 @@ sub FormsRule {
my $oldpos = pos;
Clean(CloseHtmlEnvironments());
Dirty($form);
$form =~ s/\%([a-z]+)\%/GetParam($1)/ge;
$form =~ s/\%([a-z]+)\%/GetParam($1)/eg;
$form =~ s/\$([a-z]+)\$/$q->span({-class=>'param'}, GetParam($1))
.$q->input({-type=>'hidden', -name=>$1, -value=>GetParam($1)})/ge;
.$q->input({-type=>'hidden', -name=>$1, -value=>GetParam($1)})/eg;
print UnquoteHtml($form);
pos = $oldpos;
return AddHtmlEnvironment('p');

View File

@@ -240,9 +240,9 @@ sub GdSecurityImageCheck {
if ($answer ne '' && GdSecurityImageIsValidId($id)) {
my ($status, $data) = ReadFile(GdSecurityImageGetTicketFile($id));
if ($status) {
my %page = ParseData($data);
if ($page{generation_time} + $GdSecurityImageDuration > $Now) {
if ($answer eq $page{string}) {
my $page = ParseData($data);
if ($page->{generation_time} + $GdSecurityImageDuration > $Now) {
if ($answer eq $page->{string}) {
$GdSecurityImageId = '';
if (!$GdSecurityImageRememberAnswer) {
SetParam('gd_security_image_id', '');

View File

@@ -56,7 +56,7 @@ sub GooglePlusPrintFooter {
return q{
<!-- start of Google+ -->
<script type="text/javascript">
function loadScript(jssource,link_id) {
function loadScript(jssource) {
// add javascript
var jsnode = document.createElement('script');
jsnode.setAttribute('type','text/javascript');
@@ -66,15 +66,24 @@ function loadScript(jssource,link_id) {
var butn = document.createElement('div');
butn.setAttribute('class', 'g-plusone');
butn.setAttribute('id', 'my_plusone');
var link = document.getElementById(link_id);
var link = document.getElementById('plus1');
link.parentNode.insertBefore(butn, link);
// hide the link
link.innerHTML = "";
// when looking at action=plusone
var ul = document.getElementById('plus1s');
var children = ul.children;
for (var i = 0; i < children.length; i++) {
var li = children[i];
butn = document.createElement('g:plusone');
butn.setAttribute('href', li.firstElementChild.getAttribute('href'));
butn.setAttribute('id', 'my_plusone' + i);
li.appendChild(butn);
}
}
var plus1source = "https://apis.google.com/js/plusone.js";
</script>
<p id="plus1">
<a href="javascript:loadScript(plus1source,'plus1')">
<a href="javascript:loadScript('https://apis.google.com/js/plusone.js')">
<img src="/pics/plusone-h24.png" alt="Show Google +1" />
</a>
</p>
@@ -103,11 +112,9 @@ sub DoPlusOne {
push(@pages, $id) if $id =~ /^\d\d\d\d-\d\d-\d\d/;
}
splice(@pages, 0, $#pages - 19); # last 20 items
print "<ul>";
print '<ul id="plus1s">';
foreach my $id (@pages) {
my $url = ScriptUrl(UrlEncode($id));
print $q->li(GetPageLink($id),
qq{ <g:plusone href="$url"></g:plusone>});
print $q->li(GetPageLink($id), ' ');
}
print "</ul>";
print $q->end_div();

View File

@@ -39,7 +39,7 @@ sub GotobarInit {
@UserGotoBarPages = ();
$UserGotoBar = '';
my $count = 0;
while ($Page{text} =~ m/($LinkPattern|\[\[$FreeLinkPattern\]\]|\[\[$FreeLinkPattern\|([^\]]+)\]\]|\[$InterLinkPattern\s+([^\]]+?)\]|\[$FullUrlPattern[|[:space:]]([^\]]+?)\])/og) {
while ($Page{text} =~ m/($LinkPattern|\[\[$FreeLinkPattern\]\]|\[\[$FreeLinkPattern\|([^\]]+)\]\]|\[$InterLinkPattern\s+([^\]]+?)\]|\[$FullUrlPattern[|[:space:]]([^\]]+?)\])/g) {
my $page = $2||$3||$4||$6||$8;
my $text = $5||$7||$9;
$UserGotoBar .= ' ' if $UserGotoBar;

View File

@@ -29,7 +29,7 @@ my $gravatar_regexp = "\\[\\[gravatar:(?:$FullUrlPattern )?([^\n:]+):([0-9a-f]+)
push(@MyRules, \&GravatarRule);
sub GravatarRule {
if ($bol && m!\G$gravatar_regexp!cog) {
if ($bol && m!\G$gravatar_regexp!cg) {
my $url = $1;
my $gravatar = "https://secure.gravatar.com/avatar/$3";
my $name = FreeToNormal($2);
@@ -53,7 +53,7 @@ sub GravatarFormAddition {
return $html unless $type eq 'comment';
my $addition = $q->span({-class=>'mail'},
$q->label({-for=>'mail'}, T('Email: '))
$q->label({-for=>'mail'}, T('Email:') . ' ')
. ' ' . $q->textfield(-name=>'mail', -id=>'mail',
-default=>GetParam('mail', '')));
$html =~ s!(name="homepage".*?)</p>!$1 $addition</p>!i;
@@ -90,6 +90,6 @@ sub AddGravatar {
sub GravatarNewGetSummary {
my $summary = GravatarOldGetSummary(@_);
$summary =~ s/^$gravatar_regexp *//o;
$summary =~ s/^$gravatar_regexp *//;
return $summary;
}

View File

@@ -45,7 +45,7 @@ sub PrintGrep {
foreach my $id (AllPagesList()) {
my $text = GetPageContent($id);
next if (TextIsFile($text)); # skip files
while ($text =~ m{($regexp)}ig) {
while ($text =~ m{($regexp)}gi) {
print $q->li(GetPageLink($id) . ': ' . $1);
}
}

View File

@@ -41,7 +41,7 @@ $RuleOrder{\&HeadersRule} = 95;
sub HeadersRule {
my $oldpos = pos;
if ($bol && (m/\G((.+?)[ \t]*\n(---+|===+)[ \t]*\n)/gc)) {
if ($bol && (m/\G((.+?)[ \t]*\n(---+|===+)[ \t]*\n)/cg)) {
my $html = CloseHtmlEnvironments() . ($PortraitSupportColorDiv ? '</div>' : '');
if (substr($3,0,1) eq '=') {
$html .= $q->h2($2);

View File

@@ -32,7 +32,7 @@ push(@MyRules, \&HeadlinesRule);
$HeadlineNumber = 20;
sub HeadlinesRule {
if (m/\G(\&lt;headlines(:(\d+))?\&gt;)/gci) {
if (m/\G(\&lt;headlines(:(\d+))?\&gt;)/cgi) {
if (($3) and ($3>0)) {$HeadlineNumber = $3;};
Clean(CloseHtmlEnvironments());
Dirty($1);

View File

@@ -1212,7 +1212,7 @@ sub GetHibernalArchiveMonth {
~e;
$html_month =~ s~( {1,2})(\d{1,2})\b~
$1.GetHibernalArchiveMonthDay($post_name_regexp, $year, $month, $2)
~ge;
~eg;
# Float the HTML for each month horizontally past the month preceding it;
# failure to float months in this manner causes these months to stack

View File

@@ -32,7 +32,7 @@ push(@MyRules, \&ImageSupportRule);
sub ImageSupportRule {
my $result = undef;
if (m!\G\[\[image((/[a-z]+)*)( external)?:\s*([^]|]+?)\s*(\|[^]|]+?)?\s*(\|[^]|]*?)?\s*(\|[^]|]*?)?\s*(\|[^]|]*?)?\s*\]\](\{([^}]+)\})?!gc) {
if (m!\G\[\[image((/[a-z]+)*)( external)?:\s*([^]|]+?)\s*(\|[^]|]+?)?\s*(\|[^]|]*?)?\s*(\|[^]|]*?)?\s*(\|[^]|]*?)?\s*\]\](\{([^}]+)\})?!cg) {
my $oldpos = pos;
my $class = 'image' . $1;
my $external = $3;

View File

@@ -33,7 +33,7 @@ $IrcLinkNick = 0;
# This adds an extra <br> at the beginning. Alternatively, add it to
# the last line, or only add it when required.
sub IrcRule {
if ($bol && m/\G(?:\[?(\d\d?:\d\d(?:am|pm)?)\]?)?\s*&lt;($IrcNickRegexp)&gt; ?/gc) {
if ($bol && m/\G(?:\[?(\d\d?:\d\d(?:am|pm)?)\]?)?\s*&lt;($IrcNickRegexp)&gt; ?/cg) {
my ($time, $nick) = ($1, $2);
my ($error) = ValidId($nick);
# if we're in a dl, close the open dd but not the dl. (if we're

View File

@@ -149,7 +149,7 @@ sub JoinerGetPasswordHash {
sub JoinerRequestLockOrError {
my ($name) = @_;
# 10 tries, 3 second wait, die on error
return RequestLockDir($name, 0, 10, 3, 1);
return RequestLockDir($name, 10, 3, 1);
}
sub JoinerGetEmailFile {
@@ -174,18 +174,17 @@ sub JoinerCreateAccount {
}
my ($email_status, $email_data) = ReadFile(JoinerGetEmailFile($email));
my %email_page = ();
if ($email_status) {
%email_page = ParseData($email_data);
if ($email_page{confirmed}) {
my $email_page = ParseData($email_data);
if ($email_page->{confirmed}) {
return Ts('The email address %s has already been used.', $email);
}
if ($email_page{registration_time} + $JoinerWait > $Now) {
my $min = 1 + int(($email_page{registration_time} + $JoinerWait - $Now) / 60);
if ($email_page->{registration_time} + $JoinerWait > $Now) {
my $min = 1 + int(($email_page->{registration_time} + $JoinerWait - $Now) / 60);
return Ts('Wait %s minutes before try again.', $min);
}
}
%email_page = ();
my %email_page = ();
$email_page{username} = $username;
$email_page{email} = $email;
$email_page{confirmed} = 0;
@@ -215,7 +214,7 @@ sub JoinerSendRegistrationConfirmationEmail {
print $EMAIL "From: $JoinerEmailSenderAddress\n";
print $EMAIL "Subject: $SiteName " . T('Registration Confirmation') . "\n";
print $EMAIL "\n";
print $EMAIL T('Visit the link blow to confirm registration.') . "\n";
print $EMAIL T('Visit the link below to confirm registration.') . "\n";
print $EMAIL "\n";
print $EMAIL "$link\n";
print $EMAIL "\n";
@@ -468,37 +467,37 @@ sub JoinerDoConfirmRegistration {
JoinerShowRegistrationConfirmationFailed();
return;
}
my %page = ParseData($data);
my $page = ParseData($data);
if ($key ne $page{key}) {
if ($key ne $page->{key}) {
$JoinerMessage = T('Invalid key.');
JoinerShowRegistrationConfirmationFailed();
return;
}
if ($page{registration_time} + $JoinerWait < $Now) {
if ($page->{registration_time} + $JoinerWait < $Now) {
$JoinerMessage = T('The key expired.');
JoinerShowRegistrationConfirmationFailed();
return;
}
$page{key} = '';
$page{confirmed} = 1;
$page->{key} = '';
$page->{confirmed} = 1;
JoinerRequestLockOrError('joiner');
CreateDir($JoinerDir);
WriteStringToFile(JoinerGetAccountFile($username), EncodePage(%page));
WriteStringToFile(JoinerGetAccountFile($username), EncodePage(%$page));
ReleaseLockDir('joiner');
my $email = $page{email};
my $email = $page->{email};
JoinerRequestLockOrError('joiner');
my ($email_status, $email_data) = ReadFile(JoinerGetEmailFile($email));
ReleaseLockDir('joiner');
if ($email_status) {
my %email_page = ParseData($email_data);
$email_page{confirmed} = 1;
my $email_page = ParseData($email_data);
$email_page->{confirmed} = 1;
JoinerRequestLockOrError('joiner');
CreateDir($JoinerEmailDir);
WriteStringToFile(JoinerGetEmailFile($email), EncodePage(%email_page));
WriteStringToFile(JoinerGetEmailFile($email), EncodePage(%$email_page));
ReleaseLockDir('joiner');
}
@@ -570,41 +569,41 @@ sub JoinerDoProcessLogin {
JoinerDoLogin();
return;
}
my %page = ParseData($data);
my $page = ParseData($data);
my $hash = JoinerGetPasswordHash($password);
if ($hash eq $page{password}) {
$page{recover} = 0;
if ($hash eq $page->{password}) {
$page->{recover} = 0;
SetParam('joiner_recover', 0);
} elsif ($key ne '' && $key eq $page{recover_key}) {
if ($page{recover_time} + $JoinerWait < $Now) {
} elsif ($key ne '' && $key eq $page->{recover_key}) {
if ($page->{recover_time} + $JoinerWait < $Now) {
$JoinerMessage = T('The key expired.');
JoinerDoLogin();
return;
}
$page{recover} = 1;
$page->{recover} = 1;
SetParam('joiner_recover', 1);
} else {
$JoinerMessage = T('Login failed.');
JoinerDoLogin();
return;
}
if ($page{banned}) {
if ($page->{banned}) {
$JoinerMessage = T('You are banned.');
JoinerDoLogin();
return;
}
if (!$page{confirmed}) {
if (!$page->{confirmed}) {
$JoinerMessage = T('You must confirm email address.');
JoinerDoLogin();
return;
}
my $session = Digest::MD5::md5_hex(rand());
$page{session} = $session;
$page->{session} = $session;
JoinerRequestLockOrError('joiner');
CreateDir($JoinerDir);
WriteStringToFile(JoinerGetAccountFile($username), EncodePage(%page));
WriteStringToFile(JoinerGetAccountFile($username), EncodePage(%$page));
ReleaseLockDir('joiner');
SetParam('username', $username);
@@ -617,7 +616,7 @@ sub JoinerDoProcessLogin {
print Ts('%s has logged in.', $username);
print $q->end_p();
if ($page{recover}) {
if ($page->{recover}) {
print $q->start_p();
print T('You should set new password immediately.');
print $q->end_p();
@@ -735,9 +734,9 @@ sub JoinerDoProcessChangePassword {
JoinerDoChangePassword();
return;
}
my %page = ParseData($data);
my $page = ParseData($data);
my $hash = JoinerGetPasswordHash($current_password);
if (!$page{recover} && $hash ne $page{password}) {
if (!$page->{recover} && $hash ne $page->{password}) {
$JoinerMessage = T('Current Password:') . ' ' . T('Password is wrong.');
JoinerDoChangePassword();
return;
@@ -754,12 +753,12 @@ sub JoinerDoProcessChangePassword {
return;
}
$page{password} = JoinerGetPasswordHash($new_password);
$page{key} = '';
$page{recover} = '';
$page->{password} = JoinerGetPasswordHash($new_password);
$page->{key} = '';
$page->{recover} = '';
JoinerRequestLockOrError('joiner');
CreateDir($JoinerDir);
WriteStringToFile(JoinerGetAccountFile($username), EncodePage(%page));
WriteStringToFile(JoinerGetAccountFile($username), EncodePage(%$page));
ReleaseLockDir('joiner');
SetParam('joiner_recover', 0);
@@ -823,9 +822,9 @@ sub JoinerDoProcessForgotPassword {
JoinerDoForgotPassword();
return;
}
my %email_page = ParseData($email_data);
my $email_page = ParseData($email_data);
my $username = $email_page{username};
my $username = $email_page->{username};
JoinerRequestLockOrError('joiner');
my ($status, $data) = ReadFile(JoinerGetAccountFile($username));
ReleaseLockDir('joiner');
@@ -834,27 +833,27 @@ sub JoinerDoProcessForgotPassword {
JoinerDoForgotPassword();
return;
}
my %page = ParseData($data);
my $page = ParseData($data);
if ($email ne $page{email}) {
if ($email ne $page->{email}) {
$JoinerMessage = T('The mail address is not valid anymore.');
JoinerDoForgotPassword();
return;
}
if ($page{recover_time} + $JoinerWait > $Now) {
my $min = 1 + int(($page{recover_time} + $JoinerWait - $Now) / 60);
if ($page->{recover_time} + $JoinerWait > $Now) {
my $min = 1 + int(($page->{recover_time} + $JoinerWait - $Now) / 60);
$JoinerMessage = Ts('Wait %s minutes before try again.', $min);
JoinerDoForgotPassword();
return;
}
my $key = Digest::MD5::md5_hex($JoinerGeneratorSalt . rand());
$page{recover_time} = $Now;
$page{recover_key} = $key;
$page->{recover_time} = $Now;
$page->{recover_key} = $key;
JoinerRequestLockOrError('joiner');
CreateDir($JoinerDir);
WriteStringToFile(JoinerGetAccountFile($username), EncodePage(%page));
WriteStringToFile(JoinerGetAccountFile($username), EncodePage(%$page));
ReleaseLockDir('joiner');
JoinerSendRecoverAccountEmail($email, $username, $key);
@@ -922,8 +921,8 @@ sub JoinerDoProcessChangeEmail {
my ($email_status, $email_data) = ReadFile(JoinerGetEmailFile($email));
ReleaseLockDir('joiner');
if ($email_status) {
my %email_page = ParseData($email_data);
if ($email_page{confirmed} && $email_page{username} ne $username) {
my $email_page = ParseData($email_data);
if ($email_page->{confirmed} && $email_page->{username} ne $username) {
$JoinerMessage = T('Email:') . ' ' .
Ts('The email address %s has already been used.', $email);
JoinerDoChangeEmail();
@@ -939,29 +938,29 @@ sub JoinerDoProcessChangeEmail {
JoinerDoChangeEmail();
return;
}
my %page = ParseData($data);
my $page = ParseData($data);
if ($page{change_email_time} + $JoinerWait > $Now) {
my $min = 1 + int(($page{change_email_time} + $JoinerWait - $Now) / 60);
if ($page->{change_email_time} + $JoinerWait > $Now) {
my $min = 1 + int(($page->{change_email_time} + $JoinerWait - $Now) / 60);
$JoinerMessage = Ts('Wait %s minutes before try again.', $min);
JoinerDoChangeEmail();
return;
}
my $hash = JoinerGetPasswordHash($password);
if ($hash ne $page{password}) {
if ($hash ne $page->{password}) {
$JoinerMessage = T('Password:') . ' ' . T('Password is wrong.');
JoinerDoChangeEmail();
return;
}
my $key = Digest::MD5::md5_hex(rand());
$page{change_email} = $email;
$page{change_email_key} = $key;
$page{change_email_time} = $Now;
$page->{change_email} = $email;
$page->{change_email_key} = $key;
$page->{change_email_time} = $Now;
JoinerRequestLockOrError('joiner');
CreateDir($JoinerDir);
WriteStringToFile(JoinerGetAccountFile($username), EncodePage(%page));
WriteStringToFile(JoinerGetAccountFile($username), EncodePage(%$page));
ReleaseLockDir('joiner');
JoinerSendChangeEmailEmail($email, $username, $key);
@@ -1012,22 +1011,22 @@ sub JoinerDoConfirmEmail {
JoinerShowConfirmEmailFailed();
return;
}
my %page = ParseData($data);
my $page = ParseData($data);
if ($key ne $page{change_email_key}) {
if ($key ne $page->{change_email_key}) {
$JoinerMessage = T('Invalid key.');
JoinerShowConfirmEmailFailed();
return;
}
my $new_email = $page{change_email};
$page{email} = $new_email;
$page{change_email} = '';
$page{change_email_key} = '';
$page{change_email_time} = '';
my $new_email = $page->{change_email};
$page->{email} = $new_email;
$page->{change_email} = '';
$page->{change_email_key} = '';
$page->{change_email_time} = '';
JoinerRequestLockOrError('joiner');
CreateDir($JoinerDir);
WriteStringToFile(JoinerGetAccountFile($username), EncodePage(%page));
WriteStringToFile(JoinerGetAccountFile($username), EncodePage(%$page));
ReleaseLockDir('joiner');
my %email_page = ();
@@ -1128,30 +1127,30 @@ sub JoinerDoProcessBan {
JoinerDoBan();
return;
}
my %page = ParseData($data);
my $page = ParseData($data);
if ($ban) {
if ($page{banned}) {
if ($page->{banned}) {
$JoinerMessage = Ts('%s is already banned.', $username);
JoinerDoBan();
return;
}
$page{banned} = 1;
$page{session} = '';
$page->{banned} = 1;
$page->{session} = '';
$JoinerMessage = Ts('%s has been banned.', $username);
} else {
if (!$page{banned}) {
if (!$page->{banned}) {
$JoinerMessage = Ts('%s is not banned.', $username);
JoinerDoBan();
return;
}
$page{banned} = 0;
$page->{banned} = 0;
$JoinerMessage = Ts('%s has been unbanned.', $username);
}
JoinerRequestLockOrError('joiner');
CreateDir($JoinerDir);
WriteStringToFile(JoinerGetAccountFile($username), EncodePage(%page));
WriteStringToFile(JoinerGetAccountFile($username), EncodePage(%$page));
ReleaseLockDir('joiner');
JoinerDoBan();
@@ -1178,16 +1177,16 @@ sub JoinerIsLoggedIn {
$JoinerLoggedIn = 0;
return $JoinerLoggedIn;
}
my %page = ParseData($data);
if (!$page{confirmed}) {
my $page = ParseData($data);
if (!$page->{confirmed}) {
$JoinerLoggedIn = 0;
return $JoinerLoggedIn;
}
if ($session ne $page{session}) {
if ($session ne $page->{session}) {
$JoinerLoggedIn = 0;
return $JoinerLoggedIn;
}
if ($page{banned}) {
if ($page->{banned}) {
$JoinerLoggedIn = 0;
return $JoinerLoggedIn;
}

View File

@@ -33,7 +33,7 @@ our ($q, @HtmlStack, @MyRules, $FullUrl);
push(@MyRules, \&LangRule);
sub LangRule {
if (m/\G\[([a-z][a-z])\]/gc) {
if (m/\G\[([a-z][a-z])\]/cg) {
my $html;
$html .= "</" . shift(@HtmlStack) . ">" if $HtmlStack[0] eq 'span';
return $html . AddHtmlEnvironment('span', "lang=\"$1\"") . "[$1]";

View File

@@ -98,7 +98,7 @@ EOT
push(@MyRules, \&LatexRule);
sub LatexRule {
if (m/\G\\\[(\(.*?\))?((.*\n)*?.*?)\\\]/gc) {
if (m/\G\\\[(\(.*?\))?((.*\n)*?.*?)\\\]/cg) {
my $label = $1;
my $latex = $2;
$label =~ s#\(?\)?##g;# Remove the ()'s from the label and convert case
@@ -106,13 +106,13 @@ sub LatexRule {
$eqCounter++;
$eqHash{$label} = $eqCounter;
return &MakeLaTeX("\\begin{displaymath} $latex \\end{displaymath}", "display math",$label);
} elsif (m/\G\$\$((.*\n)*?.*?)\$\$/gc) {
} elsif (m/\G\$\$((.*\n)*?.*?)\$\$/cg) {
return &MakeLaTeX("\$\$ $1 \$\$", $LatexSingleDollars ? "display math" : "inline math");
} elsif ($LatexSingleDollars and m/\G\$((.*\n)*?.*?)\$/gc) {
} elsif ($LatexSingleDollars and m/\G\$((.*\n)*?.*?)\$/cg) {
return &MakeLaTeX("\$ $1 \$", "inline math");
} elsif ($allowPlainLaTeX && m/\G\$\[((.*\n)*?.*?)\]\$/gc) { #Pick up plain LaTeX commands
} elsif ($allowPlainLaTeX && m/\G\$\[((.*\n)*?.*?)\]\$/cg) { #Pick up plain LaTeX commands
return &MakeLaTeX(" $1 ", "LaTeX");
} elsif (m/\GEQ\((.*?)\)/gc) { # Handle references to equations
} elsif (m/\GEQ\((.*?)\)/cg) { # Handle references to equations
my $label = $1;
$label =~ tr/A-Z/a-z/;
if ($eqHash{$label}) {
@@ -166,7 +166,7 @@ sub MakeLaTeX {
close $F;
}
my $template = ReadFileOrDie($LatexDefaultTemplateName);
$template =~ s/<math>/$latex/ig;
$template =~ s/<math>/$latex/gi;
#setup rendering directory
my $dir = "$LatexDir/$hash";
if (-d $dir) {

View File

@@ -23,7 +23,7 @@ push(@MyRules, \&LinkAllRule);
$RuleOrder{\&LinkAllRule} = 1000;
sub LinkAllRule {
if (/\G([A-Za-z\x{0080}-\x{fffd}]+)/gc) {
if (/\G([A-Za-z\x{0080}-\x{fffd}]+)/cg) {
my $oldpos = pos;
Dirty($1);
# print the word, or the link to the word

View File

@@ -71,31 +71,31 @@ sub GetLinkList { # for the currently open page
my %links;
foreach my $block (@blocks) {
if (shift(@flags)) { # dirty block and interlinks or normal links
if ($inter and ($BracketText && $block =~ m/^(\[$InterLinkPattern\s+([^\]]+?)\])$/o
or $BracketText && $block =~ m/^(\[\[$FreeInterLinkPattern\|([^\]]+?)\]\])$/o
or $block =~ m/^(\[$InterLinkPattern\])$/o
or $block =~ m/^(\[\[\[$FreeInterLinkPattern\]\]\])$/o
or $block =~ m/^($InterLinkPattern)$/o
or $block =~ m/^(\[\[$FreeInterLinkPattern\]\])$/o)) {
if ($inter and ($BracketText && $block =~ m/^(\[$InterLinkPattern\s+([^\]]+?)\])$/
or $BracketText && $block =~ m/^(\[\[$FreeInterLinkPattern\|([^\]]+?)\]\])$/
or $block =~ m/^(\[$InterLinkPattern\])$/
or $block =~ m/^(\[\[\[$FreeInterLinkPattern\]\]\])$/
or $block =~ m/^($InterLinkPattern)$/
or $block =~ m/^(\[\[$FreeInterLinkPattern\]\])$/)) {
$links{$raw ? $2 : GetInterLink($2, $3)} = 1 if $InterSite{substr($2,0,index($2, ':'))};
} elsif ($link
and (($WikiLinks and $block !~ m/!$LinkPattern/o
and ($BracketWiki && $block =~ m/^(\[$LinkPattern\s+([^\]]+?)\])$/o
or $block =~ m/^(\[$LinkPattern\])$/o
or $block =~ m/^($LinkPattern)$/o))
and (($WikiLinks and $block !~ m/!$LinkPattern/
and ($BracketWiki && $block =~ m/^(\[$LinkPattern\s+([^\]]+?)\])$/
or $block =~ m/^(\[$LinkPattern\])$/
or $block =~ m/^($LinkPattern)$/))
or ($FreeLinks
and ($BracketWiki && $block =~ m/^(\[\[$FreeLinkPattern\|([^\]]+)\]\])$/o
or $block =~ m/^(\[\[\[$FreeLinkPattern\]\]\])$/o
or $block =~ m/^(\[\[$FreeLinkPattern\]\])$/o)))) {
and ($BracketWiki && $block =~ m/^(\[\[$FreeLinkPattern\|([^\]]+)\]\])$/
or $block =~ m/^(\[\[\[$FreeLinkPattern\]\]\])$/
or $block =~ m/^(\[\[$FreeLinkPattern\]\])$/)))) {
$links{$raw ? FreeToNormal($2) : GetPageOrEditLink($2, $3)} = 1;
} elsif ($url and $block =~ m/^\[$FullUrlPattern\]$/og) {
} elsif ($url and $block =~ m/^\[$FullUrlPattern\]$/g) {
$links{$raw ? $1 : GetUrl($1)} = 1;
}
} elsif ($url) { # clean block and url
while ($block =~ m/$UrlPattern/og) {
while ($block =~ m/$UrlPattern/g) {
$links{$raw ? $1 : GetUrl($1)} = 1;
}
while ($block =~ m/\[$FullUrlPattern\s+[^\]]+?\]/og) {
while ($block =~ m/\[$FullUrlPattern\s+[^\]]+?\]/g) {
$links{$raw ? $1 : GetUrl($1)} = 1;
}
}

View File

@@ -52,7 +52,7 @@ push (@MyRules, \&LinkTagRule, \&LinkDescriptionRule);
sub LinkTagRule { # Process link tags on a page
if ( m/\G$LinkTagMark(.*?)$LinkTagMark/gc) { # find tags
if ( m/\G$LinkTagMark(.*?)$LinkTagMark/cg) { # find tags
my @linktags = split /,\s*/, $1; # push them in array
@linktags = map { # and generate html output:
qq{<a href="$ScriptName?action=linktagsearch;linktag=$_">$_</a>}; # each tag is a link to search all links with that tag
@@ -66,7 +66,7 @@ sub LinkTagRule { # Process link tags on a page
sub LinkDescriptionRule { # Process link descriptions on a page
if ( m/\G$LinkDescMark(.*?)$LinkDescMark/gc) { # find description
if ( m/\G$LinkDescMark(.*?)$LinkDescMark/cg) { # find description
return qq{<span class="$LinkDescClass">$1</span>}; # put it in SPAN block
}
return;
@@ -184,7 +184,7 @@ sub PrintLinkTagMap {
my $tag = $1;
"<li id=\"$tag\">$tag</li>\n<ul>";
}xsge;
}egsx;
$result =~ s/\<\/tag\>/<\/ul>/g;
$result =~ s{
@@ -194,7 +194,7 @@ sub PrintLinkTagMap {
my $name = $2; if ( length $name == 0 ) { $name = $url; } # name (if not present use url instead)
my $description = $3; # and description
"<li><a href=\"$url\">$name</a> <span class=\"$LinkDescClass\">$description</span></li>";
}xsge;
}egsx;
print $result;
}

View File

@@ -39,7 +39,7 @@ sub DoListBannedContent {
print $BannedRegexps . ': ' . scalar(keys(%text_regexps)) . $q->br() . "\n";
PAGE: foreach my $id (@pages) {
OpenPage($id);
my @urls = $Page{text} =~ /$FullUrlPattern/go;
my @urls = $Page{text} =~ /$FullUrlPattern/g;
foreach my $url (@urls) {
foreach my $re (keys %url_regexps) {
if ($url =~ $re) {


@@ -31,7 +31,7 @@ $TagListLabel = "tag:";
push(@MyRules, \&ListTagRule);
sub ListTagRule {
if ($bol && /\G(\[\[\!tag\s*(.+)\]\])/gc) {
if ($bol && /\G(\[\[\!tag\s*(.+)\]\])/cg) {
my $tag_text = $2;
my @tags = split /,\s*/, $tag_text;
@tags = map {


@@ -26,7 +26,7 @@ our ($q, $bol, @MyRules, $FreeLinkPattern);
push(@MyRules, \&LiveTemplateRule);
sub LiveTemplateRule {
if ($bol and /\G(&lt;&lt;$FreeLinkPattern\n)/cog) {
if ($bol and /\G(&lt;&lt;$FreeLinkPattern\n)/cg) {
Clean(CloseHtmlEnvironments());
my $str = $1;
my $template = FreeToNormal($2);
@@ -35,12 +35,12 @@ sub LiveTemplateRule {
Dirty($str);
my $oldpos = pos;
my $old_ = $_;
my %hash = ParseData($2);
my $hash = ParseData($2);
my $text = GetPageContent($template);
return $q->p($q->strong(Ts('The template %s is either empty or does not exist.',
$template))) . AddHtmlEnvironment('p') unless $text;
foreach my $key (keys %hash) {
$text =~ s/\$$key\$/$hash{$key}/g;
foreach my $key (keys %$hash) {
$text =~ s/\$$key\$/$hash->{$key}/g;
}
print "<div class=\"template $template\">";
ApplyRules(QuoteHtml($text), 1, 1, undef, 'p');
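Several hunks (the live-template rule above, and local maps and private pages below) switch callers from `my %hash = ParseData(...)` to `my $hash = ParseData(...)`: `ParseData` evidently now returns a hash reference instead of a flat key/value list. A sketch of the caller-side migration, using a simplified stand-in for `ParseData` (the real one parses `$FS`-delimited page files):

```perl
use strict;
use warnings;

# Simplified stand-in: "key: value" lines instead of Oddmuse's real format.
sub ParseData {
    my %data = map { split /:\s*/, $_, 2 } split /\n/, shift;
    return \%data;   # new API: return a reference, not a list
}

my $page = ParseData("title: HomePage\ntext: Hello");
# old style: foreach my $key (keys %hash) { ... $hash{$key} ... }
# new style:
foreach my $key (sort keys %$page) {
    print "$key=$page->{$key}\n";
}
```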


@@ -18,32 +18,38 @@ use v5.10;
AddModuleDescription('load-lang.pl', 'Language Browser Preferences');
our ($q, %CookieParameters, $ConfigFile, $DataDir, $NamespaceCurrent, @MyInitVariables);
our ($CurrentLanguage, $LoadLanguageDir);
our ($q, %CookieParameters, $ConfigFile, $DataDir, $ModuleDir, $NamespaceCurrent, @MyInitVariables);
our $CurrentLanguage;
our $LoadLanguageDir = "$ModuleDir/translations"; # by default same as in git
$CookieParameters{interface} = '';
my %library= ('bg' => 'bulgarian-utf8.pl',
'de' => 'german-utf8.pl',
'es' => 'spanish-utf8.pl',
'fr' => 'french-utf8.pl',
'fi' => 'finnish-utf8.pl',
'gr' => 'greek-utf8.pl',
'he' => 'hebrew-utf8.pl',
'it' => 'italian-utf8.pl',
'ja' => 'japanese-utf8.pl',
'ko' => 'korean-utf8.pl',
'nl' => 'dutch-utf8.pl',
'pl' => 'polish-utf8.pl',
'pt' => 'portuguese-utf8.pl',
'ro' => 'romanian-utf8.pl',
'ru' => 'russian-utf8.pl',
'se' => 'swedish-utf8.pl',
'sr' => 'serbian-utf8.pl',
'zh' => 'chinese-utf8.pl',
'zh-cn' => 'chinese_cn-utf8.pl',
'zh-tw' => 'chinese-utf8.pl',
);
our %TranslationsLibrary = (
'bg' => 'bulgarian-utf8.pl',
'ca' => 'catalan-utf8.pl',
'de' => 'german-utf8.pl',
'es' => 'spanish-utf8.pl',
'fi' => 'finnish-utf8.pl',
'fr' => 'french-utf8.pl',
'gr' => 'greek-utf8.pl',
'he' => 'hebrew-utf8.pl',
'it' => 'italian-utf8.pl',
'ja' => 'japanese-utf8.pl',
'ko' => 'korean-utf8.pl',
'nl' => 'dutch-utf8.pl',
'pl' => 'polish-utf8.pl',
'pt' => 'portuguese-utf8.pl',
'pt-br' => 'brazilian-portuguese-utf8.pl',
'ro' => 'romanian-utf8.pl',
'ru' => 'russian-utf8.pl',
'se' => 'swedish-utf8.pl',
'sr' => 'serbian-utf8.pl',
'uk' => 'ukrainian-utf8.pl',
'zh' => 'chinese-utf8.pl',
'zh-cn' => 'chinese_cn-utf8.pl',
'zh-tw' => 'chinese-utf8.pl',
);
sub LoadLanguage {
# my $requested_language = "da, en-gb;q=0.8, en;q=0.7";
@@ -65,7 +71,7 @@ sub LoadLanguage {
# . $q->end_html) && exit if GetParam('debug', '');
foreach (@prefs) {
last if $Lang{$_} eq 'en'; # the default
my $file = $library{$Lang{$_}};
my $file = $TranslationsLibrary{$Lang{$_}};
$file = "$LoadLanguageDir/$file" if defined $LoadLanguageDir;
if (-r $file) {
do $file;
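`LoadLanguage` above walks the browser's language preferences (the commented example is `da, en-gb;q=0.8, en;q=0.7`) and loads the first matching translation file. A hedged sketch of the q-value ordering step only, assuming a simplified header parser (the real Oddmuse code differs):

```perl
use strict;
use warnings;

# Turn an Accept-Language header into language codes sorted by
# descending q-value; a missing q defaults to 1.
sub language_prefs {
    my @prefs;
    for my $part (split /\s*,\s*/, shift) {
        my ($code, $q) = $part =~ /^([A-Za-z-]+)(?:\s*;\s*q=([\d.]+))?/;
        push @prefs, [lc $code, $q // 1] if defined $code;
    }
    return map { $_->[0] } sort { $b->[1] <=> $a->[1] } @prefs;
}

print join(' ', language_prefs('da, en-gb;q=0.8, en;q=0.7')), "\n";
# da en-gb en
```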


@@ -103,7 +103,7 @@ L<http://ln.taoriver.net/>.
push(@MyRules, \&LocalNamesRule);
sub LocalNamesRule {
if (m/\G\[\[ln:$FullUrlPattern\s*([^\]]*)\]\]/cog) {
if (m/\G\[\[ln:$FullUrlPattern\s*([^\]]*)\]\]/cg) {
# [[ln:url text]], [[ln:url]]
return $q->a({-class=>'url outside ln', -href=>$1}, $2||$1);
}
@@ -144,7 +144,7 @@ sub LocalNamesInit {
$LocalNamesPage = FreeToNormal($LocalNamesPage); # spaces to underscores
$AdminPages{$LocalNamesPage} = 1;
my $data = GetPageContent($LocalNamesPage);
while ($data =~ m/\[$FullUrlPattern\s+([^\]]+?)\]/go) {
while ($data =~ m/\[$FullUrlPattern\s+([^\]]+?)\]/g) {
my ($page, $url) = ($2, $1);
my $id = FreeToNormal($page);
$LocalNames{$id} = $url;
@@ -152,7 +152,7 @@ sub LocalNamesInit {
# Now read data from ln links, checking cache if possible. For all
# URLs not in the cache or with invalid cache, fetch the file again,
# and save it in the cache.
my @ln = $data =~ m/\[\[ln:$FullUrlPattern[^\]]*?\]\]/go;
my @ln = $data =~ m/\[\[ln:$FullUrlPattern[^\]]*?\]\]/g;
my %todo = map {$_, GetLnFile($_)} @ln;
my %data = ();
if (GetParam('cache', $UseCache) > 0) {
@@ -347,13 +347,13 @@ sub DoLocalNames {
if (GetParam('expand', 0)) {
print "# Local names defined on $LocalNamesPage:\n";
my $data = GetPageContent($LocalNamesPage);
while ($data =~ m/\[$FullUrlPattern\s+([^\]]+?)\]/go) {
while ($data =~ m/\[$FullUrlPattern\s+([^\]]+?)\]/g) {
my ($title, $url) = ($2, $1);
my $id = FreeToNormal($title);
print qq{LN "$title" "$url"\n};
}
print "# Namespace delegations defined on $LocalNamesPage:\n";
while ($data =~ m/\[\[ln:$FullUrlPattern([^\]]*)?\]\]/go) {
while ($data =~ m/\[\[ln:$FullUrlPattern([^\]]*)?\]\]/g) {
my ($title, $url) = ($2, $1);
my $id = FreeToNormal($title);
print qq{NS "$title" "$url"\n};
@@ -396,10 +396,10 @@ sub DoDefine {
$q->start_div({-class=>'content define'}),
GetFormStart(undef, 'get', 'def');
my $go = T('Go!');
print $q->p($q->label({-for=>"defined"}, T('Name: ')),
print $q->p($q->label({-for=>"defined"}, T('Name:') . ' '),
$q->textfield(-name=>"name", -id=>"defined",
-tabindex=>"1", -size=>20));
print $q->p($q->label({-for=>"definition"}, T('URL: ')),
print $q->p($q->label({-for=>"definition"}, T('URL:') . ' '),
$q->textfield(-name=>"link", -id=>"definition",
-tabindex=>"2", -size=>20));
print $q->p($q->submit(-label=>$go, -tabindex=>"3"),
@@ -430,7 +430,7 @@ sub GetWantedPages {
# skip comment pages
if ($CommentsPrefix) {
foreach my $id (keys %WantedPages) {
delete $WantedPages{$id} if $id =~ /^$CommentsPrefix/o; # TODO use $CommentsPattern ?
delete $WantedPages{$id} if $id =~ /^$CommentsPrefix/; # TODO use $CommentsPattern ?
}
}
# now something more complicated: if near-links.pl was loaded, then
@@ -446,7 +446,7 @@ sub GetWantedPages {
# if any wanted pages remain, print them
if (@wanted) {
return $q->div({-class=>'definition'},
$q->p(T('Define external redirect: '),
$q->p(T('Define external redirect:'), ' ',
map { my $page = NormalToFree($_);
ScriptLink('action=define;name='
. UrlEncode($page),


@@ -199,9 +199,9 @@ sub DoRegister {
my $id = shift;
print GetHeader('', Ts('Register for %s', $SiteName), '');
print '<div class="content">';
$RegistrationForm =~ s/\%([a-z]+)\%/GetParam($1)/ige;
$RegistrationForm =~ s/\%([a-z]+)\%/GetParam($1)/egi;
$RegistrationForm =~ s/\$([a-z]+)\$/$q->span({-class=>'param'}, GetParam($1))
. $q->input({-type=>'hidden', -name=>$1, -value=>GetParam($1)})/ge;
. $q->input({-type=>'hidden', -name=>$1, -value=>GetParam($1)})/eg;
print $RegistrationForm;
print '</div>';
PrintFooter();
@@ -271,9 +271,9 @@ sub DoLogin {
my $id = shift;
print GetHeader('', Ts('Login to %s', $SiteName), '');
print '<div class="content">';
$LoginForm =~ s/\%([a-z]+)\%/GetParam($1)/ge;
$LoginForm =~ s/\%([a-z]+)\%/GetParam($1)/eg;
$LoginForm =~ s/\$([a-z]+)\$/$q->span({-class=>'param'}, GetParam($1))
. $q->input({-type=>'hidden', -name=>$1, -value=>GetParam($1)})/ge;
. $q->input({-type=>'hidden', -name=>$1, -value=>GetParam($1)})/eg;
print $LoginForm;
print '</div>';
PrintFooter();
@@ -305,9 +305,9 @@ sub DoLogout {
print GetHeader('', Ts('Logout of %s', $SiteName), '');
print '<div class="content">';
print '<p>' . Ts('Logout of %s?',$SiteName) . '</p>';
$LogoutForm =~ s/\%([a-z]+)\%/GetParam($1)/ge;
$LogoutForm =~ s/\%([a-z]+)\%/GetParam($1)/eg;
$LogoutForm =~ s/\$([a-z]+)\$/$q->span({-class=>'param'}, GetParam($1))
. $q->input({-type=>'hidden', -name=>$1, -value=>GetParam($1)})/ge;
. $q->input({-type=>'hidden', -name=>$1, -value=>GetParam($1)})/eg;
print $LogoutForm;
print '</div>';
PrintFooter();
@@ -628,9 +628,9 @@ sub DoReset {
print GetHeader('', Ts('Reset Password for %s', $SiteName), '');
print '<div class="content">';
print '<p>' . T('Reset Password?') . '</p>';
$ResetForm =~ s/\%([a-z]+)\%/GetParam($1)/ge;
$ResetForm =~ s/\%([a-z]+)\%/GetParam($1)/eg;
$ResetForm =~ s/\$([a-z]+)\$/$q->span({-class=>'param'}, GetParam($1))
. $q->input({-type=>'hidden', -name=>$1, -value=>GetParam($1)})/ge;
. $q->input({-type=>'hidden', -name=>$1, -value=>GetParam($1)})/eg;
print $ResetForm;
print '</div>';
PrintFooter();
@@ -652,9 +652,9 @@ sub DoChangePassword {
print GetHeader('', Ts('Change Password for %s', $SiteName), '');
print '<div class="content">';
print '<p>' . T('Change Password?') . '</p>';
$ChangePassForm =~ s/\%([a-z]+)\%/GetParam($1)/ge;
$ChangePassForm =~ s/\%([a-z]+)\%/GetParam($1)/eg;
$ChangePassForm =~ s/\$([a-z]+)\$/$q->span({-class=>'param'}, GetParam($1))
. $q->input({-type=>'hidden', -name=>$1, -value=>GetParam($1)})/ge;
. $q->input({-type=>'hidden', -name=>$1, -value=>GetParam($1)})/eg;
print $ChangePassForm;
print '</div>';
PrintFooter();


@@ -118,7 +118,7 @@ sub DoLogout {
print
GetHeader('', Ts('Logged out of %s', $SiteName), '') .
$q->div({-class=> 'content'}, $q->p(T('You are now logged out.'), $id ? $q->p(ScriptLink('action=browse;id=' . UrlEncode($id) . '&time=' . time, T('Return to ' . NormalToFree($id)))) : ''));
$q->div({-class=> 'content'}, $q->p(T('You are now logged out.'), $id ? $q->p(ScriptLink('action=browse;id=' . UrlEncode($id) . '&time=' . time, Ts('Return to %s', NormalToFree($id)))) : ''));
PrintFooter();
}
@@ -201,10 +201,10 @@ sub CookieUsernameFix {
# Only valid usernames get stored in the new cookie.
my $name = GetParam('username', '');
if (!$name) { }
elsif (!$FreeLinks && !($name =~ /^$LinkPattern$/o)) {
elsif (!$FreeLinks && !($name =~ /^$LinkPattern$/)) {
CookieUsernameFixDelete(Ts('Invalid UserName %s: not saved.', $name));
}
elsif ($FreeLinks && (!($name =~ /^$FreeLinkPattern$/o))) {
elsif ($FreeLinks && (!($name =~ /^$FreeLinkPattern$/))) {
CookieUsernameFixDelete(Ts('Invalid UserName %s: not saved.', $name));
}
elsif (length($name) > 50) { # Too long


@@ -84,7 +84,7 @@ sub MailNewInitCookie {
$q->delete('mail');
if (!$mail) {
# do nothing
} elsif (!($mail =~ /$MailPattern/o)) {
} elsif (!($mail =~ /$MailPattern/)) {
$Message .= $q->p(Ts('Invalid Mail %s: not saved.', $mail));
} else {
SetParam('mail', $mail);
@@ -106,7 +106,7 @@ sub MailFormAddition {
. ScriptLink("action=subscribe;pages=$id", T('subscribe'), 'subscribe');
}
$addition = $q->span({-class=>'mail'},
$q->label({-for=>'mail'}, T('Email: '))
$q->label({-for=>'mail'}, T('Email:') . ' ')
. ' ' . $q->textfield(-name=>'mail', -id=>'mail',
-default=>GetParam('mail', ''))
. $addition);
@@ -311,8 +311,8 @@ sub DoMailSubscriptionList {
if ($raw) {
print join(' ', $key, @values) . "\n";
} else {
print $q->li(Ts('%s: ', MailLink($key, @values)),
join(' ', map { MailLink($_, $key) } @values));
print $q->li(Ts('%s:', MailLink($key, @values)) . ' '
. join(' ', map { MailLink($_, $key) } @values));
}
}
print '</ul></div>' unless $raw;


@@ -45,14 +45,14 @@ sub MarkdownRule {
. AddHtmlEnvironment("p");
}
# setext headers
elsif ($bol and m/\G((\s*\n)*(.+?)[ \t]*\n(-+|=+)[ \t]*\n)/gc) {
elsif ($bol and m/\G((\s*\n)*(.+?)[ \t]*\n(-+|=+)[ \t]*\n)/cg) {
return CloseHtmlEnvironments()
. (substr($4,0,1) eq '=' ? $q->h2($3) : $q->h3($3))
. AddHtmlEnvironment('p');
}
# > blockquote
# with continuation
elsif ($bol and m/\G&gt;/gc) {
elsif ($bol and m/\G&gt;/cg) {
return CloseHtmlEnvironments()
. AddHtmlEnvironment('blockquote');
}
@@ -117,20 +117,20 @@ sub MarkdownRule {
. AddHtmlEnvironment('td');
}
# whitespace indentation = code
elsif ($bol and m/\G(\s*\n)*( .+)\n?/gc) {
elsif ($bol and m/\G(\s*\n)*( .+)\n?/cg) {
my $str = substr($2, 4);
while (m/\G( .*)\n?/gc) {
while (m/\G( .*)\n?/cg) {
$str .= "\n" . substr($1, 4);
}
return OpenHtmlEnvironment('pre',1) . $str; # always level 1
}
# ``` = code
elsif ($bol and m/\G```[ \t]*\n(.*?)\n```[ \t]*(\n|$)/gcs) {
elsif ($bol and m/\G```[ \t]*\n(.*?)\n```[ \t]*(\n|$)/cgs) {
return CloseHtmlEnvironments() . $q->pre($1)
. AddHtmlEnvironment("p");
}
# [an example](http://example.com/ "Title")
elsif (m/\G\[(.+?)\]\($FullUrlPattern(\s+"(.+?)")?\)/goc) {
elsif (m/\G\[(.+?)\]\($FullUrlPattern(\s+"(.+?)")?\)/cg) {
my ($text, $url, $title) = ($1, $2, $4);
$url =~ /^($UrlProtocols)/;
my %params;


@@ -142,18 +142,18 @@ sub MarkupTag {
}
sub MarkupRule {
if ($bol and %MarkupLines and m/$markup_lines_re/gc) {
if ($bol and %MarkupLines and m/$markup_lines_re/cg) {
my ($tag, $str) = ($1, $2);
$str = $q->span($tag) . $str;
while (m/$markup_lines_re/gc) {
while (m/$markup_lines_re/cg) {
$str .= $q->span($1) . $2;
}
return CloseHtmlEnvironments()
. MarkupTag($MarkupLines{UnquoteHtml($tag)}, $str)
. AddHtmlEnvironment('p');
} elsif (%MarkupSingles and m/$markup_singles_re/gc) {
} elsif (%MarkupSingles and m/$markup_singles_re/cg) {
return $MarkupSingles{UnquoteHtml($1)};
} elsif (%MarkupForcedPairs and m/$markup_forced_pairs_re/gc) {
} elsif (%MarkupForcedPairs and m/$markup_forced_pairs_re/cg) {
my $tag = $1;
my $start = $tag;
my $end = $tag;
@@ -168,20 +168,20 @@ sub MarkupRule {
$endre .= '[ \t]*\n?' if $block_element{$start}; # skip trailing whitespace if block
# may match the empty string, or multiple lines, but may not span
# paragraphs.
if ($endre and m/\G$endre/gc) {
if ($endre and m/\G$endre/cg) {
return $tag . $end;
} elsif ($tag eq $end && m/\G((:?.+?\n)*?.+?)$endre/gc) { # may not span paragraphs
} elsif ($tag eq $end && m/\G((:?.+?\n)*?.+?)$endre/cg) { # may not span paragraphs
return MarkupTag($data, $1);
} elsif ($tag ne $end && m/\G((:?.|\n)+?)$endre/gc) {
} elsif ($tag ne $end && m/\G((:?.|\n)+?)$endre/cg) {
return MarkupTag($data, $1);
} else {
return $tag;
}
} elsif (%MarkupPairs and m/$markup_pairs_re/gc) {
} elsif (%MarkupPairs and m/$markup_pairs_re/cg) {
return MarkupTag($MarkupPairs{UnquoteHtml($1)}, $2);
} elsif ($MarkupPairs{'/'} and m|\G~/|gc) {
} elsif ($MarkupPairs{'/'} and m|\G~/|cg) {
return '~/'; # fix ~/elisp/ example
} elsif ($MarkupPairs{'/'} and m|\G(/[-A-Za-z0-9\x{0080}-\x{fffd}/]+/$words/)|gc) {
} elsif ($MarkupPairs{'/'} and m|\G(/[-A-Za-z0-9\x{0080}-\x{fffd}/]+/$words/)|cg) {
return $1; # fix /usr/share/lib/! example
}
# "foo


@@ -68,7 +68,7 @@ sub BisectInitialScreen {
}
print $q->submit(-name=>'bad', -value=>T('Start'));
} else {
print T('Biscecting proccess is already active.'), $q->br();
print T('Bisecting process is already active.'), $q->br();
print $q->submit(-name=>'stop', -value=>T('Stop'));
}
print $q->end_form();
@@ -102,7 +102,7 @@ sub BisectProcess {
print $q->end_form();
return;
}
print T('Module count (only testable modules): '), $q->strong(scalar @files), $q->br();
print T('Module count (only testable modules):'), ' ', $q->strong(scalar @files), $q->br();
print $q->br(), T('Current module statuses:'), $q->br();
my $halfsize = ($end - $start + 1) / 2.0; # + 1 because it is count
$end -= int($halfsize) unless $isGood;


@@ -64,13 +64,13 @@ sub MoinListLevel {
sub MoinRule {
# ["free link"]
if (m/\G(\["(.*?)"\])/gcs) {
if (m/\G(\["(.*?)"\])/cgs) {
Dirty($1);
print GetPageOrEditLink($2);
return '';
}
# [[BR]]
elsif (m/\G\[\[BR\]\]/gc) {
elsif (m/\G\[\[BR\]\]/cg) {
return $q->br();
}
# {{{


@@ -46,13 +46,13 @@ sub NewMultiUrlBannedContent {
sub MultiUrlBannedContent {
my $str = shift;
my @urls = $str =~ /$FullUrlPattern/go;
my @urls = $str =~ /$FullUrlPattern/g;
my %domains;
my %whitelist;
my $max = 0;
my $label = '[a-z]([a-z0-9-]*[a-z0-9])?'; # RFC 1034
foreach (split(/\n/, GetPageContent($MultiUrlWhiteList))) {
next unless m/^\s*($label\.$label)/io;
next unless m/^\s*($label\.$label)/i;
$whitelist{$1} = 1;
}
foreach my $url (@urls) {


@@ -227,7 +227,7 @@ sub NewNamespaceGetRcLines { # starttime, hash of seen pages to use as a second
foreach my $file (@rcfiles) {
open(my $F, '<:encoding(UTF-8)', $file);
my $line = <$F>;
my ($ts) = split(/$FS/o, $line); # the first timestamp in the regular rc file
my ($ts) = split(/$FS/, $line); # the first timestamp in the regular rc file
my @new;
if (not $ts or $ts > $starttime) { # we need to read the old rc file, too
push(@new, GetRcLinesFor($rcoldfiles{$file}, $starttime,\%match, \%following));
@@ -381,7 +381,8 @@ sub NewNamespaceBrowsePage {
#REDIRECT into different namespaces
my ($id, $raw, $comment, $status) = @_;
OpenPage($id);
my ($text, $revision) = GetTextRevision(GetParam('revision', ''), 1);
my ($revisionPage, $revision) = GetTextRevision(GetParam('revision', ''), 1);
my $text = $revisionPage->{text};
my $oldId = GetParam('oldid', '');
if (not $oldId and not $revision and (substr($text, 0, 10) eq '#REDIRECT ')
and (($WikiLinks and $text =~ /^\#REDIRECT\s+(($InterSitePattern:)?$InterLinkPattern)/)


@@ -185,7 +185,7 @@ sub NewNearLinksResolveId {
my $id = shift;
my @result = OldNearLinksResolveId($id, @_);
my %forbidden = map { $_ => 1 } @UserGotoBarPages, %AdminPages;
$forbidden{$id} = 1 if $CommentsPrefix and $id =~ /^$CommentsPrefix/o;
$forbidden{$id} = 1 if $CommentsPrefix and $id =~ /^$CommentsPrefix/;
if (not $result[1] and $NearSource{$id} and not $forbidden{$id}) {
$NearLinksUsed{$id} = 1;
my $site = $NearSource{$id}[0];
@@ -264,18 +264,18 @@ sub SearchNearPages {
if (%NearSearch and GetParam('near', 1) > 1 and GetParam('context',1)) {
foreach my $site (keys %NearSearch) {
my $url = $NearSearch{$site};
$url =~ s/\%s/UrlEncode($string)/ge or $url .= UrlEncode($string);
$url =~ s/\%s/UrlEncode($string)/eg or $url .= UrlEncode($string);
print $q->hr(), $q->p(Ts('Fetching results from %s:', $q->a({-href=>$url}, $site)))
unless GetParam('raw', 0);
my $data = GetRaw($url);
my @entries = split(/\n\n+/, $data);
shift @entries; # skip head
foreach my $entry (@entries) {
my %entry = ParseData($entry); # need to pass reference
my $name = $entry{title};
my $entryPage = ParseData($entry); # need to pass reference
my $name = $entryPage->{title};
next if $found{$name}; # do not duplicate local pages
$found{$name} = 1;
PrintSearchResultEntry(\%entry, $regex); # with context and full search!
PrintSearchResultEntry($entryPage, $regex); # with context and full search!
}
}
}


@@ -24,8 +24,8 @@ our ($q, @MyRules, $FullUrlPattern, $UrlProtocols, $BracketText);
push(@MyRules, \&NewWindowLink);
sub NewWindowLink {
# compare sub LinkRules in oddmuse.pl
if ($BracketText && m/\G(\[new:$FullUrlPattern\s+([^\]]+?)\])/cog
or m/\G(\[new:$FullUrlPattern\])/cog) {
if ($BracketText && m/\G(\[new:$FullUrlPattern\s+([^\]]+?)\])/cg
or m/\G(\[new:$FullUrlPattern\])/cg) {
my ($url, $text) = ($2, $3);
$url =~ /^($UrlProtocols)/;
my $class = "url $1"; # get protocol (http, ftp, ...)


@@ -25,7 +25,7 @@ sub NewGetSearchLink {
my ($text, $class, $name, $title) = @_;
$name = UrlEncode($name);
$text =~ s/_/ /g;
return $q->span({-class=>$class }, $text);
return $q->span({-class=>$class}, $text);
}
push(@MyAdminCode, \&BacklinksMenu);
@@ -34,8 +34,8 @@ sub BacklinksMenu {
if ($id) {
my $text = T('Backlinks');
my $class = 'backlinks';
my $name = "backlinks";
my $title = T("Click to search for references to this page");
my $name = 'backlinks';
my $title = T('Click to search for references to this page');
my $link = ScriptLink('search=' . $id, $text, $class, $name, $title);
push(@$menuref, $link);
}


@@ -25,8 +25,8 @@ push(@MyRules, \&NumberedListRule);
sub NumberedListRule {
# numbered lists using # copied from usemod.pl but allow leading
# whitespace
if ($bol && m/\G(\s*\n)*\s*(\#+)[ \t]/cog
or InElement('li') && m/\G(\s*\n)+\s*(\#+)[ \t]/cog) {
if ($bol && m/\G(\s*\n)*\s*(\#+)[ \t]/cg
or InElement('li') && m/\G(\s*\n)+\s*(\#+)[ \t]/cg) {
return CloseHtmlEnvironmentUntil('li')
. OpenHtmlEnvironment('ol',length($2))
. AddHtmlEnvironment('li');


@@ -78,7 +78,7 @@ sub DoManifest {
# print ScriptUrl($id) . "\n" unless $IndexHash{$id};
# }
# External CSS
print $StyleSheet . "\n" if $StyleSheet;
print ref $StyleSheet ? join("\n", @$StyleSheet) . "\n" : "$StyleSheet\n" if $StyleSheet;
# FIXME: $StyleSheetPage
# FIXME: external images, stuff in $HtmlHeaders
# Error message all the stuff that's not available offline.
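The manifest hunk above now accepts `$StyleSheet` as either a single URL or an array reference, matching the multi-stylesheet commit in this comparison. A small sketch of the scalar-or-arrayref normalization idiom:

```perl
use strict;
use warnings;

# $StyleSheet may be one URL, an arrayref of URLs, or unset;
# normalize it to a plain list either way.
sub stylesheet_urls {
    my $sheet = shift;
    return ref $sheet eq 'ARRAY' ? @$sheet : defined $sheet ? ($sheet) : ();
}

my @one  = stylesheet_urls('http://example.org/test.css');
my @many = stylesheet_urls(['http://example.org/a.css',
                            'http://example.org/b.css']);
print scalar(@one), ' ', scalar(@many), "\n";   # 1 2
```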


@@ -99,9 +99,9 @@ sub LocalMapWorkHorse {
my $retval_children = '';
if ($depth > 0) {
my %data = ParseData(ReadFileOrDie(GetPageFile($id)));
my @flags = split(/$FS/, $data{'flags'});
my @blocks = split(/$FS/, $data{'blocks'});
my $data = ParseData(ReadFileOrDie(GetPageFile($id)));
my @flags = split(/$FS/, $data->{'flags'});
my @blocks = split(/$FS/, $data->{'blocks'});
my @subpages;
# Iterate over blocks, operate only on "dirty" ones
@@ -111,14 +111,14 @@ sub LocalMapWorkHorse {
local $_ = $blocks[$i];
if ($WikiLinks
&& ($BracketWiki && m/\G(\[$LinkPattern\s+([^\]]+?)\])/cog
or m/\G(\[$LinkPattern\])/cog or m/\G($LinkPattern)/cog)) {
&& ($BracketWiki && m/\G(\[$LinkPattern\s+([^\]]+?)\])/cg
or m/\G(\[$LinkPattern\])/cg or m/\G($LinkPattern)/cg)) {
$sub_id = $1;
} elsif ($FreeLinks
&& (($BracketWiki
&& m/\G(\[\[($FreeLinkPattern)\|([^\]]+)\]\])/cog)
or m/\G(\[\[\[($FreeLinkPattern)\]\]\])/cog
or m/\G(\[\[($FreeLinkPattern)\]\])/cog)) {
&& m/\G(\[\[($FreeLinkPattern)\|([^\]]+)\]\])/cg)
or m/\G(\[\[\[($FreeLinkPattern)\]\]\])/cg
or m/\G(\[\[($FreeLinkPattern)\]\])/cg)) {
$sub_id = $2;
}


@@ -27,7 +27,7 @@ my $org_emph_re = qr!\G([ \t('\"])*(([*/_=+])([^ \t\r\n,*/_=+].*?(?:\n.*?){0,1}[
my %org_emphasis_alist = qw!* b / i _ u = code + del!;
sub OrgModeRule {
if (/$org_emph_re/cgo) {
if (/$org_emph_re/cg) {
my $tag = $org_emphasis_alist{$3};
return "$1<$tag>$4</$tag>$5";
}


@@ -54,7 +54,7 @@ sub UpdatePageTrail {
sub NewPageTrailGetGotoBar {
my $bar = OldPageTrailGetGotoBar(@_);
$bar .= $q->span({-class=>'trail'}, $q->br(), T('Trail: '),
$bar .= $q->span({-class=>'trail'}, $q->br(), T('Trail:') . ' ',
map { GetPageLink($_) } reverse(@PageTrail))
if @PageTrail;
return $bar;


@@ -28,7 +28,7 @@ push(@MyRules, \&ParagraphLinkRule);
$RuleOrder{\&ParagraphLinkRule} = 100;
sub ParagraphLinkRule {
if ($bol && m/\G(\[(-)?$FreeLinkPattern\])/cog) {
if ($bol && m/\G(\[(-)?$FreeLinkPattern\])/cg) {
Dirty($1);
my $invisible = $2;
my $orig = $3;


@@ -27,7 +27,7 @@ our ($q, %Page, @MyRules, $CommentsPrefix);
push(@MyRules, \&PartialCutRule);
sub PartialCutRule {
if (m/\G(?<=\n)\s*--\s*cut\s*--\s*(?=\n)/gc) {
if (m/\G(?<=\n)\s*--\s*cut\s*--\s*(?=\n)/cg) {
return CloseHtmlEnvironments() . '<hr class="cut" />' . AddHtmlEnvironment('p');
}
return;


@@ -78,7 +78,7 @@ push(@MyRules, \&PermanentAnchorsRule);
sub PermanentAnchorsRule {
my ($locallinks, $withanchors) = @_;
if (m/\G(\[::$FreeLinkPattern\])/cog) {
if (m/\G(\[::$FreeLinkPattern\])/cg) {
#[::Free Link] permanent anchor create only $withanchors
Dirty($1);
if ($withanchors) {
@@ -99,7 +99,7 @@ sub GetPermanentAnchor {
ScriptLink(UrlEncode($resolved), $text, 'alias')) . ']';
} elsif ($PermanentAnchors{$id} ne $OpenPageName
# 10 tries, 3 second wait, die on error
and RequestLockDir('permanentanchors', 1, 10, 3, 1)) {
and RequestLockDir('permanentanchors', 10, 3, 1)) {
# Somebody may have added a permanent anchor in the mean time.
# Comparing $LastUpdate to the $IndexFile mtime does not work for
# subsecond changes and updates are rare, so just reread the file!
@@ -190,7 +190,7 @@ sub NewPermanentAnchorsSave {
sub DeletePermanentAnchors {
my $id = shift;
# 10 tries, 3 second wait, die on error
RequestLockDir('permanentanchors', 1, 10, 3, 1);
RequestLockDir('permanentanchors', 10, 3, 1);
foreach (keys %PermanentAnchors) {
if ($PermanentAnchors{$_} eq $id and !$PagePermanentAnchors{$_}) {
delete($PermanentAnchors{$_}) ;
@@ -246,7 +246,7 @@ sub NewPermanentAnchorsGetPageContent {
if (not $result and $PermanentAnchors{$id}) {
$result = OldPermanentAnchorsGetPageContent($PermanentAnchors{$id});
$result =~ s/^(.*\n)*.*\[::$id\]// or return '';
$result =~ s/(\n=|\n----|\[::$FreeLinkPattern\])(.*\n)*.*$//o;
$result =~ s/(\n=|\n----|\[::$FreeLinkPattern\])(.*\n)*.*$//;
}
return $result;
}


@@ -45,7 +45,7 @@ sub CommentFooterLink {
my @elements;
if ($id and $rev ne 'history' and $rev ne 'edit') {
if ($CommentsPrefix) {
if ($OpenPageName =~ /^$CommentsPrefix(.*)/o) {
if ($OpenPageName =~ /^$CommentsPrefix(.*)/) {
push(@elements, GetPageLink($1, undef, 'original'));
} else {
push(@elements, GetPageLink($CommentsPrefix . $OpenPageName, undef, 'comment'));


@@ -21,8 +21,8 @@ AddModuleDescription('portrait-support.pl', 'Portraits Support Extension');
our ($q, $bol, $Now, @MyMacros, @MyRules, $FreeLinkPattern, $UrlPattern, $FS);
push(@MyMacros, sub{ s/\[new::\]/"[new:" . GetParam('username', T('Anonymous'))
. ':' . TimeToText($Now) . "]"/ge });
push(@MyMacros, sub{ s/\[new:$FreeLinkPattern\]/"[new:$1:" . TimeToText($Now) . "]"/ge });
. ':' . TimeToText($Now) . "]"/eg });
push(@MyMacros, sub{ s/\[new:$FreeLinkPattern\]/"[new:$1:" . TimeToText($Now) . "]"/eg });
push(@MyRules, \&PortraitSupportRule);
@@ -41,9 +41,9 @@ sub PortraitSupportRule {
. $q->hr() . AddHtmlEnvironment('p');
$PortraitSupportColorDiv = 0;
return $html;
} elsif ($bol && m/\Gportrait:$UrlPattern/gc) {
} elsif ($bol && m/\Gportrait:$UrlPattern/cg) {
return $q->img({-src=>$1, -alt=>T("Portrait"), -class=>'portrait'});
} elsif ($bol && m/\G(:*)\[new(.*)\]/gc) {
} elsif ($bol && m/\G(:*)\[new(.*)\]/cg) {
my $portrait = '';
my $depth = length($1);
my ($ignore, $name, $time) = split(/:/, $2, 3);

65
modules/preview.pl Normal file

@@ -0,0 +1,65 @@
# Copyright (C) 2015 Alex Schroeder <alex@gnu.org>
#
# This program is free software; you can redistribute it and/or modify it under
# the terms of the GNU General Public License as published by the Free Software
# Foundation; either version 3 of the License, or (at your option) any later
# version.
#
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
# FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License along with
# this program. If not, see <http://www.gnu.org/licenses/>.
use strict;
use v5.10;
=head1 Preview Extension
This module allows you to preview changes in HTML output. Oddmuse keeps a cache
of the HTML produced for each wiki page. If you install new modules, the HTML
produced for a page can change, but visitors will not see the new HTML because
the cache still contains the old HTML. Now you have two options: 1. clear the
HTML cache (Oddmuse will regenerate it as visitors look at the old pages), or 2.
edit each page in order to regenerate the HTML cache. What happens in practice
is that you add new modules and you aren't sure whether this breaks old pages
and so you don't dare clear the HTML cache. Let sleeping dogs lie.
The Preview Extension produces a list of all the pages where the cached HTML
differs from the HTML produced by the current set of modules. If you agree with
the new HTML, feel free to clear the HTML cache. That's the point of this
extension.
=cut
our ($q, %Action, $UseCache, @MyAdminCode);
AddModuleDescription('preview.pl', 'Preview Extension');
$Action{preview} = \&DoPreview;
sub DoPreview {
print GetHeader('', T('Pages with changed HTML'));
print $q->start_div({-class=>'content preview'}), $q->start_p();
foreach my $id (AllPagesList()) {
OpenPage($id);
my $cache = ToString(\&PrintPageHtml);
local $UseCache = 0;
my $html = ToString(\&PrintPageHtml);
if ($cache ne $html) {
print GetPageLink($id), ' ',
ScriptLink("action=browse;id=$id;cache=0", T('Preview')),
$q->br();
}
}
print $q->end_p(), $q->end_div();
PrintFooter();
}
push(@MyAdminCode, \&PreviewMenu);
sub PreviewMenu {
my ($id, $menuref, $restref) = @_;
push(@$menuref, ScriptLink('action=preview', T('Preview changes in HTML output'), 'preview'));
}


@@ -69,8 +69,8 @@ sub NewPrivatePagesUserCanEdit {
my $result = OldPrivatePagesUserCanEdit($id, $editing, @rest);
# bypass OpenPage and GetPageContent (these are redefined below)
if ($result > 0 and $editing and $IndexHash{$id}) {
my %data = ParseData(ReadFileOrDie(GetPageFile($id)));
if (PrivatePageLocked($data{text})) {
my $data = ParseData(ReadFileOrDie(GetPageFile($id)));
if (PrivatePageLocked($data->{text})) {
return 0;
}
}
@@ -128,11 +128,11 @@ sub NewPrivatePagesGetPageContent {
*GetTextRevision = \&NewPrivatePagesGetTextRevision;
sub NewPrivatePagesGetTextRevision {
my ($text, $revision) = OldPrivatePagesGetTextRevision(@_);
if (PrivatePageLocked($text)) {
return (NewPrivatePageNewText(), $revision);
my ($page, $revision) = OldPrivatePagesGetTextRevision(@_);
if (PrivatePageLocked($page->{text})) {
return ({text => NewPrivatePageNewText()}, $revision); # XXX faking a page object like this is not good
}
return ($text, $revision);
return wantarray ? ($page, $revision) : $page;
}
# hide #PASSWORD
@@ -140,7 +140,7 @@ sub NewPrivatePagesGetTextRevision {
push(@MyRules, \&PrivatePageRule);
sub PrivatePageRule {
if (pos == 0 && m/\G#PASSWORD.*\n/gc) {
if (pos == 0 && m/\G#PASSWORD.*\n/cg) {
return '';
}
return;
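`NewPrivatePagesGetTextRevision` above now returns `($page, $revision)` in list context but just the page in scalar context, via `wantarray`, so both old and new call sites keep working. A standalone sketch of that calling-convention-preserving pattern (the page hash here is a hypothetical stand-in):

```perl
use strict;
use warnings;

# List context gets the pair, scalar context gets the primary value,
# so both `my ($p, $r) = f()` and `my $p = f()` keep working.
sub get_text_revision {
    my $page     = { text => 'Hello' };   # stand-in page object
    my $revision = 7;
    return wantarray ? ($page, $revision) : $page;
}

my ($page, $rev) = get_text_revision();   # list context: both values
my $only         = get_text_revision();   # scalar context: page only
print "$page->{text} $rev $only->{text}\n";   # Hello 7 Hello
```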


@@ -21,7 +21,7 @@ use Crypt::Random::Seed;
AddModuleDescription('private-wiki.pl', 'Private Wiki Extension');
our ($q, $FS, @IndexList, %IndexHash, $IndexFile, $TempDir, $KeepDir);
our ($q, $FS, @IndexList, %IndexHash, $IndexFile, $TempDir, $KeepDir, %LockCleaners, $ShowAll);
my ($cipher, $random);
my $PrivateWikiInitialized = '';
@@ -216,6 +216,7 @@ sub DoDiff { # Actualy call the diff program
my $oldName = "$TempDir/old";
my $newName = "$TempDir/new";
RequestLockDir('diff') or return '';
$LockCleaners{'diff'} = sub { unlink $oldName if -f $oldName; unlink $newName if -f $newName; };
OldPrivateWikiWriteStringToFile($oldName, $_[0]); # CHANGED Here we use the old sub!
OldPrivateWikiWriteStringToFile($newName, $_[1]); # CHANGED
my $diff_out = `diff -- \Q$oldName\E \Q$newName\E`;
@@ -235,6 +236,9 @@ sub MergeRevisions { # merge change from file2 to file3 into file1
my ($name1, $name2, $name3) = ("$TempDir/file1", "$TempDir/file2", "$TempDir/file3");
CreateDir($TempDir);
RequestLockDir('merge') or return T('Could not get a lock to merge!');
$LockCleaners{'merge'} = sub { # CHANGED
unlink $name1 if -f $name1; unlink $name2 if -f $name2; unlink $name3 if -f $name3;
};
OldPrivateWikiWriteStringToFile($name1, $file1); # CHANGED
OldPrivateWikiWriteStringToFile($name2, $file2); # CHANGED
OldPrivateWikiWriteStringToFile($name3, $file3); # CHANGED
@@ -246,25 +250,6 @@ sub MergeRevisions { # merge change from file2 to file3 into file1
return $output;
}
*OldPrivateWikiCleanLock = \&CleanLock;
*CleanLock = \&NewPrivateWikiCleanLock;
sub NewPrivateWikiCleanLock {
my ($name) = @_;
if ($name eq 'diff') {
my $oldName = "$TempDir/old";
my $newName = "$TempDir/new";
unlink $oldName if -f $oldName;
unlink $newName if -f $newName;
} elsif ($name eq 'merge') {
my ($name1, $name2, $name3) = ("$TempDir/file1", "$TempDir/file2", "$TempDir/file3");
unlink $name1 if -f $name1;
unlink $name2 if -f $name2;
unlink $name3 if -f $name3;
}
OldPrivateWikiCleanLock(@_);
}
# Surge protection has to be unencrypted because in the context of this module
# it is a tool against people who have no password set (thus we have no key
# to do encryption).
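The hunk above deletes a hand-rolled `CleanLock` override in favor of `%LockCleaners`: each lock name maps to a cleanup callback that the core runs when the lock is released. A minimal sketch of that callback-registry shape (the names mirror the diff, but the core wiring shown here is an assumption):

```perl
use strict;
use warnings;

my %LockCleaners;   # lock name => coderef run at cleanup time

sub CleanLock {     # simplified stand-in for the core routine
    my $name = shift;
    $LockCleaners{$name}->() if $LockCleaners{$name};
    return "released $name";
}

my @removed;
# A module registers its temp-file cleanup instead of overriding CleanLock:
$LockCleaners{diff} = sub { push @removed, 'old', 'new' };
print CleanLock('diff'), ': ', join(',', @removed), "\n";
# released diff: old,new
```

This keeps each module's cleanup next to the code that creates the temp files, instead of one ever-growing override that must know about every lock.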
@@ -277,7 +262,7 @@ sub ReadRecentVisitors {
%RecentVisitors = ();
return unless $status;
foreach (split(/\n/, $data)) {
my @entries = split /$FS/o;
my @entries = split /$FS/;
my $name = shift(@entries);
$RecentVisitors{$name} = \@entries if $name;
}
@@ -340,7 +325,7 @@ sub GetRcLines { # starttime, hash of seen pages to use as a second return value
open my $F, '<:encoding(UTF-8)', \$filelike or die $!; # CHANGED
my $line = <$F>;
my ($ts) = split(/$FS/o, $line); # the first timestamp in the regular rc file
my ($ts) = split(/$FS/, $line); # the first timestamp in the regular rc file
if (not $ts or $ts > $starttime) { # we need to read the old rc file, too
push(@result, GetRcLinesFor($RcOldFile, $starttime, \%match, \%following));
}
@@ -358,7 +343,7 @@ sub GetRcLinesFor {
my %following = %{$_[1]}; # deref
# parameters
my $showminoredit = GetParam('showedit', $ShowEdits); # show minor edits
my $all = GetParam('all', 0);
my $all = GetParam('all', $ShowAll);
my ($idOnly, $userOnly, $hostOnly, $clusterOnly, $filterOnly, $match, $lang,
$followup) = map { UnquoteHtml(GetParam($_, '')); }
qw(rcidonly rcuseronly rchostonly
@@ -372,7 +357,7 @@ sub GetRcLinesFor {
while (my $line = <$F>) {
chomp($line);
my ($ts, $id, $minor, $summary, $host, $username, $revision,
$languages, $cluster) = split(/$FS/o, $line);
$languages, $cluster) = split(/$FS/, $line);
next if $ts < $starttime;
$following{$id} = $ts if $followup and $followup eq $username;
next if $followup and (not $following{$id} or $ts <= $following{$id});
@@ -387,7 +372,7 @@ sub GetRcLinesFor {
next if $lang and @languages and not grep(/$lang/, @languages);
if ($PageCluster) {
($cluster, $summary) = ($1, $2) if $summary =~ /^\[\[$FreeLinkPattern\]\] ?: *(.*)/
or $summary =~ /^$LinkPattern ?: *(.*)/o;
or $summary =~ /^$LinkPattern ?: *(.*)/;
next if ($clusterOnly and $clusterOnly ne $cluster);
$cluster = '' if $clusterOnly; # don't show cluster if $clusterOnly eq $cluster
if ($all < 2 and not $clusterOnly and $cluster) {


@@ -44,7 +44,7 @@ sub RelationRead {
}
sub RelationRule {
if (m/\G((forward@@|backward@@|forward@|backward@):([_A-Za-z0-9 ]+?);)/gc) {
if (m/\G((forward@@|backward@@|forward@|backward@):([_A-Za-z0-9 ]+?);)/cg) {
Dirty($1);
my $rememberpos = pos;
my $fwbw =$2;


@@ -26,7 +26,7 @@ our (@MyRules);
push(@MyRules, \&SearchTagRule);
sub SearchTagRule {
if (m/\GTags:\s*(.+)/gc) {
if (m/\GTags:\s*(.+)/cg) {
my $tag_text = $1;
my @tags = split /,\s*/, $tag_text;
@tags = map {


@@ -41,7 +41,7 @@ push(@MyRules, \&SeTextRule);
my $word = '([-A-Za-z\x{0080}-\x{fffd}]+)';
sub SeTextRule {
my $oldpos = pos;
if ($bol && ((m/\G((.+?)[ \t]*\n(-+|=+)[ \t]*\n)/gc
if ($bol && ((m/\G((.+?)[ \t]*\n(-+|=+)[ \t]*\n)/cg
and (length($2) == length($3)))
or ((pos = $oldpos) and 0))) {
my $html = CloseHtmlEnvironments() . ($PortraitSupportColorDiv ? '</div>' : '');
@@ -52,17 +52,17 @@ sub SeTextRule {
}
$PortraitSupportColorDiv = 0;
return $html . AddHtmlEnvironment('p');
} elsif ($bol && m/\G((&gt; .*\n)+)/gc) {
} elsif ($bol && m/\G((&gt; .*\n)+)/cg) {
my $text = $1;
return CloseHtmlEnvironments() . $q->pre($text) . AddHtmlEnvironment('p');
} elsif (m/\G\*\*($word( $word)*)\*\*/goc) {
} elsif (m/\G\*\*($word( $word)*)\*\*/cg) {
return "<b>$1</b>";
} elsif (m/\G~$word~/goc) {
} elsif (m/\G~$word~/cg) {
return "<i>$1</i>";
} elsif (m/\G\b_($word(_$word)*)_\b/goc) {
} elsif (m/\G\b_($word(_$word)*)_\b/cg) {
return '<em style="text-decoration: underline; font-style: normal;">'
. join(' ', split(/_/, $1)) . "</em>"; # don't clobber pos
} elsif (m/\G`_(.+)_`/gc) {
} elsif (m/\G`_(.+)_`/cg) {
return $1;
}
return;


@@ -26,17 +26,17 @@ our ($Now, @MyRules, @MyMacros);
push(@MyRules, \&SignatureExceptionRule);
push(@MyMacros, sub{ s/(?<![!+])\+\+\+\+/'-- ' . GetParam('username', T('Anonymous'))
. ' ' . TimeToText($Now) /ge });
push(@MyMacros, sub{ s/(?<![!+])\+\+\+/'-- ' . GetParam('username', T('Anonymous'))/ge });
. ' ' . TimeToText($Now) /eg });
push(@MyMacros, sub{ s/(?<![!+])\+\+\+/'-- ' . GetParam('username', T('Anonymous'))/eg });
push(@MyMacros, sub{ s/(?<![!+])\~\~\~\~/GetParam('username', T('Anonymous'))
. ' ' . TimeToText($Now) /ge });
push(@MyMacros, sub{ s/(?<![!~])\~\~\~/GetParam('username', T('Anonymous'))/ge });
. ' ' . TimeToText($Now) /eg });
push(@MyMacros, sub{ s/(?<![!~])\~\~\~/GetParam('username', T('Anonymous'))/eg });
sub SignatureExceptionRule {
if (m/\G!\+\+\+/gc) {
if (m/\G!\+\+\+/cg) {
return '+++';
} elsif (m/\G!\~\~\~/gc) {
} elsif (m/\G!\~\~\~/cg) {
return '~~~';
}
return;


@@ -68,7 +68,7 @@ sub NewSimpleRulesApplyRules {
}
($block =~ s/(\&lt;journal(\s+(\d*))?(\s+"(.*)")?(\s+(reverse))?\&gt;)/
my ($str, $num, $regexp, $reverse) = ($1, $3, $5, $7);
SimpleRulesDirty($str, sub { PrintJournal($num, $regexp, $reverse)});/ego);
SimpleRulesDirty($str, sub { PrintJournal($num, $regexp, $reverse)});/eg);
$result .= NewSimpleRulesApplyInlineRules($block);
}
}
@@ -78,14 +78,14 @@ sub NewSimpleRulesApplyRules {
sub NewSimpleRulesApplyInlineRules {
my ($block, $locallinks) = @_;
$block = NewSimpleRulesApplyDirtyInlineRules($block, $locallinks);
$block =~ s/$UrlPattern/SimpleRulesProtect($q->a({-href=>$1}, $1))/seg;
$block =~ s/$UrlPattern/SimpleRulesProtect($q->a({-href=>$1}, $1))/egs;
$block =~ s/~(\S+)~/SimpleRulesProtect($q->em($1))/eg;
$block =~ s/\*\*(.+?)\*\*/SimpleRulesProtect($q->strong($1))/seg;
$block =~ s/\/\/(.+?)\/\//SimpleRulesProtect($q->em($1))/seg;
$block =~ s/\_\_(.+?)\_\_/SimpleRulesProtect($q->u($1))/seg;
$block =~ s/\*(.+?)\*/SimpleRulesProtect($q->b($1))/seg;
$block =~ s/\/(.+?)\//SimpleRulesProtect($q->i($1))/seg;
$block =~ s/\_(.+?)\_/SimpleRulesProtect($q->u($1))/seg;
$block =~ s/\*\*(.+?)\*\*/SimpleRulesProtect($q->strong($1))/egs;
$block =~ s/\/\/(.+?)\/\//SimpleRulesProtect($q->em($1))/egs;
$block =~ s/\_\_(.+?)\_\_/SimpleRulesProtect($q->u($1))/egs;
$block =~ s/\*(.+?)\*/SimpleRulesProtect($q->b($1))/egs;
$block =~ s/\/(.+?)\//SimpleRulesProtect($q->i($1))/egs;
$block =~ s/\_(.+?)\_/SimpleRulesProtect($q->u($1))/egs;
return $block;
}
@@ -94,10 +94,10 @@ sub NewSimpleRulesApplyDirtyInlineRules {
if ($locallinks) {
($block =~ s/(\[\[$FreeLinkPattern\]\])/
my ($str, $link) = ($1, $2);
SimpleRulesDirty($str, GetPageOrEditLink($link,0,0,1))/ego);
SimpleRulesDirty($str, GetPageOrEditLink($link,0,0,1))/eg);
($block =~ s/(\[\[image:$FreeLinkPattern\]\])/
my ($str, $link) = ($1, $2);
SimpleRulesDirty($str, GetDownloadLink($link, 1))/ego);
SimpleRulesDirty($str, GetDownloadLink($link, 1))/eg);
}
return $block;
}
@@ -159,7 +159,7 @@ sub SimpleRulesMungeResult {
sub SimpleRulesUnprotect {
my $raw = shift;
$raw =~ s/$PROT([0-9]+)$PROT/$protected{$1}/ge
$raw =~ s/$PROT([0-9]+)$PROT/$protected{$1}/eg
while $raw =~ /$PROT([0-9]+)$PROT/; # find recursive replacements!
return $raw;
}


@@ -28,8 +28,8 @@ file for your Oddmuse Wiki.
=cut
our ($SmartTitlesBrowserTitle,
$SmartTitlesBrowserTitleWithoutSubtitle,
$SmartTitlesSubUrlText);
$SmartTitlesBrowserTitleWithoutSubtitle,
$SmartTitlesSubUrlText);
=head2 $SmartTitlesBrowserTitle
@@ -94,7 +94,7 @@ that point.
=cut
sub SmartTitlesRule {
return '' if m/\G(^|\n)?#(TITLE|SUBTITLE|SUBURL:?)[ \t]+(.*?)\s*(\n+|$)/cg;
return '' if m/\G (^|\n)? \#(TITLE|SUBTITLE|SUBURL) [ \t]+ (.*?) \s*(\n+|$) /cgx;
return;
}
@@ -112,10 +112,10 @@ extensions (namely, hibernal) to obtain the title and subtitle for pages.
=cut
sub GetSmartTitles {
my ($title) = $Page{text} =~ m/(?:^|\n)\#TITLE[ \t]+(.*?)\s*\n+/;
my ($subtitle) = $Page{text} =~ m/(?:^|\n)\#SUBTITLE[ \t]+(.*?)\s*\n+/;
my ($interlink, $suburl) = $Page{text} =~ m/(?:^|\n)\#SUBURL(:)?[ \t]+(.*?)\s*\n+/;
return ($title, $subtitle, $suburl, $interlink ? 1 : '');
my ($title) = $Page{text} =~ m/ (?:^|\n) \#TITLE [ \t]+ (.*?) \s*\n+ /x;
my ($subtitle) = $Page{text} =~ m/ (?:^|\n) \#SUBTITLE [ \t]+ (.*?) \s*\n+ /x;
my ($suburl) = $Page{text} =~ m/ (?:^|\n) \#SUBURL [ \t]+ (.*?) \s*\n+ /x;
return ($title, $subtitle, $suburl);
}
=head2 GetHeaderSmartTitles
@@ -127,27 +127,30 @@ within that passed page's Wiki content.
sub GetHeaderSmartTitles {
my ($page_name, $title, undef, undef, undef, undef, $subtitle) = @_;
my ($smart_title, $smart_subtitle, $smart_suburl, $smart_interlink);
my ($smart_title, $smart_subtitle, $smart_suburl);
my $html_header = GetHeaderSmartTitlesOld(@_);
if ($page_name) {
OpenPage($page_name);
$title = NormalToFree($title);
($smart_title, $smart_subtitle, $smart_suburl, $smart_interlink) = GetSmartTitles();
($smart_title, $smart_subtitle, $smart_suburl) = GetSmartTitles();
}
$smart_title ||= $title;
$smart_subtitle ||= $subtitle;
$smart_title = QuoteHtml($smart_title);
$smart_title = QuoteHtml($smart_title);
$smart_subtitle = QuoteHtml($smart_subtitle);
$smart_suburl = QuoteHtml($smart_suburl);
$html_header =~ s~\Q>$title</a>\E~>$smart_title</a>~g;
if ($smart_subtitle) {
my $subtitlehtml = '<p class="subtitle">' . $smart_subtitle;
if ($smart_suburl) {
$subtitlehtml .= $smart_interlink ? GetInterLink($smart_suburl, undef, 1, 1)
: GetUrl($smart_suburl, $SmartTitlesSubUrlText, 1);
# ApplyRules is too much, we just want links. LinkRules should be enough.
# $subtitlehtml .= ' ' . ToString(sub { ApplyRules($smart_suburl, 1, 1) }) if $smart_suburl;
$_ = $smart_suburl;
$subtitlehtml .= ' ' . ToString(sub {LinkRules(1)});
}
$html_header =~ s~\Q</h1>\E~</h1>$subtitlehtml</p>~;
}
@@ -172,7 +175,7 @@ sub GetHeaderSmartTitles {
The information below applies to everything in this distribution,
except where noted.
Copyright 2014 Alex-Daniel Jakimenko <alex.jakimenko@gmail.com>
Copyright 2014-2015 Alex-Daniel Jakimenko <alex.jakimenko@gmail.com>
Copyleft 2008 by B.w.Curry <http://www.raiazome.com>.
Copyright 2006 by Charles Mauch <mailto://cmauch@gmail.com>.


@@ -117,9 +117,9 @@ sub StaticFileName {
my ($status, $data) = ReadFile(GetPageFile(UrlDecode($id)));
# If the link points to a wanted page, we cannot make this static.
return $id unless $status;
my %hash = ParseData($data);
my $hash = ParseData($data);
my $ext = '.html';
if ($hash{text} =~ /^\#FILE ([^ \n]+ ?[^ \n]*)\n(.*)/s) {
if ($hash->{text} =~ /^\#FILE ([^ \n]+ ?[^ \n]*)\n(.*)/s) {
%StaticMimeTypes = StaticMimeTypes() unless %StaticMimeTypes;
$ext = $StaticMimeTypes{"$1"};
$ext = '.' . $ext if $ext;
@@ -232,7 +232,11 @@ EOT
sub StaticWriteCss {
my $css;
if ($StyleSheet) {
$css = GetRaw($StyleSheet);
if (ref $StyleSheet) {
$css = join '', map { GetRaw($_) } @$StyleSheet;
} else {
$css = GetRaw($StyleSheet);
}
}
if (not $css and $IndexHash{$StyleSheetPage}) {
$css = GetPageContent($StyleSheetPage);
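The StaticWriteCss change above accepts either form of $StyleSheet. A minimal config sketch, using the placeholder URLs from the commit message (example.org addresses are illustrative, not real stylesheets):

```perl
# $StyleSheet may be a single URL string, or a reference to an array of
# URLs; with the arrayref form, StaticWriteCss fetches each stylesheet
# via GetRaw() and concatenates the results.
our $StyleSheet;

# single stylesheet, as before:
$StyleSheet = 'http://example.org/test.css';

# or several stylesheets at once:
$StyleSheet = ['http://example.org/test.css',
               'http://example.org/another.css'];
```

The `ref $StyleSheet` test in the hunk is what distinguishes the two cases: `ref` returns 'ARRAY' for the arrayref and the empty string for a plain scalar.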


@@ -113,9 +113,9 @@ sub StaticFileName {
return $StaticFiles{$id} if $StaticFiles{$id}; # cache filenames
my ($status, $data) = ReadFile(GetPageFile(StaticUrlDecode($id)));
print "cannot read " . GetPageFile(StaticUrlDecode($id)) . $q->br() unless $status;
my %hash = ParseData($data);
my $hash = ParseData($data);
my $ext = '.html';
if ($hash{text} =~ /^\#FILE ([^ \n]+)\n(.*)/s) {
if ($hash->{text} =~ /^\#FILE ([^ \n]+)\n(.*)/s) {
$ext = $StaticMimeTypes{$1};
$ext = '.' . $ext if $ext;
}
@@ -125,7 +125,7 @@ sub StaticFileName {
sub StaticUrlDecode {
my $str = shift;
$str =~ s/%([0-9a-f][0-9a-f])/chr(hex($1))/ge;
$str =~ s/%([0-9a-f][0-9a-f])/chr(hex($1))/eg;
return $str;
}
@@ -410,7 +410,7 @@ sub StaticNewDoRollback {
my @ids = ();
if (not $page) { # cannot just use list length because of ('')
return unless UserIsAdminOrError(); # only admins can do mass changes
my %ids = map { my ($ts, $id) = split(/$FS/o); $id => 1; } # make unique via hash
my %ids = map { my ($ts, $id) = split(/$FS/); $id => 1; } # make unique via hash
GetRcLines($Now - $KeepDays * 86400, 1); # 24*60*60
@ids = keys %ids;
} else {
@@ -445,15 +445,15 @@ sub StaticNewDespamPage {
# from DoHistory()
my @revisions = sort {$b <=> $a} map { m|/([0-9]+).kp$|; $1; } GetKeepFiles($OpenPageName);
foreach my $revision (@revisions) {
my ($text, $rev) = GetTextRevision($revision, 1); # quiet
my ($revisionPage, $rev) = GetTextRevision($revision, 1); # quiet
if (not $rev) {
print ': ' . Ts('Cannot find revision %s.', $revision);
return;
} elsif (not DespamBannedContent($text)) {
} elsif (not DespamBannedContent($revisionPage->{text})) {
my $summary = Tss('Revert to revision %1: %2', $revision, $rule);
print ': ' . $summary;
Save($OpenPageName, $text, $summary) unless GetParam('debug', 0);
StaticDeleteFile($OpenPageName);
Save($OpenPageName, $revisionPage->{text}, $summary) unless GetParam('debug', 0);
StaticDeleteFile($OpenPageName);
return;
}
}
@@ -461,7 +461,7 @@ sub StaticNewDespamPage {
my $summary = Ts($rule). ' ' . Ts('Marked as %s.', $DeletedPage);
print ': ' . $summary;
Save($OpenPageName, $DeletedPage, $summary) unless GetParam('debug', 0);
StaticDeleteFile($OpenPageName);
StaticDeleteFile($OpenPageName);
} else {
print ': ' . T('Cannot find unspammed revision.');
}


@@ -1,57 +0,0 @@
# Copyright (C) 2006-2015 Alex Schroeder <alex@gnu.org>
#
# This program is free software; you can redistribute it and/or modify it under
# the terms of the GNU General Public License as published by the Free Software
# Foundation; either version 3 of the License, or (at your option) any later
# version.
#
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
# FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License along with
# this program. If not, see <http://www.gnu.org/licenses/>.
use strict;
use v5.10;
AddModuleDescription('strange-spam.pl', 'StrangeBannedContent');
our (%AdminPages, $OpenPageName, @MyInitVariables, %LockOnCreation, %PlainTextPages, $BannedContent);
our ($StrangeBannedContent);
$StrangeBannedContent = 'StrangeBannedContent';
*StrangeOldBannedContent = \&BannedContent;
*BannedContent = \&StrangeNewBannedContent;
push(@MyInitVariables, \&StrangeBannedContentInit);
sub StrangeBannedContentInit {
$LockOnCreation{$StrangeBannedContent} = 1;
$AdminPages{$StrangeBannedContent} = 1;
$PlainTextPages{$StrangeBannedContent} = 1;
}
sub StrangeNewBannedContent {
my $str = shift;
my $rule = StrangeOldBannedContent($str, @_);
return $rule if $rule;
# changes here have effects on despam.pl!
foreach (split(/\n/, GetPageContent($StrangeBannedContent))) {
next unless m/^\s*([^#]+?)\s*(#\s*(\d\d\d\d-\d\d-\d\d\s*)?(.*))?$/;
my ($regexp, $comment) = ($1, $4);
if ($str =~ /($regexp)/ or $OpenPageName =~ /($regexp)/) {
my $match = $1;
$match =~ s/\n/ /g;
return Tss('Rule "%1" matched "%2" on this page.', QuoteHtml($regexp),
QuoteHtml($match)) . ' '
. ($comment
? Ts('Reason: %s.', $comment)
: T('Reason unknown.')) . ' '
. Ts('See %s for more information.',
GetPageLink($StrangeBannedContent));
}
}
return 0;
}


@@ -27,11 +27,11 @@ push(@MyRules, \&SubscribedRecentChangesRule);
sub SubscribedRecentChangesRule {
if ($bol) {
if (m/\GMy\s+subscribed\s+pages:\s*((?:(?:$LinkPattern|\[\[$FreeLinkPattern\]\]),\s*)+)categories:\s*((?:(?:$LinkPattern|\[\[$FreeLinkPattern\]\]),\s*)*(?:$LinkPattern|\[\[$FreeLinkPattern\]\]))/gc) {
if (m/\GMy\s+subscribed\s+pages:\s*((?:(?:$LinkPattern|\[\[$FreeLinkPattern\]\]),\s*)+)categories:\s*((?:(?:$LinkPattern|\[\[$FreeLinkPattern\]\]),\s*)*(?:$LinkPattern|\[\[$FreeLinkPattern\]\]))/cg) {
return Subscribe($1, $4);
} elsif (m/\GMy\s+subscribed\s+pages:\s*((?:(?:$LinkPattern|\[\[$FreeLinkPattern\]\]),\s*)*(?:$LinkPattern|\[\[$FreeLinkPattern\]\]))/gc) {
} elsif (m/\GMy\s+subscribed\s+pages:\s*((?:(?:$LinkPattern|\[\[$FreeLinkPattern\]\]),\s*)*(?:$LinkPattern|\[\[$FreeLinkPattern\]\]))/cg) {
return Subscribe($1, '');
} elsif (m/\GMy\s+subscribed\s+categories:\s*((?:(?:$LinkPattern|\[\[$FreeLinkPattern\]\]),\s*)*(?:$LinkPattern|\[\[$FreeLinkPattern\]\]))/gc) {
} elsif (m/\GMy\s+subscribed\s+categories:\s*((?:(?:$LinkPattern|\[\[$FreeLinkPattern\]\]),\s*)*(?:$LinkPattern|\[\[$FreeLinkPattern\]\]))/cg) {
return Subscribe('', $1);
}
}


@@ -42,7 +42,7 @@ sub NewSvgGetDownloadLink {
local (%Page, $OpenPageName);
OpenPage($name);
if ($revision) {
($data) = GetTextRevision($revision); # ignore revision reset
$data = GetTextRevision($revision)->{text}; # ignore revision reset
} else {
$data = $Page{text};
}
@@ -88,7 +88,7 @@ $Action{svg} = \&DoSvg;
sub DoSvg {
my $id = shift;
my $summary = T('Summary of your changes: ');
my $summary = T('Summary of your changes:') . ' ';
$HtmlHeaders .= qq{
<script type="text/javascript">


@@ -27,7 +27,7 @@ push(@MyRules, \&SyncRule);
sub SyncRule {
# [[copy:http://example.com/wiki]]
if (m/\G\[\[(copy:$FullUrlPattern)\]\]/cog) {
if (m/\G\[\[(copy:$FullUrlPattern)\]\]/cg) {
my ($text, $url) = ($1, $2);
return $q->a({-href=>$2, class=>'outside copy'}, $text);
}


@@ -49,7 +49,7 @@ my $TagXML;
sub TagRule { # Process page tags on a page
if ( m/\G$TagMark\s*(.*)/gc) { # find page tags
if ( m/\G$TagMark\s*(.*)/cg) { # find page tags
my @tags = split /,\s*/, $1; # push them in array
@tags = map { # and generate html output:
qq{<a href="$ScriptName?action=tagsearch;tag=$_">$_</a>}; # each tag is a link to search all pages with that tag
@@ -161,7 +161,7 @@ sub PrintTagMap {
my $tag = $1;
"<li>$tag</li>\n<ul>";
}xsge;
}egsx;
$result =~ s/\<\/tag\>/<\/ul>/g;
$result =~ s{
@@ -171,7 +171,7 @@ sub PrintTagMap {
my $name = $id;
$name =~ s/_/ /g;
"<li><a href=\"$ScriptName\/$id\">$name</a></li>";
}xsge;
}egsx;
print $result;
}

Some files were not shown because too many files have changed in this diff.