axiom-developer

RE: [Axiom-developer] Front page esthetic


From: Page, Bill
Subject: RE: [Axiom-developer] Front page esthetic
Date: Wed, 26 Jul 2006 22:21:12 -0400

Gaby, 

On Wednesday, July 26, 2006 9:44 PM you wrote:

> Bill Page writes:
> 
> [...]
> 
> | To make your changes to the SandBoxActiveAxiomDevelopers
> | just make sure that you enter some simple explanatory text
> | in the reason box before you click Save. Let me know if
> | you still have problems.
> 
> I just tried, it did not work :-(
> 

You got the following message:

  "Please check your comments or changes for unauthorized content."

In nice bright friendly red letters, right?

Ok, sorry, I should have tried this before my previous long
explanation... Now I must subject you to another. :(

The reason this is being rejected is due to another anti-spam
change to the Axiom Wiki system. A few weeks ago, based on Bob
McElrath's suggestion, I configured a feature of ZWiki that we
had not previously been using, which provides direct content
filtering of posted text. The idea is to maintain a list of
words and links that are extremely unlikely to be part of any
valid post to the web site. I did this in response to the two
or three nasty spam emails that were getting through each day.
But these spammers are devious, and it turns out the filter
requires fairly constant attention to keep it catching the bad
stuff. This is all based on simple regular expression pattern
matching. Nothing fancy.
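To give a concrete idea, a filter of this kind might look like the
following minimal sketch. The pattern list here is hypothetical, not
the actual list I maintain, and this is not ZWiki's real code:

```python
import re

# Hypothetical banned-pattern list; in practice such a list is
# maintained by hand and grows as new spam gets through.
BANNED_PATTERNS = [
    r"<a\s+href=",      # raw HTML links, the common denominator in the spam
    r"\bcasino\b",      # an example spam keyword
]

def is_unauthorized(text):
    """Return True if the posted text matches any banned pattern."""
    return any(re.search(p, text, re.IGNORECASE) for p in BANNED_PATTERNS)
```

A post containing a raw HTML link would then be rejected with the
"unauthorized content" message, while an ordinary edit passes through.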

What I was especially concerned about was the apparent large number
of robot programs that were being written specifically to target
wiki/ZWiki web sites like ours.

The short story is that I decided to disallow all HTML-encoded
href links, since that was the common denominator in all of
these spam attacks. This makes it a little harder to post a
message or a change to a web page if all you (usually a robot)
know is how to generate an HTML link. Fortunately our web site
pages support a shorter and fairly convenient idiom (called
StructuredText) for links that looks like this:

  "some text":http://xxx.com

which the ZWiki web page rendering translates into exactly

  <a href="http://xxx.com">some text</a>

No robot seems to have figured out how to do this translation
automatically yet, probably because the number of wiki sites
that support StructuredText is too small to be worth the cost
of modifying the code.
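The translation itself is simple enough to sketch with a single
regular-expression substitution. This is only an illustration of the
idiom described above, not ZWiki's actual implementation:

```python
import re

# "some text":http://xxx.com  ->  <a href="http://xxx.com">some text</a>
LINK = re.compile(r'"([^"]+)":(https?://\S+)')

def render_links(text):
    """Expand StructuredText-style links into HTML anchors."""
    return LINK.sub(r'<a href="\2">\1</a>', text)
```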

So, what you have to do now (and what I will do for you right
now) is to convert these HTML-encoded links back to the
StructuredText form. Then everything will be fine. Of course,
you will also have to remember to use StructuredText from now on.
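The back-conversion is just the inverse substitution. Again, a rough
sketch for illustration, not the actual tool I use:

```python
import re

# <a href="http://xxx.com">some text</a>  ->  "some text":http://xxx.com
HREF = re.compile(r'<a\s+href="([^"]+)"[^>]*>([^<]+)</a>', re.IGNORECASE)

def to_structured_text(html):
    """Rewrite simple HTML anchors as StructuredText links."""
    return HREF.sub(r'"\2":\1', html)
```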

I hope all of this nasty inconvenient stuff does not put you
off from making the changes that you intended to do. ;)

If anyone has good suggestions for how to deal with this spam
problem (without involving a lot of my programming time!), I
would be glad to hear them. And if anyone is motivated to help
solve this kind of problem, then I am certainly motivated to
spend as much time as it takes working with them... but I am
getting tired and cranky from having to keep playing with web
server stuff, when what I really want is to devote more time
to mathematics and physics, rather than continue fixing this
kind of thing on my own... :)

Regards,
Bill Page.



