From: Christopher T. <ch...@ch...> - 2002-08-19 02:28:11
Matt,

> The problem will be coming up with enough different cases to
> cover as many methods of "attack" as possible.

Actually, we can simply ask all the gurus on the webappsec list to submit
test cases to us. I'm sure they'll come up with a ton.

> I don't see why a web server is necessary to test these cases?

When it comes to testing, I always assume it is best to test code in an
environment that mimics the production environment. Different web servers
and web app servers will process the input before web apps see it, and they
might play with things like character encodings, etc.

> While OWASP is geared towards web
> applications, I don't see why someone couldn't use these filters in
> other applications.

An excellent point, and one I hope we pursue. Then again, the filtering
needs of web applications are different from those of many standalone apps,
and our first goal is to help web app developers.

> We could write a test suite application in an appropriate language (I'm
> thinking perl) that could invoke a simple application written in each
> language that filters test data by a supplied filter; this simple app
> then returns the filtered data to the test suite application, which
> determines if the output is correct. This can be easily run after each
> build.

I think the tricky part here is finding standalone interpreters for all the
languages we will (eventually) support. I am thinking of things like Cold
Fusion, Cocoon, Zope, and Enhydra (which uses XMLC). Granted, these are not
hugely popular platforms (please no flames, I'm only talking about market
share :-), and I'm not suggesting we support them out of the box. But
eventually, it might be nice. And if we were to do a JSP taglib version of
the Filter API (which would also be nice), that would have to be run in a
JSP page.

Perl, PHP (you can use the compiled CGI binary), Python, Java, C, and C++
all have standalone interpreters. With respect to your Microsoft/ASP
comment below, most ASP pages are written in VBScript (someone correct me
if I'm wrong), for which there is Windows Scripting Host. I don't know
about .NET, though.

Your suggestion would be simpler than mine and would generally make things
easier on API porters. My thought on having just one tool for doing all the
testing for all the languages we support is that maintaining that tool and
the test cases would require less overhead and would provide greater
consistency across the languages.

> ./testSuite --lang=java -f tests/xssTests.xml ?
>
> The test suite application doesn't have to run everything, only what we
> tell it, so people who aren't developing in other languages don't need to
> install anything extra, other than the languages they are using and of
> course the language that the test suite is written in (which is why perl
> would probably be best; it's available on most major unix distributions
> and I hear ActiveState Java is pretty decent). Besides, I would think
> that we would want to package up the tools for each language for
> individual distribution. (I don't need the java filters for my php
> applications.)

An excellent point. It's just that I'm not a very good Perl hacker. :-)
Most developers hate writing tests (or comments or documentation), and I
love writing all three, so I thought I could help there.

On the flip side, it means we won't have one testing framework; we'll have
one for every language we support, which could become a maintenance burden.

> AND DAMMIT. I just realized that my test application idea breaks down
> when we throw ASP into the mix.
> (/me mumbles something dirty about Microsoft)

See my comment above about VBScript.

Chris
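A minimal sketch of the kind of driver being discussed here, written in
Python only because the thread has not settled on a language; the wrapper
path ./filters/java-filter, the test-case fields, and the PASS/FAIL output
convention are all illustrative assumptions, not anything the project has
defined:

    import subprocess

    # Hypothetical test cases; in practice these would be loaded from a
    # shared file such as the tests/xssTests.xml mentioned above.
    TEST_CASES = [
        {"name": "basic script tag",
         "input": "<script>alert('xss')</script>",
         "expected": "&lt;script&gt;alert('xss')&lt;/script&gt;"},
        {"name": "plain text passes through",
         "input": "hello world",
         "expected": "hello world"},
    ]

    def run_filter(wrapper, raw):
        # Feed raw input to a per-language filter wrapper on stdin and
        # return whatever filtered output it writes to stdout.
        proc = subprocess.run([wrapper], input=raw, capture_output=True,
                              text=True, check=True)
        return proc.stdout.rstrip("\n")

    def main():
        wrapper = "./filters/java-filter"  # assumed per-language wrapper
        failed = 0
        for case in TEST_CASES:
            got = run_filter(wrapper, case["input"])
            ok = (got == case["expected"])
            print("%s %s" % ("PASS" if ok else "FAIL", case["name"]))
            if not ok:
                failed += 1
                print("  expected: %r\n  got:      %r"
                      % (case["expected"], got))
        print("%d of %d cases failed" % (failed, len(TEST_CASES)))

    if __name__ == "__main__":
        main()

A porter would then only need to supply a tiny stdin-to-stdout wrapper
around their language's filter; the driver itself stays the same for every
implementation.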
From: Alex R. <al...@se...> - 2002-08-19 03:13:49
On Sun, 18 Aug 2002, Christopher Todd wrote:
> Matt,
>
> > The problem will be coming up with enough different cases to
> > cover as many methods of "attack" as possible.
>
> Actually, we can simply ask all the gurus on the webappsec list to submit
> test cases to us. I'm sure they'll come up with a ton.
>
> > I don't see why a web server is necessary to test these cases?
>
> When it comes to testing, I always assume it is best to test code in an
> environment that mimics the production environment. Different web servers
> and web app servers will process the input before web apps see it, and they
> might play with things like character encodings, etc.

We're going to be forcing canonicalization in our API anyway. Bring on the
funky encodings. I see no need to have a server involved to muck with stuff
when we can do it ourselves. =)

> > We could write a test suite application in an appropriate language (I'm
> > thinking perl) that could invoke a simple application written in each
> > language that filters test data by a supplied filter; this simple app
> > then returns the filtered data to the test suite application, which
> > determines if the output is correct. This can be easily run after each
> > build.
>
> I think the tricky part here is finding standalone interpreters for all the
> languages we will (eventually) support. I am thinking of things like Cold
> Fusion, Cocoon, Zope, and Enhydra (which uses XMLC). Granted, these are not
> hugely popular platforms (please no flames, I'm only talking about market
> share :-), and I'm not suggesting we support them out of the box.

This is going to be grossly unpopular, but I say we get something working
for the languages that are reasonable candidates for a reference
implementation: Perl, PHP, Python, Java, C, and C++. Among those I don't
see any that can't be called from the command line, so I think that, in the
interest of getting something working, we should sidestep the complexity of
mucking with a server until we absolutely need to do so. That might be
sooner rather than later, but how much harder will it be to throw in a
couple of wget calls once we've got the rest of the testing infrastructure
in place?

> Your suggestion would be simpler than mine and would generally make things
> easier on API porters. My thought on having just one tool for doing all the
> testing for all the languages we support is that maintaining that tool and
> the test cases would require less overhead and would provide greater
> consistency across the languages.

I think eventually we'll get there, but I'm not so sure it makes sense
unless we have our own test machines to play with. Until then, let's get
some command-line testing tools going.

> An excellent point. It's just that I'm not a very good Perl hacker. :-)
> Most developers hate writing tests (or comments or documentation), and I
> love writing all three, so I thought I could help there.
>
> On the flip side, it means we won't have one testing framework; we'll have
> one for every language we support, which could become a maintenance burden.

Ick. I say let's do the testing framework in one language (Java's great
with me) and let's have it do file-based/stdin-based invocation of the
various interpreters and define some form of output that test programs
should feed back to the caller so that we can determine success or
failure.

Extending this to parse HTTP server input shouldn't be much harder at
all (simply request with a socket, feed our malicious input, parse HTTP
reply in common format). Something like this might even be able to test
against various servers running different configs once we have it
working correctly. The first step though is to get it working at the
command line with a set of reasonable bad input against a reasonable set
of interpreters.

Anyone think I'm smoking crack here?

-- 
Alex Russell
al...@Se...
al...@ne...
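For what "forcing canonicalization" might look like in practice, a rough
sketch follows (in Python, purely as a stand-in for the eventual API; the
function name, the pass limit, and the choice of decoders are assumptions):

    import html
    import urllib.parse

    def canonicalize(value, max_passes=4):
        # Repeatedly strip common web encodings (%xx escapes and HTML
        # entities) until the string stops changing, so the filters see
        # one canonical form no matter how the payload was encoded.
        for _ in range(max_passes):
            decoded = html.unescape(urllib.parse.unquote_plus(value))
            if decoded == value:
                return decoded
            value = decoded
        # Still changing after max_passes: refuse rather than decode forever.
        raise ValueError("input never reached a canonical form")

    # '%253Cscript%253E' is '<script>' URL-encoded twice; both inputs
    # collapse to the same canonical string before any filtering happens.
    print(canonicalize("%253Cscript%253E"))  # <script>
    print(canonicalize("&lt;script&gt;"))    # <script>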
From: Matt W. <wi...@ce...> - 2002-08-19 03:55:49
On Sun, 2002-08-18 at 22:14, Alex Russell wrote:
[...]
> Ick. I say let's do the testing framework in one language (Java's great
> with me) and let's have it do file-based/stdin-based invocation of the
> various interpreters and define some form of output that test programs
> should feed back to the caller so that we can determine success or
> failure.
>
> Extending this to parse HTTP server input shouldn't be much harder at
> all (simply request with a socket, feed our malicious input, parse HTTP
> reply in common format). Something like this might even be able to test
> against various servers running different configs once we have it
> working correctly. The first step though is to get it working at the
> command line with a set of reasonable bad input against a reasonable set
> of interpreters.
>
> Anyone think I'm smoking crack here?

I just think that perl will be the quickest and best way to handle the
tests. LWP::Simple, anyone? And parsing? You'd actually consider
something else? ;p

I see no need for an elaborate java program to do all of this. While
tests are important, they shouldn't require a huge investment.
Should they?

-matt

-- 
Matthew Wirges
Developer, CERIAS Incident Response Database
wi...@ce...
Credo quia absurdum est.
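The fetch-and-check step really is small in any of the candidate languages;
for comparison with the LWP::Simple idea, a sketch in Python (the echo page
URL and the "input" parameter name are made up for illustration, since no
such test page exists yet):

    import urllib.parse
    import urllib.request

    def fetch_filtered(base_url, payload):
        # Send a payload to a page that echoes it back through the filter
        # under test, and return the response body as text.
        url = base_url + "?" + urllib.parse.urlencode({"input": payload})
        with urllib.request.urlopen(url, timeout=10) as resp:
            charset = resp.headers.get_content_charset() or "utf-8"
            return resp.read().decode(charset)

    body = fetch_filtered("http://localhost/owasp-filter-test/echo.php",
                          "<script>alert('xss')</script>")
    assert "<script>" not in body, "raw script tag leaked through the filter"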
From: Alex R. <al...@se...> - 2002-08-19 04:04:48
On 18 Aug 2002, Matt Wirges wrote:
> On Sun, 2002-08-18 at 22:14, Alex Russell wrote:
> I just think that perl will be the quickest

Quickest in what sense? Let's avoid maintenance nightmares for the time
being and do it in either Java or Python.

> and best way to handle the
> tests. LWP::Simple, anyone? And parsing? You'd actually consider
> something else? ;p

Yes.

> I see no need for an elaborate java program to do all of this. While
> tests are important, they shouldn't require a huge investment.
> Should they?

No, which I think is what I'm advocating. Let's get something working
that's easily extensible, and chug with it.

-- 
Alex Russell
al...@Se...
al...@ne...
From: Matt W. <wi...@ce...> - 2002-08-19 04:21:44
On Sun, 2002-08-18 at 23:05, Alex Russell wrote:
> On 18 Aug 2002, Matt Wirges wrote:
>
> > On Sun, 2002-08-18 at 22:14, Alex Russell wrote:
> > I just think that perl will be the quickest
>
> Quickest in what sense? Let's avoid maintenance nightmares for the time
> being and do it in either Java or Python.

Nightmares in what way? We need to iterate over a set of test data, pass
them through test applications in each lang's library, get and parse the
results, then make a report so we can identify problems. IMO, using Java
for such a simple task as this is like writing a formmail in C++...

> > and best way to handle the
> > tests. LWP::Simple, anyone? And parsing? You'd actually consider
> > something else? ;p
>
> Yes.
>
> > I see no need for an elaborate java program to do all of this. While
> > tests are important, they shouldn't require a huge investment.
> > Should they?
>
> No, which I think is what I'm advocating. Let's get something working
> that's easily extensible, and chug with it.

At any rate, we should probably be concentrating on the filters themselves,
rather than their tests. Yes, the tests will need to be developed while we
develop the filters, but we don't even have much past the "requirements"
phase.

-- 
Matthew Wirges
Developer, CERIAS Incident Response Database
wi...@ce...
Credo quia absurdum est.
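The "make a report" half of that loop could be equally small; a sketch
(again in Python, and assuming each per-language run prints one "PASS name"
or "FAIL name" line per case, which is just one possible output convention):

    def summarize(results_by_lang):
        # results_by_lang maps a language name to the lines its test run
        # printed, e.g. ["PASS basic-script", "FAIL img-onerror"].
        for lang in sorted(results_by_lang):
            failed = [line.split(" ", 1)[1]
                      for line in results_by_lang[lang]
                      if line.startswith("FAIL ")]
            detail = ": " + ", ".join(failed) if failed else ""
            print("%-6s %2d failed%s" % (lang, len(failed), detail))

    summarize({
        "java": ["PASS basic-script", "FAIL img-onerror"],
        "php":  ["PASS basic-script", "PASS img-onerror"],
    })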
From: Alex R. <al...@se...> - 2002-08-19 04:41:10
On 18 Aug 2002, Matt Wirges wrote:
> We need to iterate over a set of test data, pass them through test
> applications in each lang's library, get and parse the results, then
> make a report so we can identify problems.

Right. Frankly, I don't care that much about what you implement it in.

Chris, go nuts. You have my consent to use whatever language or design you
want. You have license. Check some code in, and we'll talk more about
testing then. I fully expect we'll go through a couple of iterations, so
let's get something going and start collecting test cases.

Thanks for taking the initiative.

-- 
Alex Russell
al...@Se...
al...@ne...
From: Nik C. <ni...@ni...> - 2002-08-19 13:45:00
As seen on Bugtraq today:

http://www.mricon.com/html/phpfilter.html

Haven't been able to break it yet.

-Nik