tclwebtest-general Mailing List for tclwebtest (Page 8)
Status: Abandoned
Brought to you by: tils
This list is closed; nobody may subscribe to it.
2003 | Jan (31) | Feb (61) | Mar (34) | Apr (2) | May (3) | Jun (2) | Jul (11) | Aug (17) | Sep (8) | Oct (7) | Nov | Dec (3)
2004 | Jan (7) | Feb | Mar | Apr (1) | May (2) | Jun | Jul | Aug (1) | Sep | Oct | Nov | Dec (1)
2005 | Jan (2) | Feb (6) | Mar (2) | Apr | May | Jun (1) | Jul | Aug | Sep (1) | Oct | Nov | Dec
2006 | Jan | Feb (1) | Mar (6) | Apr | May | Jun | Jul | Aug | Sep | Oct | Nov | Dec
2008 | Jan | Feb | Mar | Apr | May | Jun | Jul | Aug (1) | Sep | Oct | Nov | Dec
2009 | Jan | Feb | Mar | Apr | May | Jun | Jul | Aug | Sep | Oct | Nov | Dec (1)
From: Grzegorz A. H. <gr...@ef...> - 2003-02-07 11:52:57
On Thu, Feb 06, 2003 at 12:00:43PM +0100, Grzegorz Adam Hankiewicz wrote:
> The bad news of this patch is that there must be some other
> cookie problem hidden around, because the following script doesn't
> work. It reaches the point of doing the submit, and it tries to
> follow the built do_request url, but Google returns a 403 forbidden
> access.

I was wrong: Google forbids access to unpopular user agents, and tclwebtest is one of them. The attached patch implements a new procedure which allows changing the user agent string, hence forcing Google to return a 200 HTTP code.

--
Grzegorz Adam Hankiewicz, gr...@ef....  Tel: +34-94-472 35 89.
eFaber SL, Maria Diaz de Haro, 68, 2  http://www.efaber.net/
48920 Portugalete, Bizkaia (SPAIN)
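tclwebtest speaks HTTP over its own sockets, so the real patch touches its request code; purely as an illustration of the idea, with Tcl's plain http package the same effect is a single configuration call (the user agent string below is made up):

package require http

# Present a browser-like user agent so that picky servers answer 200 instead of 403.
::http::config -useragent "Mozilla/5.0 (compatible; tclwebtest)"

set tok [::http::geturl http://www.google.com/]
puts [::http::ncode $tok]
::http::cleanup $tok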
From: Grzegorz A. H. <gr...@ef...> - 2003-02-06 11:41:55
Hi. This patch allows following 301 redirections with -nocomplain; otherwise assertion_failed is called.

Index: tclwebtest.tcl
===================================================================
RCS file: /home/maincvs/efintranet/www/tclwebtest/prog/tclwebtest.tcl,v
retrieving revision 1.18
diff -u -r1.18 tclwebtest.tcl
--- tclwebtest.tcl  6 Feb 2003 11:03:23 -0000  1.18
+++ tclwebtest.tcl  6 Feb 2003 11:37:07 -0000
@@ -1902,11 +1902,18 @@

     # is it a redirect ?
-    if { $http_status == "302" } {
+    if { $http_status == "302" || $http_status == "301" } {
         for { set i 0 } { $i < [llength $meta] } { incr i 2 } {
             if { [string match -nocase [lindex $meta $i] "location"] } {
                 set location [lindex $meta [expr $i+1]]
                 break
+            }
+        }
+        if { $http_status == "301" } {
+            if $nocomplain_p {
+                log "Attention! Redirection 301 was ignored, but please update your test unit, it's a bug!"
+            } else {
+                assertion_failed "Permanent redirection (301) are considered a test unit bug\nUse -nocomplain if needed."
             }
         }
         if $noredirect_p {

--
Grzegorz Adam Hankiewicz, gr...@ef....  Tel: +34-94-472 35 89.
eFaber SL, Maria Diaz de Haro, 68, 2  http://www.efaber.net/
48920 Portugalete, Bizkaia (SPAIN)
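With the patch applied, usage in a test script would look roughly like this (the URL is made up):

# Without -nocomplain a 301 answer calls assertion_failed and aborts the test;
# with it, the redirect is followed and only a warning is logged.
do_request -nocomplain http://www.example.com/moved-permanently
log [response text]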
From: Grzegorz A. H. <gr...@ef...> - 2003-02-06 11:03:59
Hi. After being able to browse Google with the previous cookie patch, I was unable to use their forms because they use the get method. The following patch adds support for this.

Note that there's a lot of global logic which could be shared; maybe it would be worth it to separate the current function into little pieces (get/post_init, get/post_process, get/post_finish) and use function pointers, so that the logic is common but the implementation is different. There are only a few lines of code now, so it doesn't look worth it, but my instinct says otherwise, because there's room to implement the feature of searching for a specific submit button (right now the get method only sends the first one it finds).

The bad news of this patch is that there must be some other cookie problem hidden around, because the following script doesn't work. It reaches the point of doing the submit, and it tries to follow the built do_request url, but Google returns a 403 forbidden access. There must be something in the do_request headers tclwebtest sends which Google rejects, because I can copy/paste that url and use it directly from any browser. Using that query url in a one-line tclwebtest script fails too.

do_request http://www.google.com/
link follow "advanced search"
field fill linux
field select 20
field fill "Grzegorz Adam Hankiewicz"
field fill "kill protector"
field fill Microsoft
# here fails with 403
form submit
log [response text]

--
Grzegorz Adam Hankiewicz, gr...@ef....  Tel: +34-94-472 35 89.
eFaber SL, Maria Diaz de Haro, 68, 2  http://www.efaber.net/
48920 Portugalete, Bizkaia (SPAIN)
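For what it's worth, a minimal sketch of the "function pointers" idea above in plain Tcl — command names stored in variables and then invoked; the proc names are made up for illustration and are not tclwebtest code:

# Share the surrounding submit logic, vary only the request step.
proc submit_via {request_proc url data} {
    # common pre-processing would go here ...
    set result [$request_proc $url $data]
    # common post-processing would go here ...
    return $result
}

proc get_request {url data}  { return "GET $url?$data" }
proc post_request {url data} { return "POST $url ($data)" }

puts [submit_via get_request  http://example.com/search q=linux]
puts [submit_via post_request http://example.com/search q=linux]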
From: Grzegorz A. H. <gr...@ef...> - 2003-02-05 15:02:34
On Fri, Jan 24, 2003 at 10:10:35AM +0000, Tilmann Singer wrote:
> * Grzegorz Adam Hankiewicz <gr...@ef...> [20030124 09:42]:
> > But tclwebtest fails with a really funny cookie error:
>
> Yeah, that's a bug that I've heard about from before.

Looks like the solution was easier than expected. Patch attached; it works with Google and Hotmail. However, Hotmail still seems to have a problem with cookies. I try the following script:

do_request http://hotmail.com/
link follow "E-Mail Account!"
# Confirm we don't use IExplorer
form find ~n cont
form submit
log [response text]

And the answer is something like: "Your Web browser options are currently set to disable cookies. To use .NET Passport, you must enable cookies."

--
Grzegorz Adam Hankiewicz, gr...@ef....  Tel: +34-94-472 35 89.
eFaber SL, Maria Diaz de Haro, 68, 2  http://www.efaber.net/
48920 Portugalete, Bizkaia (SPAIN)
From: Grzegorz A. H. <gr...@ef...> - 2003-02-04 11:28:41
Hi. How could I use a form with two different submit buttons, and simulate that I just pressed the last submit button instead of the first one?

--
Grzegorz Adam Hankiewicz, gr...@ef....  Tel: +34-94-472 35 89.
eFaber SL, Maria Diaz de Haro, 68, 2  http://www.efaber.net/
48920 Portugalete, Bizkaia (SPAIN)
From: Grzegorz A. H. <gr...@ef...> - 2003-01-31 11:38:54
On Wed, Jan 29, 2003 at 06:17:40PM +0100, Grzegorz Adam Hankiewicz wrote:
> Besides, look at how the double slash confuses tclwebtest, it is
> a shorthand to avoid typing "http:", but tclwebtest constructs an
> incorrect relative url.

This patch fixes the incorrect relative url construction with web pages like http://slashdot.org/.

Index: tclwebtest.tcl
===================================================================
RCS file: /home/maincvs/efintranet/www/tclwebtest/prog/tclwebtest.tcl,v
retrieving revision 1.15
diff -u -r1.15 tclwebtest.tcl
--- tclwebtest.tcl  31 Jan 2003 09:11:27 -0000  1.15
+++ tclwebtest.tcl  31 Jan 2003 11:28:11 -0000
@@ -2042,6 +2042,10 @@
         same url again not supported
         return $::tclwebtest::url
+    } elseif { [string range $url 0 1] == "//" } {
+        # append protocol
+        regexp {(https?:).*} $::tclwebtest::url match protocol
+        return "$protocol$url"
     } elseif { [string range $url 0 0] == "/" } {
         # append host
         regexp {(https?://[^/]+)} $::tclwebtest::url match host_part

Ignore the line "same url again not supported" above; I have put it there temporarily in my copy to avoid crashing the server with infinite redirections. Looks like AOLserver doesn't give much memory space/stack to Tcl, because running from the command line I can endlessly watch the self-redirection bug for minutes, until I get bored, of course.

--
Grzegorz Adam Hankiewicz, gr...@ef....  Tel: +34-94-472 35 89.
eFaber SL, Maria Diaz de Haro, 68, 2  http://www.efaber.net/
48920 Portugalete, Bizkaia (SPAIN)
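The resolution idea of the patch, as a standalone sketch outside tclwebtest (the proc name is made up):

proc resolve_protocol_relative {base url} {
    # A link starting with "//" reuses the protocol of the page it appears on.
    if { [string range $url 0 1] == "//" } {
        regexp {^(https?:)} $base -> protocol
        return "$protocol$url"
    }
    return $url
}

puts [resolve_protocol_relative http://slashdot.org/ //images.slashdot.org/topics/topictech2.gif]
# -> http://images.slashdot.org/topics/topictech2.gif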
From: Grzegorz A. H. <gr...@ef...> - 2003-01-31 09:21:29
On Fri, Jan 31, 2003 at 10:07:00AM +0100, Grzegorz Adam Hankiewicz wrote:
> > The function still fails if you try to connect to a non-existent
> > server.
>
> The attached patch fixes that.

Forgot to show what the output of a log looks like:

----- START: in_memory string at [31/Jan/2003:10:16:14] -----
--- do_request for http://do.not.exist/
ignoring very bad http result: >>couldn't open socket: host is unreachable<<
--- do_request for http://do.not.exist/
do_request did not return a page: 'couldn't open socket: host is unreachable'. HTTP status is 555
in script body line 2
... SOURCE CODE ...
log [do_request -nocomplain http://do.not.exist/]
log [do_request http://do.not.exist/]
^^^^^ ERROR ^^^^^
----- FAILED: in_memory string (took 1s) -----

--
Grzegorz Adam Hankiewicz, gr...@ef....  Tel: +34-94-472 35 89.
eFaber SL, Maria Diaz de Haro, 68, 2  http://www.efaber.net/
48920 Portugalete, Bizkaia (SPAIN)
From: Grzegorz A. H. <gr...@ef...> - 2003-01-31 09:09:57
> The function still fails if you try to connect to a non-existent
> server.

The attached patch fixes that, but it's ugly: it leaves the global variables in an undefined state, and there seem to be more places where such errors should be caught, but that requires a bigger rewrite or redesign of the function. Unwilling to get a big patch rejected, I'll let you decide what to do with socket errors. Well, the truth is I still don't understand all the code of do_request, so I could introduce even more bugs (that Tcl upvar stuff is still a mystery for me, no matter how many times I reread the manual).

--
Grzegorz Adam Hankiewicz, gr...@ef....  Tel: +34-94-472 35 89.
eFaber SL, Maria Diaz de Haro, 68, 2  http://www.efaber.net/
48920 Portugalete, Bizkaia (SPAIN)
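The kind of error handling involved, as a minimal standalone sketch (not the attached patch): catch the socket failure instead of letting it propagate as a traceback.

if { [catch { socket do.not.exist 80 } sock] } {
    # $sock now holds the error message, e.g. "couldn't open socket: ..."
    puts "request failed: $sock"
} else {
    close $sock
}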
From: Grzegorz A. H. <gr...@ef...> - 2003-01-30 18:28:23
My previous post mentioned a problem with -noredirect, which was introduced earlier. This parameter makes do_request return the targeted url without following it. The problem was that it wasn't being set correctly when the redirection was followed. Hence, with this script:

log "We ended up at: [do_request http://www.efaber.com/]"

...the output is:

--- do_request for http://www.efaber.com/
http status: >>302<<
following a redirect to: http://www.efaber.net
--- do_request for http://www.efaber.net
http status: >>200<<
We ended up at: http://www.efaber.com/

So, the attached patch fixes this, and the result is now:

--- do_request for http://www.efaber.com/
http status: >>302<<
following a redirect to: http://www.efaber.net
--- do_request for http://www.efaber.net
http status: >>200<<
We ended up at: http://www.efaber.net

The peculiar reason for computing the commands for eval as a list is explained in a book about Tcl I have right here, which says that using double quotes is bad because structure is lost.

More importantly, my previous grief with a void redirection is now worse: since the empty url is processed correctly, the procedure absolute_link returns the same url, and hence tclwebtest runs into an infinite loop. This should point out that either "" as a relative url is invalid, or there's a problem with tclwebtest not being able to parse a chain of redirections ending in a void one (maybe my suggestion for a history variable is right after all?).

--
Grzegorz Adam Hankiewicz, gr...@ef....  Tel: +34-94-472 35 89.
eFaber SL, Maria Diaz de Haro, 68, 2  http://www.efaber.net/
48920 Portugalete, Bizkaia (SPAIN)
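A minimal standalone illustration of the list-versus-double-quotes point (puts stands in for do_request here):

set url "http://example.com/some page"

# eval "do_request $url" would split $url on the space and pass two words.
# Building the command as a list keeps $url as one argument, whatever it contains:
set cmd [list puts $url]
eval $cmd    ;# prints the whole url, space included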
From: Grzegorz A. H. <gr...@ef...> - 2003-01-30 17:35:15
Hi. The following patch fixes the incorrect use of the followequiv variable and adds another character to the regular expression detecting stupid broken redirections. With this patch, the following works (is redirected):

do_request -followequiv http://www.borjanet.com/

Without the patch the redirection is not detected, because the server sends XHTML, whose tags end with '/>'. This exposes a little mistake on my part in a previous patch (the one with -noredirect), and a coding mistake/problem in tclwebtest; another patch will follow.

--
Grzegorz Adam Hankiewicz, gr...@ef....  Tel: +34-94-472 35 89.
eFaber SL, Maria Diaz de Haro, 68, 2  http://www.efaber.net/
48920 Portugalete, Bizkaia (SPAIN)
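A small illustration of the '>' versus '/>' issue, with a made-up pattern rather than the actual tclwebtest regexp:

set html  {<meta http-equiv="refresh" content="0; url=http://www.borjanet.com/x">}
set xhtml {<meta http-equiv="refresh" content="0; url=http://www.borjanet.com/x" />}

# A pattern that requires the tag to end in a plain > misses the XHTML form;
# allowing an optional / before the closing > matches both.
set re {content="[^"]*url=([^"]+)"\s*/?>}
foreach page [list $html $xhtml] {
    if { [regexp -nocase $re $page -> url] } {
        puts "redirects to $url"
    }
}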
From: Grzegorz A. H. <gr...@ef...> - 2003-01-30 16:54:18
Hi. I'm testing a script against our server, but redirection fails. In order to track down the problem (which I supposed involved cookie handling) I added a new parameter to do_request so that it can be told not to follow the redirection, but return the new location url instead. Patch attached (no_redirection_patch.gz).

Now, after applying that patch I tested this code:

do_request http://www.efaber.net/
log [cookies all]
log "Trying to change language now"
set url [link get_url Euskaraz]
set url [do_request -noredirect $url]
log "We were pointed at '$url'"
log [cookies all]
set url [do_request -noredirect $url]
log "We were pointed at '$url'"
log [cookies all]
set url [do_request -noredirect $url]
log "We were pointed at '$url'"
log [cookies all]
log "Now trying to do all automatically"
do_request http://www.efaber.net/
link follow {English}

The script does a manual redirection displaying the cookies' values at each individual step. But then I discovered that it's not a cookie problem. Attached is the log of the script (no_redirection_problem_log.gz); it shows how the manual do_request calls work fine, accepting the cookies and setting their values. The last automatic "link follow" command fails after the second redirection, because it tries to follow a void url.

Just to verify that the http headers were correct (they send a void Location field), I also attach the results of telnetting manually to the server. It returns a manual redirection, also with a blank value (no_redirection_problem_http_logs.gz).

How should I deal with this? I presume this blank redirection AOLserver sends means that the browser should return to the original url before the redirections started, but it looks like very awkward behaviour. Maybe this would require a "history" variable in tclwebtest?

PS: It would be cool if link follow also understood the switches of do_request, but I understand this is a little difficult because it first calls link find.

--
Grzegorz Adam Hankiewicz, gr...@ef....  Tel: +34-94-472 35 89.
eFaber SL, Maria Diaz de Haro, 68, 2  http://www.efaber.net/
48920 Portugalete, Bizkaia (SPAIN)
From: Grzegorz A. H. <gr...@ef...> - 2003-01-30 14:47:15
On Wed, Jan 29, 2003 at 07:15:55PM +0000, Tilmann Singer wrote:
> * Grzegorz Adam Hankiewicz <gr...@ef...> [20030129 18:26]:
> > I'm having a little regex problem:
> >
> > link find "so\[:alpha:\].*s"
>
> The same rules as for tcl regexps as well: try to enclose in { } to
> save [ ] etc. from the tcl interpreter. And in this particular case I
> think you need to write {so[[:alpha:]].*s}. See 'man re_syntax'.

Right. I was later trying {"so[[:alpha:]].*s"} but didn't notice the double quotes. "so\[\[:alpha:\]\].*s" works too, but it's not very readable.
From: Tilmann S. <ti...@ti...> - 2003-01-29 19:22:55
* Grzegorz Adam Hankiewicz <gr...@ef...> [20030129 18:26]:
> I'm having a little regex problem:
>
> link find "so\[:alpha:\].*s"

The same rules apply as for tcl regexps: try to enclose in { } to save [ ] etc. from the tcl interpreter. And in this particular case I think you need to write {so[[:alpha:]].*s}. See 'man re_syntax'.

til

--
http://tsinger.com
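A quick standalone check of the two quoting styles with plain regexp, outside tclwebtest:

set text "some links"

# Braces keep the brackets away from the Tcl interpreter:
puts [regexp {so[[:alpha:]].*s} $text]        ;# 1

# With double quotes every bracket has to be escaped by hand:
puts [regexp "so\[\[:alpha:\]\].*s" $text]    ;# also 1, but harder to read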
From: Grzegorz A. H. <gr...@ef...> - 2003-01-29 18:26:00
I'm having a little regex problem:

[gradha@ws5:1] [~/tclwebtest]$ ./tclwebtest cus.txt
----- START: cus.txt at [29/ene/2003:19:25:46] -----
--- do_request for http://www.efaber.net/
http status: >>200<<
No link found that matches '{so[:alpha:].*s}'
in "cus.txt" line 3:
do_request http://www.efaber.net/
link find "so\[:alpha:\].*s"
----- FAILED: cus.txt (took 1s) -----
DURATION: 1
1 of 1 tests FAILED: cus.txt

I don't know how I should write the link find command so that I match the expression 'so-one_alphanumeric_character-any_characters-s'. The problem is with [:alpha:]: if I don't escape it, it tries to be executed, but if I escape it, it doesn't seem to work.

--
Grzegorz Adam Hankiewicz, gr...@ef....  Tel: +34-94-472 35 89.
eFaber SL, Maria Diaz de Haro, 68, 2  http://www.efaber.net/
48920 Portugalete, Bizkaia (SPAIN)
From: Grzegorz A. H. <gr...@ef...> - 2003-01-29 17:43:27
On Wed, Jan 29, 2003 at 05:20:13PM +0000, Tilmann Singer wrote:
> [...] I'll not be able to apply them before next week, hope it
> can wait that long.

Don't worry, I use a local CVS now, it will be a matter of syncing the copies when you apply them.

--
Grzegorz Adam Hankiewicz, gr...@ef....  Tel: +34-94-472 35 89.
eFaber SL, Maria Diaz de Haro, 68, 2  http://www.efaber.net/
48920 Portugalete, Bizkaia (SPAIN)
From: Tilmann S. <ti...@ti...> - 2003-01-29 17:27:01
Hi Grzegorz,

Thanks for the mass of patches and bug reports. There's a backlog of mails from you in my inbox - I just wanted to tell you that I'm quite busy right now with paid stuff so that I'll not be able to apply them before next week, hope it can wait that long.

Keep em comin' ;)

cheers, Til

--
http://tsinger.com
From: Grzegorz A. H. <gr...@ef...> - 2003-01-29 17:20:45
Hi. Looks like the absence of a selftest for "link follow ~c" has let this bug survive: the search is done in the full html instead of just the content. Besides, look at how the double slash confuses tclwebtest; it is a shorthand to avoid typing "http:", but tclwebtest constructs an incorrect relative url.

[gradha@ws5:0] [~/tclwebtest]$ ./tclwebtest cus.txt
----- START: cus.txt at [29/ene/2003:18:14:07] -----
--- do_request for http://slashdot.org/topics.shtml
http status: >>200<<
--- do_request for http://slashdot.org/
http status: >>200<<
<A HREF="//slashdot.org/search.pl?topic=126"><IMG SRC="//images.slashdot.org/topics/topictech2.gif" WIDTH="60" HEIGHT="80" BORDER="0" ALT="Technology"></A>
--- do_request for http://slashdot.org//slashdot.org/search.pl?topic=126
http status: >>404<<
do_request did not return a page. HTTP status is 404
in "cus.txt" line 8:
link find ~c "topics"
log [link get_full]
link follow ~c "topics"
----- FAILED: cus.txt (took 40s) -----
DURATION: 40
1 of 1 tests FAILED: cus.txt

--
Grzegorz Adam Hankiewicz, gr...@ef....  Tel: +34-94-472 35 89.
eFaber SL, Maria Diaz de Haro, 68, 2  http://www.efaber.net/
48920 Portugalete, Bizkaia (SPAIN)
From: Grzegorz A. H. <gr...@ef...> - 2003-01-29 17:05:58
This may not be obvious, but it applies to the description of "link follow".

Index: tclwebtest.tcl
===================================================================
RCS file: /home/maincvs/efintranet/www/tclwebtest/prog/tclwebtest.tcl,v
retrieving revision 1.9
diff -u -r1.9 tclwebtest.tcl
--- tclwebtest.tcl  28 Jan 2003 18:06:41 -0000  1.9
+++ tclwebtest.tcl  29 Jan 2003 16:55:18 -0000
@@ -362,6 +362,11 @@
     link follow "Back to contents"
     </pre></blockquote>

+    Note that after this command you can get the current URL looking
+    at the value of ::tclwebtest::url, usefull if you followed a
+    link by text and you want to store/verify the url tclwebtest
+    chose.
+
     </dd>

     <dt><b><a name="link_all">all</a></b></dt>

--
Grzegorz Adam Hankiewicz, gr...@ef....  Tel: +34-94-472 35 89.
eFaber SL, Maria Diaz de Haro, 68, 2  http://www.efaber.net/
48920 Portugalete, Bizkaia (SPAIN)
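In a test script the documented variable would be used roughly like this (a sketch based on the doc text added above):

do_request http://www.efaber.net/
link follow "English"
# ::tclwebtest::url now holds the url tclwebtest actually chose to follow
log "landed on $::tclwebtest::url"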
From: Grzegorz A. H. <gr...@ef...> - 2003-01-28 18:08:52
Here it is. One is to be applied on tclwebtest.tcl, another over a selftest.

--
Grzegorz Adam Hankiewicz, gr...@ef....  Tel: +34-94-472 35 89.
eFaber SL, Maria Diaz de Haro, 68, 2  http://www.efaber.net/
48920 Portugalete, Bizkaia (SPAIN)
From: Grzegorz A. H. <gr...@ef...> - 2003-01-28 15:42:52
> Applying this patch over fixes that [...]

Substitute that patch with the following one (the name should help you find which one); I found a few corner cases where that algorithm wouldn't find the correct lines in the memory string. This newer version also doesn't use chop so much, so maybe it's faster, if TCL is faster at retrieving indexes than at chopping strings.

--
Grzegorz Adam Hankiewicz, gr...@ef....  Tel: +34-94-472 35 89.
eFaber SL, Maria Diaz de Haro, 68, 2  http://www.efaber.net/
48920 Portugalete, Bizkaia (SPAIN)
From: Grzegorz A. H. <gr...@ef...> - 2003-01-27 17:04:31
On Wed, Jan 22, 2003 at 07:13:35PM +0100, Grzegorz Adam Hankiewicz wrote:
> Running generate_docs I found about a wrong variable, so here's
> a fixed patch, it also improves a little bit the documentation of
> the procedures.

Ok, another thing which was missing in this patch was that when something failed while using memory strings, tclwebtest wouldn't log the lines which caused the error, since there's no file code_lines can open. Applying this patch on top fixes that (you will note it also includes the change of ::tclwebtest::log to public), and now five lines of code are shown, delimited with special markers. Example:

----- START: in_memory string at [27/Jan/2003:17:52:22] -----
--- do_request for http://www.efaber.net/
http status: >>200<<
Assertion "[regexp -nocase $search_expr $::tclwebtest::body]" failed.
in script body line 13
... SOURCE CODE ...
assert text "proporciona desarrollos de servicios web personalizados"
#asdasd
# verificación de código html
assert full "<p><li><b>Opttimizar"
^^^^^ ERROR ^^^^^
----- FAILED: in_memory string (took 1s) -----

The code to extract those lines is very rudimentary. I miss Python's in-memory file object abstraction! At the moment it just takes the code contained in the memory string and starts butchering it until only five lines of text are left. I could write it much better if [string first] allowed a starting character as a parameter, because then I wouldn't need to chop the string so many times, but it looks like it doesn't.

--
Grzegorz Adam Hankiewicz, gr...@ef....  Tel: +34-94-472 35 89.
eFaber SL, Maria Diaz de Haro, 68, 2  http://www.efaber.net/
48920 Portugalete, Bizkaia (SPAIN)
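The same goal can also be reached with split instead of repeated chopping; a minimal sketch, not the actual tclwebtest code:

# Show up to five lines of $script around $error_line (1-based).
proc context_lines {script error_line} {
    set lines [split $script "\n"]
    set first [expr {$error_line - 3}]
    if { $first < 0 } { set first 0 }
    return [join [lrange $lines $first [expr {$first + 4}]] "\n"]
}

puts [context_lines "line 1\nline 2\nline 3\nline 4\nline 5\nline 6" 4]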
From: Grzegorz A. H. <gr...@ef...> - 2003-01-27 16:12:38
Hi. I've found that log is a nice command to use along with assert; since assert is silent when it succeeds, it may be good to move log to the public api.

--
Grzegorz Adam Hankiewicz, gr...@ef....  Tel: +34-94-472 35 89.
eFaber SL, Maria Diaz de Haro, 68, 2  http://www.efaber.net/
48920 Portugalete, Bizkaia (SPAIN)
From: Grzegorz A. H. <gr...@ef...> - 2003-01-27 15:05:53
On Fri, Jan 24, 2003 at 02:29:04PM +0000, Tilmann Singer wrote:
> This might be an error introduced with the ad_proc rewrite (damn, we
> need better self tests). Try to replace $nocomplain with $nocomplain_p
> within the do_request proc and tell me if that works.

It works. I generated this patch against my own CVS because the one at SF still doesn't have my previous one. The function still fails if you try to connect to a non-existent server.

Index: tclwebtest.tcl
===================================================================
RCS file: /home/maincvs/efintranet/www/tclwebtest/prog/tclwebtest.tcl,v
retrieving revision 1.4
diff -u -r1.4 tclwebtest.tcl
--- tclwebtest.tcl  22 Jan 2003 18:11:54 -0000  1.4
+++ tclwebtest.tcl  27 Jan 2003 14:58:28 -0000
@@ -1672,7 +1672,7 @@
 } {
     link_variables
     regexp_defs
-    if $nocomplain {
+    if $nocomplain_p {
         set nocomplain_option "-nocomplain"
     } else {
         set nocomplain_option ""
@@ -1782,9 +1782,11 @@
         eval "do_request $nocomplain_option $followequiv_option $location"

     } elseif { $http_status != "200" } {
-        debug -lib "$nocomplain"
-        if { !$nocomplain } {
+        debug -lib "$nocomplain_p"
+        if { !$nocomplain_p } {
             assertion_failed "do_request did not return a page. HTTP status is $http_status"
+        } else {
+            log "Bad http answer ignored due to -nocomplain"
         }
     }

--
Grzegorz Adam Hankiewicz, gr...@ef....  Tel: +34-94-472 35 89.
eFaber SL, Maria Diaz de Haro, 68, 2  http://www.efaber.net/
48920 Portugalete, Bizkaia (SPAIN)
From: Tilmann S. <ti...@ti...> - 2003-01-24 14:30:14
Hi,

* Grzegorz Adam Hankiewicz <gr...@ef...> [20030124 11:28]:
> Trying the following script:
>
> foreach url {http://www.efaber.net/fallo http://www.efaber.net/} {
>     do_request -nocomplain $url
> }

This might be an error introduced with the ad_proc rewrite (damn, we need better self tests). Try to replace $nocomplain with $nocomplain_p within the do_request proc and tell me if that works. I would try it out myself but I just started working on the cookie stuff and then ran out of time so my local copy is not in a working state right now ...

til

--
http://tsinger.com
From: Grzegorz A. H. <gr...@ef...> - 2003-01-24 11:27:45
Hi. Trying the following script:

foreach url {http://www.efaber.net/fallo http://www.efaber.net/} {
    do_request -nocomplain $url
}

I get the following log:

----- START: in_memory string at [24/Jan/2003:12:20:39] -----
--- do_request for http://www.efaber.net/fallo
http status: >>500<<
do_request did not return a page. HTTP status is 500
in script body line 3
----- FAILED: in_memory string (took 1s) -----

Looks like do_request notices the use of -nocomplain, but the test unit is interrupted. Is this expected? The log is exactly the same without -nocomplain. I didn't know what to change in do_request to make the test unit go forward, ignoring the error. It also fails if the domain name doesn't exist, but with a socket traceback.

--
Grzegorz Adam Hankiewicz, gr...@ef....  Tel: +34-94-472 35 89.
eFaber SL, Maria Diaz de Haro, 68, 2  http://www.efaber.net/
48920 Portugalete, Bizkaia (SPAIN)