You can subscribe to this list here.
2001: Dec (7)
2002: Jan (4) | Feb (15) | Mar (22) | Apr (32) | May (24) | Jun (12) | Jul (8) | Aug (8) | Sep (4) | Oct (10) | Nov (49) | Dec (28)
2003: Jan (15) | Feb (17) | Mar (6) | Apr (1) | May (1) | Jun (1)
From: Lennox W. <lwi...@lo...> - 2003-06-19 23:23:39
|
Hi, We'd really appreciate your feedback about Grub. If you're no longer running the Grub Web crawling application on your computer, or have considered not running it, we'd like to know why. By answering seven short questions, you can help us improve Grub to better suit your needs. The survey is at: http://www.looksmart.com/r?page=/isp/COM/includes/Feedback_Grub.html Thanks, The Grub Team |
From: zeek <ze...@sp...> - 2003-05-07 04:23:32
|
Hello Grubbers, I'm running grub on 3 machines: WinXP, RedHat 7.3, and RedHat 8.0. I messed around with compilation for the RH systems but ran into trouble with ./configure wanting the metakit library even after I installed it. This isn't so important because I found Clinton Work's RPMs: http://www.scripty.com/grub/ The rh7.2 RPMs run just fine on my 7.3, but neither the rh8.0 nor the rh7.3 RPMs run on my 8.0 -- I get this:

[Tue May 06 23:49:55 2003] [debug] CrawlInsert(): 'http://www.abc.net.au/eyre/coverage.htm: size=25061 CRC=3568670052 STATUS=0 MIME=text/html'
[Tue May 06 23:49:55 2003] [debug] GetRetrieve(): http://www.hkns.co.kr/2-1.htm
[Tue May 06 23:49:55 2003] [debug] Error with CURL session. cURL error number: 6

I've been getting notices from my firewall that excessive attempts are coming in on port 53 (DNS) destined for the RH8.0 box. The port 53 traffic is coming from the DNS server listed in the RH8.0 /etc/resolv.conf. Since I opened the port, the errors are more interesting:

[Wed May 07 00:11:42 2003] [debug] GetRetrieve(): http://www.amazon.com/exec/obidos/tg/browse/-/851278/ref=br_bx_c_1_7/102-0106618-5387335
[Wed May 07 00:11:46 2003] [debug] Error with CURL session. cURL error number: 7
[Wed May 07 00:11:46 2003] [debug] Error with CURL session. cURL error number: 7
[Wed May 07 00:11:46 2003] [debug] Error with CURL session. cURL error number: 28
[Wed May 07 00:11:46 2003] [debug] Error with CURL session. cURL error number: 7
[Wed May 07 00:11:46 2003] [debug] CrawlInsert(): 'http://www.alltimevideo.com/westerns_detail.asp?Name=Smith+Ballew: size=1 CRC=0 STATUS=4 MIME=text/plain'
[Wed May 07 00:11:46 2003] [debug] GetRetrieve(): http://www.kayaknews.ca/yak/boats/Pyranhai3_222.shtml
[Wed May 07 00:11:46 2003] [debug] Error with CURL session. cURL error number: 6
[Wed May 07 00:11:46 2003] [debug] Error with CURL session. cURL error number: 28
[Wed May 07 00:11:46 2003] [debug] Error with CURL session. cURL error number: 28

The end of my troubleshooting at this point is running named on the RH8.0 system -- but I'm still getting errors. Any ideas? -zeek |
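[Editor's note: the numbers in these log entries are libcurl CURLcode values; the mapping below is taken from curl's own error-code documentation, not from anything grub-specific. A small sketch that names the three codes seen in this log:]

```shell
#!/bin/sh
# Translate the numeric cURL error codes from the grub client's debug log
# into human-readable names (subset only; see curl.h for the full list).
curl_errname() {
  case "$1" in
    6)  echo "CURLE_COULDNT_RESOLVE_HOST (DNS lookup failed)" ;;
    7)  echo "CURLE_COULDNT_CONNECT (TCP connect refused or unreachable)" ;;
    28) echo "CURLE_OPERATION_TIMEDOUT (transfer timed out)" ;;
    *)  echo "unknown code $1 -- see curl.h" ;;
  esac
}

# Summarize which codes dominate a log file, e.g.:
#   grep -o 'cURL error number: [0-9]*' grub.log | awk '{print $NF}' | sort | uniq -c
curl_errname 6
```

Code 6 on nearly every URL is consistent with the DNS trouble described above: the client cannot resolve hostnames at all, so checking /etc/resolv.conf and the firewall's port 53 rules is the right first step.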
From: <Des...@sy...> - 2003-04-07 04:10:50
|
How do I change the number of URLs? Instead of 500, I would like something like 2000-10000. |
From: Kord C. <ko...@te...> - 2003-03-13 00:21:16
|
Grubsters, A new forum for discussing the Grub project is now online at: http://www.grub.org/forums/ Please begin using the forum for discussing any bug reports, issues, or feature requests related to the Grub project. The forum runs on phpBB and has tons more features than we have available on our forum here at sf.net. We are also in need of moderators on the new forum, so drop me a line at ko...@gr... if you are interested. Go Grub! Kord Campbell Client ID #1 |
From: <fo...@va...> - 2003-03-10 18:38:35
|
Hi, The problem is that I'm not running the windows-client... (linux version). On Mon, 10 Mar 2003, Igor Stojanovski wrote: > Folkert, > > The current Windows client has a scheduler that you can use to set your > client to crawl whenever you want it to. The time increments for > controling the crawl time is 15 minutes. Therefore, you could set the > client to crawl no more than 15 minutes. > > I hope this was helpful. > > Cheers, > > Ozra. > > On Sun, 16 Feb 2003, Folkert van Heusden wrote: > > > Hi, > > > > I have another suggestion: a time limit for the crawling > > process. Say, I want to crawl for no more then 5 minutes. > > Handy for people with a dial-up connection in countries > > where you pay per minute (the netherlands for example). > > > > > > Folkert van Heusden > > www.vanheusden.com > > > > > > > > ------------------------------------------------------- > > This sf.net email is sponsored by:ThinkGeek > > Welcome to geek heaven. > > http://thinkgeek.com/sf > > _______________________________________________ > > Grub-client mailing list > > Gru...@li... > > https://lists.sourceforge.net/lists/listinfo/grub-client > > > > |
From: <fo...@va...> - 2003-03-10 18:37:24
|
checking/unchecking is rather tricky for me: I'm running the client in console-mode on a linux-box through a telnet-session. On Mon, 10 Mar 2003, Igor Stojanovski wrote: > Folkert, > > Options to control crawling local URLs only are planned for the next > client release. You should be able to choose your preferences by simply > checking/unchecking check boxes from within the client's preferences > dialog box. > > Cheers, > > Ozra. > > On Sun, 16 Feb 2003, Folkert van Heusden wrote: > > > It would be nice if there were an option in grubclient to limit > > the crawling to certain URLs. Like: I only want to check the > > URLs on my local webserver (keetweej.vanheusden.com / > > vanheusden.com / testcentrumgouda.nl). > > > > > > Folkert. > > > > > > ------------------------------------------------------- > > This sf.net email is sponsored by:ThinkGeek > > Welcome to geek heaven. > > http://thinkgeek.com/sf > > _______________________________________________ > > Grub-client mailing list > > Gru...@li... > > https://lists.sourceforge.net/lists/listinfo/grub-client > > > > |
From: Igor S. <oz...@gr...> - 2003-03-10 18:29:40
|
Folkert, The current Windows client has a scheduler that you can use to set your client to crawl whenever you want it to. The time increment for controlling the crawl time is 15 minutes. Therefore, you could set the client to crawl for no more than 15 minutes. I hope this was helpful. Cheers, Ozra.

On Sun, 16 Feb 2003, Folkert van Heusden wrote:
> Hi,
>
> I have another suggestion: a time limit for the crawling
> process. Say, I want to crawl for no more than 5 minutes.
> Handy for people with a dial-up connection in countries
> where you pay per minute (the Netherlands, for example).
>
> Folkert van Heusden
> www.vanheusden.com
 |
From: Igor S. <oz...@gr...> - 2003-03-10 18:25:23
|
Brian, At this point every client gets the same URL load. We plan to improve on this design either by giving larger data sets to the "more capable" clients, or by having the client get a new data set while still crawling the old one, thus minimizing idle crawl time. Cheers, Ozra. On Wed, 19 Feb 2003, Brian Heckathorne wrote: > > My request! > An option for amount of URLs to grab would be nice. My machine eats up 500 > URLs quickly. > > > > Brian > GJCN > > > > ------------------------------------------------------- > This sf.net email is sponsored by:ThinkGeek > Welcome to geek heaven. > http://thinkgeek.com/sf > _______________________________________________ > Grub-client mailing list > Gru...@li... > https://lists.sourceforge.net/lists/listinfo/grub-client > |
From: Igor S. <oz...@gr...> - 2003-03-10 18:22:36
|
Folkert, Options to control crawling local URLs only are planned for the next client release. You should be able to choose your preferences by simply checking/unchecking check boxes from within the client's preferences dialog box. Cheers, Ozra. On Sun, 16 Feb 2003, Folkert van Heusden wrote: > It would be nice if there were an option in grubclient to limit > the crawling to certain URLs. Like: I only want to check the > URLs on my local webserver (keetweej.vanheusden.com / > vanheusden.com / testcentrumgouda.nl). > > > Folkert. > > > ------------------------------------------------------- > This sf.net email is sponsored by:ThinkGeek > Welcome to geek heaven. > http://thinkgeek.com/sf > _______________________________________________ > Grub-client mailing list > Gru...@li... > https://lists.sourceforge.net/lists/listinfo/grub-client > |
From: Igor S. <oz...@gr...> - 2003-02-20 17:01:18
|
Feel free to run multiple instances of the grub client. All the efforts contribute to cumulative crawling totals. It may take some time for the client to actually show them correctly, but eventually it will. For example, the Windows client uploads the stats from the central server once an hour. Hope this helps, Ozra. On Thu, 20 Feb 2003, Guillermo BT wrote: > That's the same question I was going to do, because > I'm running the Windows grubclient and when trying to > run another client from a linux-box with same grub-user id, > the figures doesn't change (it doesn't download a thing and > it doesn't crawl at all). > > > Guillermo BT > gu...@fa... > > > On Wed, 19 Feb 2003, Brian Heckathorne wrote: > > > Should i run more clients on many machines with the same login/password? > > > > > > Brian > > GJCN > > > > > > > > ------------------------------------------------------- > > This SF.net email is sponsored by: SlickEdit Inc. Develop an edge. > > The most comprehensive and flexible code editor you can use. > > Code faster. C/C++, C#, Java, HTML, XML, many more. FREE 30-Day Trial. > > www.slickedit.com/sourceforge > > _______________________________________________ > > Grub-client mailing list > > Gru...@li... > > https://lists.sourceforge.net/lists/listinfo/grub-client > > > > > > ------------------------------------------------------- > This SF.net email is sponsored by: SlickEdit Inc. Develop an edge. > The most comprehensive and flexible code editor you can use. > Code faster. C/C++, C#, Java, HTML, XML, many more. FREE 30-Day Trial. > www.slickedit.com/sourceforge > _______________________________________________ > Grub-client mailing list > Gru...@li... > https://lists.sourceforge.net/lists/listinfo/grub-client > |
From: Igor S. <oz...@gr...> - 2003-02-20 16:54:43
|
Guillermo, Unfortunately, the client under *BSD runs unreliably at best. You will have to stick to your Windows and Linux clients. Fixing the *BSD client is something that's been lingering around for a very long time, and perhaps one day we will get around to fixing it ;) Cheers, Ozra.

On Wed, 19 Feb 2003, Guillermo BT wrote:
> Hello,
>
> Now I'm getting this error (core dumped) when the client has run
> for a few minutes (3 or 4):
>
> #------------------------------------------------------------------------------#
> | Grub Distributed Network Crawling Agent   Version 1.0.4   Feb 19, 2003        |
> | Tech Support Available: http://www.grub.org or Email us at su...@gr...        |
> #------------------------------------------------------------------------------#
> | URL Set Completion (500 URLS)                                                 |
> | [################--------------------------------------------------------]    |
>
> Fatal error 'siglongjmp()ing between thread contexts is undefined by POSIX
> 1003.1' at line ? in file /usr/src/lib/libc_r/uthread/uthread_jmp.c (errno = ?)
>
> Abort trap (core dumped)
>
> Are there any hints for this?
>
> BTW: if it's useful, this same computer also gives me problems when
> using the distributed.net client (I suppose you know what I'm talking about)
>
> Guillermo BT.
> gu...@fa...
>
> On Tue, 18 Feb 2003, Igor Stojanovski wrote:
> > Guillermo,
> >
> > Try deleting the grub.lock file from your system, which may be located in
> > /usr/local/var/grub or perhaps somewhere else (do 'locate grub.lock' to
> > find it). Then try to run the client again. If this does not work, let me
> > know.
> >
> > Cheers,
> >
> > Ozra.
> >
> > On Sun, 16 Feb 2003, Guillermo wrote:
> > > I managed to compile the package on my FreeBSD 4.7 box
> > > (I had to change the -lpthread lines to -pthread for successful compilation)
> > >
> > > I did the make install, configured the grub.conf file, and when trying to execute
> > > I got the message:
> > >
> > > grub: Another instance is already running
> > >
> > > which is erroneous since I'm not running any grub process at all.
> > >
> > > Can you tell me what's wrong?
> > >
> > > Thanks.
> > >
> > > NiC
> > > gu...@fa...
 |
From: Guillermo BT <gu...@fa...> - 2003-02-20 12:00:42
|
I was going to ask the same question, because I'm running the Windows grubclient, and when trying to run another client from a linux box with the same grub user id, the figures don't change (it doesn't download a thing and it doesn't crawl at all). Guillermo BT gu...@fa...

On Wed, 19 Feb 2003, Brian Heckathorne wrote:
> Should I run more clients on many machines with the same login/password?
>
> Brian
> GJCN
 |
From: Brian H. <Br...@GJ...> - 2003-02-19 19:46:26
|
Should I run more clients on many machines with the same login/password? Brian GJCN |
From: Brian H. <Br...@GJ...> - 2003-02-19 19:45:00
|
My request! An option for the number of URLs to grab would be nice. My machine eats up 500 URLs quickly. Brian GJCN |
From: Guillermo BT <gu...@fa...> - 2003-02-19 11:41:10
|
Hello,

Now I'm getting this error (core dumped) when the client has run for a few minutes (3 or 4):

#------------------------------------------------------------------------------#
| Grub Distributed Network Crawling Agent   Version 1.0.4   Feb 19, 2003        |
| Tech Support Available: http://www.grub.org or Email us at su...@gr...        |
#------------------------------------------------------------------------------#
| URL Set Completion (500 URLS)                                                 |
| [################--------------------------------------------------------]    |

Fatal error 'siglongjmp()ing between thread contexts is undefined by POSIX 1003.1' at line ? in file /usr/src/lib/libc_r/uthread/uthread_jmp.c (errno = ?)

Abort trap (core dumped)

Are there any hints for this?

BTW: if it's useful, this same computer also gives me problems when using the distributed.net client (I suppose you know what I'm talking about)

Guillermo BT.
gu...@fa...

On Tue, 18 Feb 2003, Igor Stojanovski wrote:
> Guillermo,
>
> Try deleting the grub.lock file from your system, which may be located in
> /usr/local/var/grub or perhaps somewhere else (do 'locate grub.lock' to
> find it). Then try to run the client again. If this does not work, let me
> know.
>
> Cheers,
>
> Ozra.
>
> On Sun, 16 Feb 2003, Guillermo wrote:
> > I managed to compile the package on my FreeBSD 4.7 box
> > (I had to change the -lpthread lines to -pthread for successful compilation)
> >
> > I did the make install, configured the grub.conf file, and when trying to execute
> > I got the message:
> >
> > grub: Another instance is already running
> >
> > which is erroneous since I'm not running any grub process at all.
> >
> > Can you tell me what's wrong?
> >
> > Thanks.
> >
> > NiC
> > gu...@fa...
 |
From: Guillermo BT <gu...@fa...> - 2003-02-19 11:16:53
|
Thank you very much. That worked Igor!. Guillermo. gu...@fa... On Tue, 18 Feb 2003, Igor Stojanovski wrote: > Guillermo, > > Try deleting grub.lock file from your system, which may be located in > /usr/local/var/grub or perhaps somewhere else (do 'locate grub.lock' to > find it). Then try to run the client again. If this doe not work, let me > know. > > Cheers, > > Ozra. > > On Sun, 16 Feb 2003, Guillermo wrote: > > > > > I managed to compile the package in my FreeBSD 4.7 box > > (I had to change the -lpthread lines for -pthread for successful compilation) > > > > I did the make install, configured the grub.conf file and when trying to execute > > I got the message: > > > > grub: Another instance is already running > > > > which is erroneus since I'm not running any grub process at all. > > > > Can you tell me what's wrong? > > > > Thanks. > > > > NiC > > gu...@fa... > > . > > > > > > ------------------------------------------------------- > > This sf.net email is sponsored by:ThinkGeek > > Welcome to geek heaven. > > http://thinkgeek.com/sf > > _______________________________________________ > > Grub-client mailing list > > Gru...@li... > > https://lists.sourceforge.net/lists/listinfo/grub-client > > > > > > ------------------------------------------------------- > This sf.net email is sponsored by:ThinkGeek > Welcome to geek heaven. > http://thinkgeek.com/sf > _______________________________________________ > Grub-client mailing list > Gru...@li... > https://lists.sourceforge.net/lists/listinfo/grub-client > |
From: Igor S. <oz...@gr...> - 2003-02-18 18:28:48
|
Guillermo, Try deleting the grub.lock file from your system, which may be located in /usr/local/var/grub or perhaps somewhere else (do 'locate grub.lock' to find it). Then try to run the client again. If this does not work, let me know. Cheers, Ozra.

On Sun, 16 Feb 2003, Guillermo wrote:
> I managed to compile the package on my FreeBSD 4.7 box
> (I had to change the -lpthread lines to -pthread for successful compilation)
>
> I did the make install, configured the grub.conf file, and when trying to execute
> I got the message:
>
> grub: Another instance is already running
>
> which is erroneous since I'm not running any grub process at all.
>
> Can you tell me what's wrong?
>
> Thanks.
>
> NiC
> gu...@fa...
 |
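[Editor's note: the stale-lock recovery described above can be sketched as a small script. The default path is only an assumption from this message -- your build prefix may differ -- and the process-name pattern `grubclient` is hypothetical:]

```shell
#!/bin/sh
# Sketch: remove a stale grub.lock before restarting the client.
clean_stale_lock() {
  lock="$1"
  if [ ! -e "$lock" ]; then
    echo "no lock file at $lock"
  elif ps ax 2>/dev/null | grep -q '[g]rubclient'; then
    # A live client owns the lock; removing it now could corrupt its state.
    echo "grub appears to be running; leaving $lock alone"
  else
    rm -f "$lock" && echo "removed stale lock: $lock"
  fi
}

# Default location is an assumption; use 'locate grub.lock' to confirm.
clean_stale_lock "${1:-/usr/local/var/grub/grub.lock}"
```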
From: Kord C. <ko...@gr...> - 2003-02-16 20:25:55
|
Hi all! A new release was posted on Friday, version 1.0.7. This build fixes the following bugs:

- double crawl problem with the IE control
- bandwidth sliders deadlock causing crash

In addition, a server-side fix has been made to the scheduler, which is responsible for handing out URLs. If you look at the graph on our site, you'll see periods of time where URLs were not crawled. These "no crawl" times are attributed to software crashes in the scheduler, which result in URLs not being handed out to the clients. We are in the process of live testing the new fix for the server, and we will keep you posted on any problems that we observe. Should you experience any difficulties with either the new release or connecting to the server, please let us know. Thanks for running the client!

Kord

--
Kord Campbell, President, Grub, Inc.
5500 North Western Avenue #101C, Oklahoma City, OK 73118
ko...@gr...  Voice: (405) 848-7000  Fax: (405) 848-5477  http://www.grub.org
 |
From: Kord C. <ko...@gr...> - 2003-02-16 20:18:23
|
Hi, This is a known issue with builds on BSD. We are working toward fixing functionality on BSD (especially for Jaguar) so that the client will compile nicely and run stably. We have problems mostly with the threads, but there are some issues with file access, such as this bug. I think that if you restart, the error will go away. If I recall, the error happens every other run cycle. Keep us posted. Thanks, Kord

On Sun, 16 Feb 2003, Guillermo wrote:
> I managed to compile the package on my FreeBSD 4.7 box
> (I had to change the -lpthread lines to -pthread for successful compilation)
>
> I did the make install, configured the grub.conf file, and when trying to execute
> I got the message:
>
> grub: Another instance is already running
>
> which is erroneous since I'm not running any grub process at all.
>
> Can you tell me what's wrong?
>
> Thanks.
>
> NiC
> gu...@fa...
 |
From: Folkert v. H. <fo...@va...> - 2003-02-16 13:45:02
|
Hi, I have another suggestion: a time limit for the crawling process. Say, I want to crawl for no more than 5 minutes. Handy for people with a dial-up connection in countries where you pay per minute (the Netherlands, for example). Folkert van Heusden www.vanheusden.com |
From: Folkert v. H. <fo...@va...> - 2003-02-16 13:33:53
|
It would be nice if there were an option in grubclient to limit the crawling to certain URLs. Like: I only want to check the URLs on my local webserver (keetweej.vanheusden.com / vanheusden.com / testcentrumgouda.nl). Folkert. |
From: Guillermo <gu...@fa...> - 2003-02-16 12:48:45
|
I managed to compile the package on my FreeBSD 4.7 box (I had to change the -lpthread lines to -pthread for successful compilation). I did the make install, configured the grub.conf file, and when trying to execute I got the message:

grub: Another instance is already running

which is erroneous since I'm not running any grub process at all. Can you tell me what's wrong? Thanks. NiC gu...@fa... |
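[Editor's note: the -lpthread-to-pthread change mentioned above can be applied mechanically after ./configure. This is a sketch of that workaround, not part of the grub build system; on FreeBSD 4.x the userland threads library is pulled in via the compiler driver's -pthread flag rather than an explicit -lpthread:]

```shell
#!/bin/sh
# Rewrite -lpthread into -pthread in every generated Makefile under a tree.
fix_pthread_flags() {
  for mk in $(find "$1" -name Makefile); do
    # Portable in-place edit: write to a temp file, then move it back.
    sed 's/-lpthread/-pthread/g' "$mk" > "$mk.tmp" && mv "$mk.tmp" "$mk"
  done
}

# Usage, from the unpacked grub source tree:
#   ./configure && fix_pthread_flags . && make
```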
From: <sfo...@ea...> - 2003-02-13 23:05:39
|
Hi, I've been dutifully crawling my own site, installing the grub.txt, and periodically running the local scan from http://brainbug.grub.org/localcrawling/local_crawl.php, and still I see OTHER people's grub-clients crawling my site. What gives? What's the point of keeping my own site's database info updated if everybody else is just going to redo what I've done? Doesn't grub-client honor the presence of a grub.txt? Thanks in advance to anybody who can clarify this for me. Suzanne Forgach |
From: Kord C. <ko...@gr...> - 2003-02-04 16:11:24
|
Florian, Thanks for the feedback! The grub.conf file doesn't contain the path information; you have to put it on the command line when you start the client:

grubclient -l <path to log/lock> -f <path to grub.conf>

I'm assuming that you are running the client in the woody package, and if the above doesn't work, try compiling a new version from the tarball on our site. When you do compile, be sure that curl is in your library path for the linker to find. On some versions of Linux (and most of the BSDs), you need to do one of a couple of things to fix this.

The first is to set an environment variable that gives the path to the libraries, which you can do from the shell:

LD_LIBRARY_PATH=/usr/local/lib/curl (for example)
export LD_LIBRARY_PATH

Your system may expect a different name for the library path variable, so check what it needs before you do this.

The second method is to use "ldconfig" to specify additional directories for the linker to search when it is linking. Mileage will vary here, so do a man on ldconfig before proceeding. Some systems have the file ld.so.conf that you can edit to put in more directories.

The third, and least desirable, method would be to copy the libraries into your "standard" library directories (/usr/lib and /usr/include). Purists and sysadmins will frown on this as it doesn't keep the system nice and neat, and it could potentially overwrite necessary libraries that were installed with the default system. I've done this on occasion, but I know what files I'm copying and what I'm not, so be careful if you choose this method of fixing your problem.

Hope this helps - thanks for running the client! Kord

On 4 Feb 2003, Florian Moellers wrote:
> hi,
>
> A) I'm running the grubclient on my debian (woody) pc and am trying to
> configure it properly.
> Can I write the path to the log and lock files in the grub.conf file, or
> must I specify it on the command line?
> Just putting the path at the end of the file causes a syntax error.
>
> B) When I tried to compile the new version of the client, the configure
> script could not find the curl library, although it is installed and
> the old client runs properly.
>
> regards
>
> florian
 |
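[Editor's note: the three approaches above, consolidated into one sketch. The curl library directory is an assumption -- point it at wherever libcurl actually landed on your system (try: find /usr -name 'libcurl*'):]

```shell
#!/bin/sh
# Help the linker find libcurl.
CURL_LIB_DIR="${CURL_LIB_DIR:-/usr/local/lib}"

# 1. Per-shell environment variable (no root required). Appends to any
#    existing LD_LIBRARY_PATH instead of clobbering it:
LD_LIBRARY_PATH="$CURL_LIB_DIR${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}"
export LD_LIBRARY_PATH
echo "LD_LIBRARY_PATH is now: $LD_LIBRARY_PATH"

# 2. System-wide, via the runtime linker's cache (root required):
#      echo "$CURL_LIB_DIR" >> /etc/ld.so.conf && ldconfig

# 3. Least desirable: copy libcurl.* into /usr/lib by hand (risks
#    clobbering libraries installed by the base system).
```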
From: Florian M. <fo...@jp...> - 2003-02-04 00:20:26
|
hi, A) I'm running the grubclient on my debian (woody) pc and am trying to configure it properly. Can I write the path to the log and lock files in the grub.conf file, or must I specify it on the command line? Just putting the path at the end of the file causes a syntax error. B) When I tried to compile the new version of the client, the configure script could not find the curl library, although it is installed and the old client runs properly. regards florian |