From: Adam G. <whi...@wh...> - 2003-08-19 08:45:06
|
I have just installed bobs 0.6.0 (from rpm... worked well, as far as I can see). However, it took a bit of fiddling to make it work properly with my PHP version, 4.3.2, and its php-dba extension, which apparently only includes support for these handlers: gdbm cdb cdb_make db4 flatfile.
Replacing all instances of 'db3' with 'db4', however, did not work... it killed the Apache child on each load of systemcheck.php... so I tried 'gdbm'. The db4 problem may be in my installation (Mandrake Cooker RPMs of Apache, PHP and the PHP mods). Will gdbm, or db4 (if I can get it to work at all), function properly for BOBS?
System Info:
Apache-AdvancedExtranetServer/2.0.47 (Mandrake Linux/4mdk) auth_mysql/1.11 mod_ssl/2.0.47 OpenSSL/0.9.7b DAV/2 PHP/4.3.2
Loaded modules:
core prefork http_core mod_so mod_access mod_auth mod_auth_anon mod_auth_digest mod_include mod_log_config mod_env mod_expires mod_headers mod_usertrack mod_setenvif mod_mime mod_status mod_autoindex mod_asis mod_info mod_cgi mod_vhost_alias mod_negotiation mod_dir mod_imap mod_actions mod_speling mod_userdir mod_alias mod_rewrite mod_auth_mysql mod_ssl mod_dav mod_dav_fs mod_suexec mod_php4
PHP configure line:
'./configure' '--prefix=/usr' '--libdir=/usr/lib' '--enable-discard-path' '--disable-force-cgi-redirect' '--enable-shared' '--disable-static' '--disable-debug' '--disable-rpath' '--enable-pic' '--enable-inline-optimization' '--enable-memory-limit' '--with-config-file-path=/etc' '--with-config-file-scan-dir=/etc/php' '--with-pear=/usr/share/pear' '--enable-magic-quotes' '--enable-debugger' '--enable-track-vars' '--with-exec-dir=/usr/bin' '--with-versioning' '--with-mod_charset' '--with-regex=php' '--enable-track-vars' '--enable-trans-sid' '--enable-safe-mode' '--enable-ctype' '--enable-ftp' '--with-gettext=/usr' '--enable-posix' '--enable-session' '--enable-sysvsem' '--enable-sysvshm' '--enable-yp' '--with-openssl=/usr' '--without-kerberos' '--with-ttf' '--with-freetype-dir=/usr' '--with-zlib=/usr' '--with-zlib=/usr' '--with-zlib-dir=/usr'
--
Adam Goldstein
White Wolf Networks
http://whitewlf.net |
|
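Adam's 'db3' → 'db4' swap can be done mechanically. A minimal sketch under stated assumptions: the throwaway temp tree below stands in for the real BOBS source directory (e.g. /var/www/html/bobs, which varies per install), and `php -r 'print_r(dba_handlers());'` will list the handlers a given PHP build actually supports.

```shell
# Replace every quoted handler name 'db3' with 'db4' across a source
# tree. A throwaway temp directory stands in for the BOBS install path
# so the commands are safe to try anywhere.
SRC=$(mktemp -d)
printf "dba_open(\$file, 'c', 'db3');\n" > "$SRC/class_db.php"
grep -rl "'db3'" "$SRC" | while read -r f; do
  sed -i.bak "s/'db3'/'db4'/g" "$f"   # keep a .bak copy of each file
done
grep -h "db" "$SRC/class_db.php"      # now reads: dba_open($file, 'c', 'db4');
```

On a real install, point SRC at the BOBS directory and review the .bak files before deleting them.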
From: James W. B. <jbe...@mi...> - 2003-08-14 16:51:27
|
Rene:
Thanks for the suggestion, explanation and the options! I'm considering the
programming option because I think it's really useful. I'll let you know
what I come up with (if anything) before I do it.
James
----- Original Message -----
From: "Rene Rask" <re...@gr...>
To: <bob...@li...>
Sent: Thursday, August 14, 2003 9:45 AM
Subject: Re: [Bobs-devel] Periodic Cleanup of Backup Area
>
> This is an example usage of find:
>
> It will find all files modified more than 30 days ago which are larger
> than 5000KB and list them using ls long format (ls -l)
>
> find /home/rene/ -mtime +30 -size +5000k -type f -exec ls -l {} \;
>
> If you want to delete files you can replace "ls -l" with "rm -f" but be
> careful. You don't get any warnings.
>
> I usually test with "ls -l" before deleting. It gives an idea of what
> I'm deleting.
> You can also spice this up by only searching for files which end in
> ".avi" ".mpg" ".mov" and so on.
> Throw a bunch of these in a script file and use cron to execute it every
> night if you don't want to do it manually.
>
> Cheers
>
> Rene
>
> On Wed, 2003-08-13 at 21:54, James W. Beauchamp wrote:
> > Hi:
> > I have an administrative question concerning BOBS. Does someone have some
> > sort of script that periodically runs on the BOBS backup area and deletes
> > all files over X days old? It appears that something like this would be
> > nice since it keeps your hard disk from just filling up. Any chance of
> > something like this making it into the project? (self serving Feature
> > Request from a non-coder ;))
> >
> > If I'm missing something let me know.
> >
> > Thanks
> >
> > James
> >
> >
> >
> > -------------------------------------------------------
> > This SF.Net email sponsored by: Free pre-built ASP.NET sites including
> > Data Reports, E-commerce, Portals, and Forums are available now.
> > Download today and enter to win an XBOX or Visual Studio .NET.
> >
> > http://aspnet.click-url.com/go/psa00100003ave/direct;at.aspnet_072303_01/01
> > _______________________________________________
> > Bobs-devel mailing list
> > Bob...@li...
> > https://lists.sourceforge.net/lists/listinfo/bobs-devel
> --
> Rene Rask <re...@gr...>
>
>
>
|
|
From: Rene R. <re...@gr...> - 2003-08-14 13:56:27
|
This is an example usage of find:
It will find all files modified more than 30 days ago which are larger
than 5000KB and list them using ls long format (ls -l)
find /home/rene/ -mtime +30 -size +5000k -type f -exec ls -l {} \;
If you want to delete files you can replace "ls -l" with "rm -f" but be
careful. You don't get any warnings.
I usually test with "ls -l" before deleting. It gives an idea of what
I'm deleting.
You can also spice this up by only searching for files which end in
".avi" ".mpg" ".mov" and so on.
Throw a bunch of these in a script file and use cron to execute it every
night if you don't want to do it manually.
Cheers
Rene
On Wed, 2003-08-13 at 21:54, James W. Beauchamp wrote:
> Hi:
> I have an administrative question concerning BOBS. Does someone have some
> sort of script that periodically runs on the BOBS backup area and deletes
> all files over X days old? It appears that something like this would be
> nice since it keeps your hard disk from just filling up. Any chance of
> something like this making it into the project? (self serving Feature
> Request from a non-coder ;))
>
> If I'm missing something let me know.
>
> Thanks
>
> James
>
>
>
--
Rene Rask <re...@gr...>
|
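Rene's find invocation drops into a cron-ready script almost verbatim. A sketch under stated assumptions: the /backup/bobsdata area and the 30-day age and 5000k size cutoffs are placeholders to tune, and the script defaults to a dry run.

```shell
#!/bin/sh
# Nightly cleanup pass over a backup area: list (or, with "delete",
# remove) files larger than 5000KB not modified in the last 30 days.
AREA=${BACKUP_AREA:-/backup/bobsdata}   # assumed default location
[ -d "$AREA" ] || { echo "no such backup area: $AREA"; exit 0; }
if [ "$1" = "delete" ]; then
  find "$AREA" -type f -mtime +30 -size +5000k -exec rm -f {} \;
else
  find "$AREA" -type f -mtime +30 -size +5000k -exec ls -l {} \;   # dry run
fi
```

After checking the dry-run listing, a crontab line such as `5 3 * * * root /usr/local/sbin/bobs_cleanup delete` (the script name and schedule are placeholders) would run it nightly, as Rene suggests.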
|
From: Rene R. <re...@gr...> - 2003-08-13 20:49:08
|
On Wed, 2003-08-13 at 21:54, James W. Beauchamp wrote:
> Hi:
> I have an administrative question concerning BOBS. Does someone have some
> sort of script that periodically runs on the BOBS backup area and deletes
> all files over X days old? It appears that something like this would be
> nice since it keeps your hard disk from just filling up. Any chance of
> something like this making it into the project? (self serving Feature
> Request from a non-coder ;))
>
You just hit the soft spot ;) I would love to include something like that
and have been thinking about it for a long time. Personal and work issues
have kept me from working on this.
I have been doing it manually. It always depends on the load the servers
are under. Sometimes I just delete files of certain types, like QuickTime
and AVI, which are larger than, say, 100 MB in size and older than 30 days.
Then if I need more space I delete those larger than 50 MB and older than
30 days, and so on. This depends a lot on the work people are doing. I
suggest you look at the "find" man pages for help with this.
If you make a shell script for doing this, please consider sending it to
the list. Maybe I can use it for bobs. It may be included as an option when
I have the resources to work with it. If you really want something good
done, I suggest you try finding a kind programmer who will help you. I'll
be happy to include it with bobs if it isn't a really ugly hack. If it
works it can always be refined.
Implementing a delete feature in bobs wouldn't be that hard. The search
function could search for the files, and an added "delete files" button
could delete them. The problem would be one of security: a check would have
to be made to ensure that only admins delete files. Once that is working it
could be made part of the cron jobs. I would also guess that would be
fairly easy.
Cheers
Rene |
|
From: James W. B. <jbe...@mi...> - 2003-08-13 20:13:19
|
Hi:
I have an administrative question concerning BOBS. Does someone have some
sort of script that periodically runs on the BOBS backup area and deletes
all files over X days old? It appears that something like this would be
nice since it keeps your hard disk from just filling up. Any chance of
something like this making it into the project? (self serving Feature
Request from a non-coder ;))
If I'm missing something let me know.
Thanks
James |
|
From: Rene R. <re...@gr...> - 2003-07-14 20:10:13
|
Hi Roger
Could you please describe what you do step by step. The password shouldn't
be affected by any dba problems, since the password is not stored in a
database. I hope I can be of more help then, or that somebody else can.
Cheers
Rene
On Thu, 2003-07-10 at 18:38, Roger Persson wrote:
> Hi All,
>
> Just downloaded the bobs software and installed; looks perfect for what
> I want to do. I haven't gotten very far though, because I can't log in to
> the admin pages no matter what I set in admin.pwd. I looked through the
> PHP files and I think it has something to do with the db settings. When
> I log in with the server.share.ini I created, I get the following message:
>
> Warning: driver initialization failed in
> /var/www/html/bobs/inc/class_db.php on line 66
>
> I checked my PHP setup for dba support, and according to phpinfo.php it's
> installed and enabled.
>
> Any ideas?
>
> Best regards,
>
> Roger Persson
> JMS Digitalprint
> 040-427722
> 0708-424081
> rog...@jm...
>
> -------------------------------------------------------
> This SF.Net email sponsored by: Parasoft
> Error proof Web apps, automate testing & more.
> Download & eval WebKing and get a free book.
> www.parasoft.com/bulletproofapps |
|
From: Roger P. <ro...@di...> - 2003-07-10 16:37:26
|
Hi All,
Just downloaded the bobs software and installed; looks perfect for what I
want to do. I haven't gotten very far though, because I can't log in to the
admin pages no matter what I set in admin.pwd. I looked through the PHP
files and I think it has something to do with the db settings. When I log
in with the server.share.ini I created, I get the following message:
Warning: driver initialization failed in
/var/www/html/bobs/inc/class_db.php on line 66
I checked my PHP setup for dba support, and according to phpinfo.php it's
installed and enabled.
Any ideas?
Best regards,
Roger Persson
JMS Digitalprint
040-427722
0708-424081
rog...@jm... |
|
From: Joe Z. <jz...@at...> - 2003-06-28 07:01:20
|
James W. Beauchamp wrote:
>Joe:
>It's all working like a charm. I have to say this is the neatest piece of
>software I've seen in awhile.
>Many thanks to all who are working on it. I am continually amazed at the
>creativity of all the folks that work on open source software.
>This app. in particular lets you take an old box that you were maybe going
>to throw away, throw in some new disks and just let it sit there for when
>its needed. It is much easier to pull a file off this for a user than to
>pull it from tape. I've done it once or twice now for people and they can't
>believe it when I restore the file while I'm on the phone with them! I'm
>sure this is old hat to everyone else, but its kind of neat to me. :))
>
>Thanks again.
>
>James
>
Thank you James! Feedback like this makes it all worthwhile. I worked on
the admin interface and packaging, but Rene Rask deserves all the credit
for writing bobs.
Joe |
|
From: James W. B. <jbe...@mi...> - 2003-06-26 13:57:03
|
Joe:
It's all working like a charm. I have to say this is the neatest piece of
software I've seen in awhile.
Many thanks to all who are working on it. I am continually amazed at the
creativity of all the folks that work on open source software.
This app in particular lets you take an old box that you were maybe going
to throw away, throw in some new disks and just let it sit there for when
it's needed. It is much easier to pull a file off this for a user than to
pull it from tape. I've done it once or twice now for people and they can't
believe it when I restore the file while I'm on the phone with them! I'm
sure this is old hat to everyone else, but it's kind of neat to me. :))
Thanks again.
James
----- Original Message -----
From: "Joe Zacky" <jz...@at...>
To: <bob...@li...>
Sent: Wednesday, June 25, 2003 10:29 PM
Subject: Re: [Bobs-devel] Install Question
> James W. Beauchamp wrote:
>
> >O.K.
> >I think I solved it, but here is where I made a mistake. I think it's
> >because I was confused slightly by the setup of the share. On the setup
> >screen for a new share you are asked the name of the server and the share
> >identification. I took this to mean 'a text description of the share'. So
> >I typed in 'Admin Directory'. However, I did not realize that this would
> >be used as the name of the directory created under
> >/backup/bobsdata/current/process/mounts/main, where 'main' is my server
> >name. The tip was that the directory was named 'Admin Directory', just
> >like I had typed in as a description. When I ran the first script by hand
> >as described below, it was choking on mounting that directory. So I went
> >in and deleted that share, created it anew using just the word 'admin' as
> >a description, and now it passes the configuration test with no errors
> >(except that a backup doesn't exist).
> >
> >So I guess I'm dense, or maybe a slight clarification is needed on that
> >entry box for people like me (non hackers ;) )
> >
> >Bottom line is I think it's working... I guess I'll know in 24 hours...
> >If you'd like me to test something else on this let me know... [or just
> >quietly go away :) ]
> >
> >Thanks
> >
> >James
> >
> I'll change that text to say 'share name' instead of description. You're
> right that it's misleading, if not just plain incorrect.
>
> So did your backup work?
>
> Joe
>
> -------------------------------------------------------
> This SF.Net email is sponsored by: INetU
> Attention Web Developers & Consultants: Become An INetU Hosting Partner.
> Refer Dedicated Servers. We Manage Them. You Get 10% Monthly Commission!
> INetU Dedicated Managed Hosting http://www.inetu.net/partner/index.php |
|
From: Joe Z. <jz...@at...> - 2003-06-26 02:29:33
|
James W. Beauchamp wrote:
>O.K.
>I think I solved it, but here is where I made a mistake. I think it's
>because I was confused slightly by the setup of the share. On the setup
>screen for a new share you are asked the name of the server and the share
>identification. I took this to mean 'a text description of the share'. So
>I typed in 'Admin Directory'. However, I did not realize that this would
>be used as the name of the directory created under
>/backup/bobsdata/current/process/mounts/main, where 'main' is my server
>name. The tip was that the directory was named 'Admin Directory', just
>like I had typed in as a description. When I ran the first script by hand
>as described below, it was choking on mounting that directory. So I went
>in and deleted that share, created it anew using just the word 'admin' as
>a description, and now it passes the configuration test with no errors
>(except that a backup doesn't exist).
>
>So I guess I'm dense, or maybe a slight clarification is needed on that
>entry box for people like me (non hackers ;) )
>
>Bottom line is I think it's working... I guess I'll know in 24 hours...
>If you'd like me to test something else on this let me know... [or just
>quietly go away :) ]
>
>Thanks
>
>James
>
I'll change that text to say 'share name' instead of description. You're
right that it's misleading, if not just plain incorrect.
So did your backup work?
Joe |
|
From: James W. B. <jbe...@mi...> - 2003-06-24 18:55:11
|
Arnaud:
I've seen this quite a number of times in the past when using smbmount to
mount shares on a Windoze box. If you look at the archives for samba you'll
see many references to this in relation to backups and backup software in
general. I believe there is a bug in smbmount that causes this, IIRC, and I
don't believe there is much you can do about it. If I've misspoken I'm sure
someone will correct me.
HTH
James
----- Original Message -----
From: "Arnaud CONCHON" <arn...@ar...>
To: <bob...@li...>
Sent: Tuesday, June 24, 2003 2:21 PM
Subject: RE: [Bobs-devel] Backup files from a NT4 server...
> Hi,
> I've been using BOBS for some time now and there is something strange I've
> been noticing...
> I run bobs on a Linux RH9 server with large disks which backs up several
> servers, such as NT4 SRV servers and Linux servers running Samba. I use
> the smb way for all the backups... (why not NFS, would you say?... I
> didn't have time to set this up, and moreover the smb way works just
> fine...). But speaking of the NT4 servers, I've noticed lots of backups
> done every day, lots of things in the incremental folder... After a little
> research, I saw that the backup server saved lots of files on the NT4
> server where neither the file's content changed, nor the modification
> time. BUT, it seems that my backup server saves files which have been
> lately accessed, and on which the "last accessed time" changed. For a
> reminder, on NTFS a file has 3 times:
> - creation time
> - modification time
> - last access time
>
> You can never see the "last access time" because Windows updates this one
> each time you access the file... and in my backups, the files in current
> and in incremental are strictly the same (same name, same creation/modif
> time)... Am I the only one with this issue? Is there a way to prevent this
> from happening (except kicking NT4 away for Linux based servers ;) )
>
> Bye
> Arnaud.
This message and all attachments are confidential and intended solely for
the addressees. Any use not in accord with its purpose, any dissemination
or disclosure, either whole or partial, is prohibited except formal
approval. If you receive this message in error, please delete it and
immediately notify the sender. Neither C3D Group nor any of its
subsidiaries or affiliates shall be liable for the message if altered,
changed or falsified. |
|
From: Arnaud C. <arn...@ar...> - 2003-06-24 18:22:07
|
Hi,
I've been using BOBS for some time now and there is something strange I've
been noticing...
I run bobs on a Linux RH9 server with large disks which backs up several
servers, such as NT4 SRV servers and Linux servers running Samba. I use the
smb way for all the backups... (why not NFS, would you say?... I didn't have
time to set this up, and moreover the smb way works just fine...). But
speaking of the NT4 servers, I've noticed lots of backups done every day,
lots of things in the incremental folder... After a little research, I saw
that the backup server saved lots of files on the NT4 server where neither
the file's content changed, nor the modification time. BUT, it seems that my
backup server saves files which have been lately accessed, and on which the
"last accessed time" changed. For a reminder, on NTFS a file has 3 times:
- creation time
- modification time
- last access time
You can never see the "last access time" because Windows updates this one
each time you access the file... and in my backups, the files in current and
in incremental are strictly the same (same name, same creation/modif
time)... Am I the only one with this issue? Is there a way to prevent this
from happening (except kicking NT4 away for Linux based servers ;) )
Bye
Arnaud. |
|
From: James W. B. <jbe...@mi...> - 2003-06-24 18:21:09
|
O.K.
I think I solved it, but here is where I made a mistake. I think it's
because I was confused slightly by the setup of the share. On the setup
screen for a new share you are asked the name of the server and the share
identification. I took this to mean 'a text description of the share'. So I
typed in 'Admin Directory'. However, I did not realize that this would be
used as the name of the directory created under
/backup/bobsdata/current/process/mounts/main, where 'main' is my server
name. The tip was that the directory was named 'Admin Directory', just like
I had typed in as a description. When I ran the first script by hand as
described below, it was choking on mounting that directory. So I went in
and deleted that share, created it anew using just the word 'admin' as a
description, and now it passes the configuration test with no errors
(except that a backup doesn't exist).
So I guess I'm dense, or maybe a slight clarification is needed on that
entry box for people like me (non hackers ;) )
Bottom line is I think it's working... I guess I'll know in 24 hours...
If you'd like me to test something else on this let me know... [or just
quietly go away :) ]
Thanks
James
----- Original Message -----
From: "Arnaud CONCHON" <arn...@ar...>
To: <jbe...@mi...>
Sent: Tuesday, June 24, 2003 1:52 PM
Subject: TR: [Bobs-devel] Install Question
> [Forwarded copy of Joe Zacky's reply of June 24, 2003, "Re: [Bobs-devel]
> Install Question", quoted in full; it appears unabridged elsewhere in
> this thread.] |
|
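The 'Admin Directory' failure above comes down to a share name containing a space. A hypothetical sanity check (the throwaway tree stands in for the real mounts path, /backup/bobsdata/current/process/mounts) that flags any share directory whose name contains a space:

```shell
# List share directories (server/share layout) whose names contain
# spaces; a throwaway tree stands in for the real BOBS mounts path.
MOUNTS=$(mktemp -d)
mkdir -p "$MOUNTS/main/Admin Directory" "$MOUNTS/main/admin"
find "$MOUNTS" -mindepth 2 -maxdepth 2 -type d -name '* *' -print
```

Only the space-containing 'Admin Directory' entry is printed; on a real install, anything this prints is a candidate for the rename James describes.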
From: James W. B. <jbe...@mi...> - 2003-06-24 17:47:35
|
Hi all:
Thanks for the response, but I was on a different PC at home and wouldn't
you know it, I deleted the responses thinking they were saved at work.
Sooooo, I went to the archives and they're not there yet, so do you mind
kindly forwarding them to me a second time? I promise to handle them
correctly this time..... Call me embarrassed.... (I really have too many
email accounts, but then again who doesn't...)
James
James W. Beauchamp, P.E.
2121 Newmarket Pkwy.
Suite 140
Marietta, GA 30067
phone - 770-690-9552 ext. 227
fax - 770-690-9529
www.gesinc.com |
|
From: Joe Z. <jz...@at...> - 2003-06-24 02:55:29
|
James W. Beauchamp wrote:
> Hi all:
> I've installed BOBS on a fresh RH 9.0 setup. It's running, but I'm not
> getting backups. On the Current tab in the upper right corner I have:
> 185718904.Preparation failed (mount)
> 130810167.Preparing backup
> 130810167.Preparation failed (mount)
>
> When I run the configuration check on the admin screen I get Pass for
> everything except this:
>
> Attempting to write, read and delete a file on
> /backup/bobsdata/current/process/mounts/main/Admin Directory FAIL
> Write file test failed with these messages:
>
> file already exists.
>
>
>
> (server name is 'main' and the directory mounted via NFS is 'Admin')
>
> The messages in the server logs show the successful authentication and
> mount and unmount of the directory as BOBS does it, so there isn't an
> error there.
> Do I have some kind of permission/user/ownership issue?
>
> Any help is appreciated. I'm sure it's a stupid mistake on my part since
> this appears to work for most people out of the box.
>
> James
>
>
> James W. Beauchamp, P.E.
>
>
That's a tough one. "Check Configuration" says it can mount and unmount
okay, but the mount command during the backup fails - that's where the
"Preparation failed (mount)" message comes from.
Let's start with the "file already exists." error in Check
Configuration. The test is to write a file named "bobswritetest.out" and
then delete it. See if that file exists on the 'main' computer, delete
it if it does. Then on the bobs computer, with the Admin nfs share
un-mounted, see if that file exists in
/backup/bobsdata/current/process/mounts/main/Admin and delete it if it
does. Then, copy the mount command from the Check Configuration page and
manually run it as root. Then 'touch
/backup/bobsdata/current/process/mounts/main/Admin/bobstest.out' and see
if you get an error. If the write is successful, delete the file and
unmount the share.
I notice your data directory is /backup/bobsdata instead of
/var/bobsdata. I assume you used './configure --with-bobsdata=/backup'.
That should be okay.
If you want to debug the actual backup, here's what you do (I hope I
don't leave out something).
As root:
* Comment out the check_loop line in /etc/crontab with a #.
* Kill the cmdloop process: 'kill $(pidof -x cmdloop)'
* Run the backup: /etc/cron.daily/backup.php
* cd /backup/bobsdata/current/process/cmd. You should have several
scripts sitting there waiting to run. They were created by backup.php.
* Run each script in numerical order with ./script_name. For the
bash scripts, if you want to see the commands being run you can
run them as 'sh -xv script_name'. With patience and careful
looking you should be able to spot the point of failure and see
what the error message is.
Please let me know how it goes.
Joe
|
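Joe's debug recipe condenses to a short sequence. A sketch only: step 1 is demonstrated on a throwaway crontab copy so it is safe to run anywhere (the /usr/local/sbin path in the demo line is a placeholder), while steps 2-4, shown as comments, need a live BOBS host with the paths from his message.

```shell
# 1. Comment out the check_loop watchdog line (on the real host, edit
#    /etc/crontab itself); a temp copy is used here for safety.
CRONTAB=$(mktemp)
printf '*/5 * * * * root /usr/local/sbin/check_loop\n' > "$CRONTAB"
sed -i.bak '/check_loop/s/^/#/' "$CRONTAB"
cat "$CRONTAB"                 # the line is now commented out
# 2. kill "$(pidof -x cmdloop)"          # stop the command loop
# 3. /etc/cron.daily/backup.php          # generate the step scripts
# 4. cd /backup/bobsdata/current/process/cmd
#    for s in $(ls | sort -n); do sh -xv "$s"; done   # run each, traced
```

Tracing each generated script with `sh -xv` is what exposes the exact failing mount command, per Joe's note.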
|
From: James W. B. <jbe...@mi...> - 2003-06-23 13:56:08
|
Hi all:
I've installed BOBS on a fresh RH 9.0 setup. It's running, but I'm not
getting backups. On the Current tab in the upper right corner I have:
185718904.Preparation failed (mount)
130810167.Preparing backup
130810167.Preparation failed (mount)
When I run the configuration check on the admin screen I get Pass for
everything except this:
Attempting to write, read and delete a file on
/backup/bobsdata/current/process/mounts/main/Admin Directory FAIL
Write file test failed with these messages:
file already exists.
(server name is 'main' and the directory mounted via NFS is 'Admin')
The messages in the server logs show the successful authentication,
mount, and unmount of the directory as BOBS does it, so there isn't an
error there.
Do I have some kind of permission/user/ownership issue?
Any help is appreciated. I'm sure it's a stupid mistake on my part,
since this appears to work for most people out of the box.
James
James W. Beauchamp, P.E.
2121 Newmarket Pkwy.
Suite 140
Marietta, GA 30067
phone - 770-690-9552 ext. 227
fax - 770-690-9529
www.gesinc.com
|
|
From: Joe Z. <jz...@at...> - 2003-06-11 01:48:59
|
D'Arcey Carroll wrote:
>This is my second try with this message. I've "successfully subscribed" 2 or
>3 times, but only 1 of my messages has made it to the list, and I never
>receive any.
>
>Please read the details and any associated ("duplicate") bugs related to
>RedHat Bugzilla #90036:
>
>http://bugzilla.redhat.com/bugzilla/show_bug.cgi?id=90036
>
>pointedly referenced in RedHat Bugzilla #82820:
>
>http://bugzilla.redhat.com/bugzilla/show_bug.cgi?id=82820
>
>
>It seems that there is no reliable workaround, but that you could
>theoretically launch the smbmount in the background (mount -t smbfs ... &),
>note the PID, and 'kill -TERM' the process later. Perhaps a 'umount' would
>properly kill it??? Based on the bug description, it seems as though the
>mounter should be functioning properly, it just cannot properly daemonize
>and return.
>
>Thoughts?
>
>DC
>
>
>
I read your related message on June 6th. I couldn't find the bug report
on bugzilla when I tried earlier; I'm glad you found it. It matches the
symptoms I've observed. When it hangs, there are two smbmount processes,
one forked from the other. I have to kill the child (the parent too? I
don't remember) and then smbmount terminates okay.
Trying to do that in bobs would likely be sloppy; I'd have to be careful
not to kill other smbmount processes. The right way is to wait for glibc
to be fixed, but I couldn't tell from the bug report when that will be.
Joe
PS - I'm CC'ing you directly in case you still have problems receiving
the list messages.
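As a rough sketch of what "being careful not to kill other smbmount processes" might look like, this hypothetical helper lists only smbmount processes whose parent is itself smbmount (the hung fork pair described above), leaving unrelated mounts alone. It assumes a Linux /proc filesystem and is not part of bobs.

```shell
#!/bin/sh
# List processes of a given name whose parent process has the same
# name -- the hung parent/child smbmount pattern from this thread.
find_hung_children() {
    name=$1
    for pid in $(pidof "$name" 2>/dev/null); do
        # Parent PID from /proc/<pid>/status.
        ppid=$(awk '/^PPid:/ {print $2}' "/proc/$pid/status")
        # Parent's process name, also from /proc.
        pname=$(awk '/^Name:/ {print $2}' "/proc/$ppid/status" 2>/dev/null)
        if [ "$pname" = "$name" ]; then
            echo "$pid"    # candidate to 'kill -TERM' by hand
        fi
    done
}

find_hung_children smbmount
```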
|
|
From: D'Arcey C. <dlc...@be...> - 2003-06-10 11:23:11
|
This is my second try with this message. I've "successfully subscribed" 2 or
3 times, but only 1 of my messages has made it to the list, and I never
receive any.
Please read the details and any associated ("duplicate") bugs related to
RedHat Bugzilla #90036:
http://bugzilla.redhat.com/bugzilla/show_bug.cgi?id=90036
pointedly referenced in RedHat Bugzilla #82820:
http://bugzilla.redhat.com/bugzilla/show_bug.cgi?id=82820
It seems that there is no reliable workaround, but that you could
theoretically launch the smbmount in the background (mount -t smbfs ... &),
note the PID, and 'kill -TERM' the process later. Perhaps a 'umount' would
properly kill it??? Based on the bug description, it seems as though the
mounter should be functioning properly, it just cannot properly daemonize
and return.
Thoughts?
DC
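A minimal sketch of the theoretical workaround above, with placeholder server, share, and mount-point names. The 5-second wait is an arbitrary guess, and whether a plain umount alone reaps the stuck helper remains the open question from the thread.

```shell
#!/bin/sh
# Untested sketch: launch smbmount in the background, note the PID,
# and TERM the process if it never returns (the glibc fork() bug).
# //server/share and /mnt/backup are placeholders.

mount -t smbfs //server/share /mnt/backup &
MOUNT_PID=$!

sleep 5   # give the mount time to actually complete

# If the mounter is still hanging around, terminate it explicitly.
if kill -0 "$MOUNT_PID" 2>/dev/null; then
    kill -TERM "$MOUNT_PID"
fi

# Perhaps a plain umount would also reap the stuck helper:
# umount /mnt/backup
```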
|
|
From: Joe Z. <jz...@at...> - 2003-06-07 01:05:56
|
D'Arcey Carroll wrote:
>Please see RedHat Bugzilla Bug #90036
>
>http://bugzilla.redhat.com/bugzilla/show_bug.cgi?id=90036
>
>The essence is a race condition in glibc where fork() is not signal-safe.
>
Good job tracking that down. Those are my symptoms exactly.
By the way, I'm planning to package release 0.6.0 this weekend.
Joe
|
|
From: D'Arcey C. <dlc...@be...> - 2003-06-06 20:27:33
|
Please see RedHat Bugzilla Bug #90036:

http://bugzilla.redhat.com/bugzilla/show_bug.cgi?id=90036

The essence is a race condition in glibc where fork() is not signal-safe.
|
|
From: Rene R. <re...@gr...> - 2003-05-29 14:52:33
|
Don't trust the dates the files have on the filesystem; they are not
very useful. BOBS adds a datestamp to the filename to keep track of
dates. The dates of the files in "current" should reflect the dates of
the same files on your server. Try looking at the actual file contents
and you will see what I mean.
"Current" is always a mirror of your server. "Incremental" is storage
for changes in the mirror.
Cheers
Rene
On Wed, 2003-05-28 at 13:50, Arnaud CONCHON wrote:
> > On Wed, 2003-05-21 at 09:44, Arnaud CONCHON wrote:
> > > Hi,
> > > I'm new to using this piece of software and there's a question
> > > I was asking myself. Is it possible to define a day of the week
> > > to do a complete backup, whereas on the other days it will only
> > > do incremental backups?
> >
> > You will have to keep a "current" backup at all times. That means
> > that you will need as much disk space for the "current" backup as
> > you have files on your server.
> > If you have 100 GB worth of files you are making backups of, you
> > will also need at least 100 GB of disk space for the "current"
> > backup. You need more space to keep incremental backups.
> >
> > "Current" is always a snapshot of the files at the time of backup.
> >
> > You cannot make incremental backups without keeping "current" up
> > to date. "Incremental" is nothing but the changes that happen to
> > files in "current".
>
> In my backups, Current is always the first backup, and incremental
> holds the new versions of the files that are in Current. That means
> that if I restore the Current backup, I won't have the last versions
> of the files. Wouldn't it be more interesting if the Current backup
> always held the last version of every file, and incremental the old
> versions of those files (with no file if there is no change)?
>
> To make things clearer, I have for example in the current backup a
> file:
>
> file1.exe 10 May 2003
>
> and in incremental:
>
> file1.exe 12 May 2003
> file1.exe 15 May 2003
>
> Why don't we have instead:
>
> Current:
> file1.exe 15 May 2003
>
> Incremental:
> file1.exe 10 May 2003
> file1.exe 12 May 2003
>
> That would make the Current backup always the LAST versions, and
> incremental the past versions of the files. The reason behind this
> is that if one day one of my servers crashes and I want to restore
> all the files by copying the Current folder to the new server, I
> don't want to have to overwrite all the files with the incremental
> files afterwards; I should only have to copy the current files, and
> in case I need an older file, I can have a look in the incremental
> files.
>
> What do you think about this?
>
> Best Regards.
-- Rene Rask <re...@gr...>
|
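The mirror-plus-changes model Rene describes can be illustrated with a toy shell function: before the mirror is refreshed, the superseded copy is moved aside under a datestamped name. The directory names and datestamp suffix here are invented for illustration and are not BOBS's actual on-disk layout.

```shell
#!/bin/sh
# Toy illustration of a mirror ("current") plus datestamped old
# versions ("incremental"). Not BOBS's real layout or naming.

backup_file() {
    src=$1                        # path of the file on the server
    stamp=$2                      # datestamp for this backup run
    cur="current/$src"            # the mirrored copy
    inc="incremental/$src.$stamp" # where the old version goes

    # If the file changed, preserve the superseded version first.
    if [ -e "$cur" ] && ! cmp -s "$src" "$cur"; then
        mkdir -p "$(dirname "$inc")"
        cp "$cur" "$inc"
    fi
    # Then refresh the mirror so "current" always matches the server.
    mkdir -p "$(dirname "$cur")"
    cp "$src" "$cur"
}
```

With this scheme a bare restore only needs the "current" tree, and older versions can be fished out of "incremental" on demand, which is exactly the behavior Arnaud was asking for.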
|
From: Arnaud C. <arn...@ar...> - 2003-05-28 11:51:05
|
> On Wed, 2003-05-21 at 09:44, Arnaud CONCHON wrote:
> > Hi,
> > I'm new to using this piece of software and there's a question
> > I was asking myself. Is it possible to define a day of the week
> > to do a complete backup, whereas on the other days it will only
> > do incremental backups?
>
> You will have to keep a "current" backup at all times. That means that
> you will need as much disk space for the "current" backup as you have
> files on your server.
> If you have 100 GB worth of files you are making backups of, you will
> also need at least 100 GB of disk space for the "current" backup. You
> need more space to keep incremental backups.
>
> "Current" is always a snapshot of the files at the time of backup.
>
> You cannot make incremental backups without keeping "current" up to
> date. "Incremental" is nothing but the changes that happen to files in
> "current".

In my backups, Current is always the first backup, and incremental
holds the new versions of the files that are in Current. That means
that if I restore the Current backup, I won't have the last versions of
the files. Wouldn't it be more interesting if the Current backup always
held the last version of every file, and incremental the old versions
of those files (with no file if there is no change)?

To make things clearer, I have for example in the current backup a
file:

file1.exe 10 May 2003

and in incremental:

file1.exe 12 May 2003
file1.exe 15 May 2003

Why don't we have instead:

Current:
file1.exe 15 May 2003

Incremental:
file1.exe 10 May 2003
file1.exe 12 May 2003

That would make the Current backup always the LAST versions, and
incremental the past versions of the files. The reason behind this is
that if one day one of my servers crashes and I want to restore all the
files by copying the Current folder to the new server, I don't want to
have to overwrite all the files with the incremental files afterwards;
I should only have to copy the current files, and in case I need an
older file, I can have a look in the incremental files.

What do you think about this?

Best Regards.
|
|
From: Rene R. <re...@gr...> - 2003-05-27 07:52:28
|
Hmm. There is a warning on the search page; I'll have a look at
removing it. The search function works, so it isn't a showstopper.
Rene
On Mon, 2003-05-26 at 19:43, Joe Zacky wrote:
> Rene Rask wrote:
>
> >My backup server is now running 0.6.0pre2 (upgraded from 0.5.1) and
> >I'll keep an eye on it to see if any bugs crop up.
> >I fixed a couple of bugs and committed them to cvs.
> >
> >Joe, do you have anything you want to add or fix before releasing
> >0.6.0?
> >
> >I think I'll work on automatic file deletion for the 0.7.0 release.
> >
> >Cheers
> >
> I think the next important thing I'd like to work on is the option of
> logging backup and restore to log files and optionally emailing backup
> logs to the admin. But these aren't bugs, and I'm not aware of any
> major bugs, so I'm cool with releasing 0.6.0.
>
> Joe
>
> -------------------------------------------------------
> This SF.net email is sponsored by: ObjectStore.
> If flattening out C++ or Java code to make your application fit in a
> relational database is painful, don't do it! Check out ObjectStore.
> Now part of Progress Software. http://www.objectstore.net/sourceforge
> _______________________________________________
> Bobs-devel mailing list
> Bob...@li...
> https://lists.sourceforge.net/lists/listinfo/bobs-devel
-- Rene Rask <re...@gr...>
|
|
From: Rene R. <re...@gr...> - 2003-05-27 07:46:53
|
The first backup run seems to have passed with no problems, so I'm
still happy. I'll leave the release date of 0.6.0 to you. If I find
bugs we can fix them in a 0.6.1 release.
Cheers
Rene
On Mon, 2003-05-26 at 19:43, Joe Zacky wrote:
> Rene Rask wrote:
>
> >My backup server is now running 0.6.0pre2 (upgraded from 0.5.1) and
> >I'll keep an eye on it to see if any bugs crop up.
> >I fixed a couple of bugs and committed them to cvs.
> >
> >Joe, do you have anything you want to add or fix before releasing
> >0.6.0?
> >
> >I think I'll work on automatic file deletion for the 0.7.0 release.
> >
> >Cheers
> >
> I think the next important thing I'd like to work on is the option of
> logging backup and restore to log files and optionally emailing backup
> logs to the admin. But these aren't bugs, and I'm not aware of any
> major bugs, so I'm cool with releasing 0.6.0.
>
> Joe
-- Rene Rask <re...@gr...>
|
|
From: Joe Z. <jz...@at...> - 2003-05-26 17:43:56
|
Rene Rask wrote:

>My backup server is now running 0.6.0pre2 (upgraded from 0.5.1) and
>I'll keep an eye on it to see if any bugs crop up.
>I fixed a couple of bugs and committed them to cvs.
>
>Joe, do you have anything you want to add or fix before releasing
>0.6.0?
>
>I think I'll work on automatic file deletion for the 0.7.0 release.
>
>Cheers
>
I think the next important thing I'd like to work on is the option of
logging backup and restore to log files and optionally emailing backup
logs to the admin. But these aren't bugs, and I'm not aware of any
major bugs, so I'm cool with releasing 0.6.0.
Joe
|