From: Mike W. <txt...@gm...> - 2011-01-31 05:48:13
I recently got a GEDCOM file that belonged to my cousin. Her son made it for me from her genealogy database. After her death he wanted me to maintain the file to help preserve the family tree. My question concerns importing her file into my database. We have shared data for a number of years, so some of the data will be duplicated. What is the best way to handle the duplicate entries? I tried the Find Possible Duplicate People tool, but it didn't seem very accurate.

--
Mike White
Lorena, Texas
From: Peter L. <pet...@te...> - 2011-01-31 09:16:54
On Monday 31 January 2011 06.48.03, Mike White wrote:
> [...] We have shared data for a number of years so some of the data
> will be duplicated. What is the best way to handle the duplicate
> entries? I tried the find possible duplicate people tool, but it
> didn't seem very accurate.

What version of Gramps do you use? There was an error in 3.2.5 that missed obvious duplicates. This has been fixed now.

/Peter
From: Mike W. <txt...@gm...> - 2011-01-31 17:36:51
On 01/31/2011 03:19 AM, Peter Landgren wrote:
> What version of Gramps do you use?
> There was an error in 3.2.5 that missed obvious duplicates.
> This has been fixed now.

I'm using version 3.2.3.1. When I ran the tool it found many duplicates that were not duplicates. Some were pretty far off. Some were close.

--
Mike White
Lorena, Texas
From: Peter L. <pet...@te...> - 2011-01-31 17:47:33
On Monday 31 January 2011 18.36.36, Mike White wrote:
> I'm using version 3.2.3.1. When I ran the tool it found many duplicates
> that were not duplicates. Some were pretty far off. Some were close.

OK. Try to upgrade to 3.2.5. I can then send you the updated program, FindDupes.py, so that you can replace it yourself.

/Peter
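As background on why a duplicate finder can flag pairs that are "pretty far off": tools of this kind commonly build candidate pairs by comparing a phonetic code of the surname (such as Soundex), which is deliberately loose so that spelling variants are not missed. The sketch below illustrates that general technique only; it is not the actual FindDupes.py code from Gramps, and the names in it are made up.

```python
from itertools import combinations

def soundex(name):
    """American Soundex: the first letter plus up to three digits."""
    codes = {**dict.fromkeys("bfpv", "1"), **dict.fromkeys("cgjkqsxz", "2"),
             **dict.fromkeys("dt", "3"), "l": "4",
             **dict.fromkeys("mn", "5"), "r": "6"}
    name = name.lower()
    if not name:
        return ""
    digits = []
    prev = codes.get(name[0], "")
    for ch in name[1:]:
        code = codes.get(ch, "")
        if code and code != prev:
            digits.append(code)
        if ch not in "hw":          # 'h' and 'w' do not reset the previous code
            prev = code
    return (name[0].upper() + "".join(digits) + "000")[:4]

# Hypothetical records standing in for two merged GEDCOM imports.
people = ["Mike White", "Michael Whyte", "Peter Landgren", "Mary Wheat"]

def candidate_duplicates(people):
    """Pair up people whose surnames share a Soundex code."""
    return [(a, b) for a, b in combinations(people, 2)
            if soundex(a.split()[-1]) == soundex(b.split()[-1])]
```

Here "White", "Whyte", and even "Wheat" all code to W300, so "Mary Wheat" is paired with "Mike White"; that kind of coarse phonetic match is one reason such tools only *suggest* candidates, which must then be scored or reviewed against given names, dates, and places before merging.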