I couldn’t find a plug-in to do it for me, so I have taken a different approach: clean up the GEDCOM file before presenting it to FH. I have dabbled with using Microsoft Excel macros to produce customised reports from raw GEDCOM files, so that seemed like a possible approach here. I couldn’t find much (any?) discussion of this on the FHUG Forums, but it was worth a try.
After a few late nights (!) I now have a prototype that seems to work as planned. While doing this, I have also taken the opportunity to tidy up some other differences in GEDCOM format between FTM and FH that were cluttering up the import log.
- Moved all top-level data from census records to NOTE tags (e.g. “1 CENS data record” becomes “1 CENS” plus “2 NOTE data record”), as well as removing duplicated sources and changing citations accordingly (see the first sketch after this list).
- Ditto for other facts, such as death and burial. There is an issue with how this is handled for facts that already have notes recorded, but left to its own devices FH doesn’t seem to do a particularly neat job anyway (the new and old notes are both kept, but not readily visible together).
- Converted custom _FOOT tags to NOTEs, as these were causing duplicated information to appear in report source citations (the tag-rename sketch below covers this).
- Converted FTM’s @Mnnn@ multimedia record identifiers to FH’s @Onnn@ form, changed the multimedia tags to those supported by FH, and corrected invalid GEDCOM date/time data (the same sketch below handles the identifier rename).
- Only later versions of FTM appended multimedia captions automatically, so my reports were a mix of captioned and uncaptioned images, which looked horrible. My file names were fairly structured (e.g. “Baptism – 1835 – John Smith – Chelsea MDX.jpg”), so I used the filename without its extension as the default caption, to match what later versions of FTM were doing (see the caption sketch below).
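For anyone curious, here is roughly how the macros work. First, a minimal VBA sketch of the census step, streaming the GEDCOM file line by line; the paths and procedure name are placeholders of my own, and a real version would also need to split long note values across CONC/CONT lines:

```vba
' Move the payload of any level-1 CENS line down to a level-2 NOTE line.
Sub DemoteCensData()
    Dim inFile As Integer, outFile As Integer
    Dim lineText As String, payload As String

    inFile = FreeFile
    Open "C:\ged\export.ged" For Input As #inFile    ' placeholder path
    outFile = FreeFile
    Open "C:\ged\cleaned.ged" For Output As #outFile ' placeholder path

    Do While Not EOF(inFile)
        Line Input #inFile, lineText
        If Left$(lineText, 7) = "1 CENS " Then
            payload = Mid$(lineText, 8)
            Print #outFile, "1 CENS"            ' keep the bare fact line
            Print #outFile, "2 NOTE " & payload ' re-attach the data as a note
        Else
            Print #outFile, lineText
        End If
    Loop

    Close #inFile
    Close #outFile
End Sub
```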
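The _FOOT and @Mnnn@ conversions are simple line-level rewrites. This sketch uses VBScript regular expressions via late binding; the messier date/time corrections are not shown:

```vba
' Rewrite one GEDCOM line: custom _FOOT tags become NOTE, and multimedia
' cross-references @M123@ become @O123@ (the pointer pattern also catches
' the "0 @M123@ OBJE" record header itself).
Function FixTags(ByVal lineText As String) As String
    Static re As Object
    If re Is Nothing Then
        Set re = CreateObject("VBScript.RegExp")
        re.Global = True
    End If

    re.Pattern = "^(\d+) _FOOT"
    lineText = re.Replace(lineText, "$1 NOTE")

    re.Pattern = "@M(\d+)@"
    lineText = re.Replace(lineText, "@O$1@")

    FixTags = lineText
End Function
```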
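And the default caption is just the file name with the directory and extension stripped. I’m assuming here that the result belongs in a TITL line under the multimedia record, which I believe is where a GEDCOM media title normally lives; it’s worth checking how your own version of FH stores captions before relying on that:

```vba
' Derive a default caption from a media file path, e.g.
' "C:\photos\Baptism - 1835 - John Smith - Chelsea MDX.jpg"
'   -> "Baptism - 1835 - John Smith - Chelsea MDX"
Function CaptionFromFile(ByVal filePath As String) As String
    Dim baseName As String
    Dim dotPos As Long

    ' strip any directory part (either separator style)
    baseName = Mid$(filePath, InStrRev(filePath, "\") + 1)
    baseName = Mid$(baseName, InStrRev(baseName, "/") + 1)

    ' strip the extension, if there is one
    dotPos = InStrRev(baseName, ".")
    If dotPos > 0 Then baseName = Left$(baseName, dotPos - 1)

    CaptionFromFile = baseName
End Function
```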
Before going too far, I’d like to ask whether anybody else has experience of a similar approach. Are there any traps for the unwary that I’m about to fall into (apart, of course, from the need to audit both the macro code and the output data carefully, to make sure they are really doing what I think they are doing)?
Thanks. Looking forward to getting more familiar with FH!