slowness of FTM 2014 on large reports

Posted: 17 Oct 2013 12:36PM GMT
Classification: Query
Edited: 17 Oct 2013 2:36PM GMT
Still Testing FTM 2014

Using XP Professional, 32-bit, 3.5 GB of RAM

Have been reluctant to test large reports, because FTM 2012 was very slow and some reports never ran to completion

Just ran a Register report for 8 generations with source attributes. The report was very heavily loaded with data and had no images

Took over 45 minutes to run

Did run to completion

It appears the system is still very slow, and no faster on these large reports than FTM 2012 was

I was going to test it on a fully data-loaded 36-generation Ahnentafel report ---
but don't think I will do it now

Has anyone else had any experience with large reports on FTM 2014 on 32 bit or 64 bit machines?




Re: slowness of FTM 2014 on large reports

Posted: 18 Oct 2013 12:51AM GMT
Classification: Query
Edited: 18 Oct 2013 3:47AM GMT
From my testing, FTM 2014 (64-bit) runs about 50% faster than FTM 2014 (32-bit) on the same file and the same computer.

I use a 27" iMac 64-bit with Windows 7 Pro (64-bit) running under Parallels 7 with 16 GB of RAM on each side so I can run very interesting comparisons.

The early betas of FTM 2014 were 32-bit, so I was able to run the comparisons on the same test file and same computer and record the results on a genealogy report, a 3x3 EFT and large kinship report.

FTM 2014 32-bit is no faster than FTM 2012 32-bit

In my case, I am using 16 GB of RAM, 3.2GHz i5 processor and a conventional hard drive. It is faster again with a solid state drive.

File size around 7,500 people

Note that running 64-bit and less than 4GB of RAM won't show any real improvement.

32-bit can only address 4GB of RAM, and typically in 32-bit Win XP, Win 7 etc, about 1GB of RAM is needed to run Windows and other functions, so on your set-up you only have about 2.5GB of RAM to run FTM. I think this is far too low for a 'large file'.

64-bit can comfortably use 100GB of RAM or more.

Makes a big difference to databases like FTM
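To put rough numbers on the 32-bit limit John describes, here's a quick back-of-the-envelope sketch in Python (the ~1 GB Windows overhead is the estimate from the post above, not an exact figure):

```python
# A 32-bit process can theoretically address 2^32 bytes.
ADDRESSABLE = 2**32  # bytes
print(ADDRESSABLE / 2**30)  # 4.0 (GiB)

# Rough figures from the post: Windows and other functions take about 1 GB,
# and the original poster's machine has 3.5 GB installed, leaving roughly:
installed_gb = 3.5
windows_overhead_gb = 1.0
print(installed_gb - windows_overhead_gb)  # 2.5 GB left for FTM
```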


Also FTM (Win) is very Windows-dependent, i.e. it uses a lot of Windows functions via .NET. If your Windows is a bit 'suss', things will be even slower.

FTM Mac3 (in beta) 64-bit really flies compared with the Windows 2014 64-bit equivalent as it doesn't have to deal with .NET.

About 50% or more faster again than 2014 64-bit, and I can get it to load in 2 to 3 seconds.

So maybe consider upgrading to a 64-bit Win 8.1 computer with at least 8GB of RAM and a fast processor like an i7 and also a SSD or combo SSD and conventional drive.

Or maybe just buy a good iMac? :-)



John D

Re: slowness of FTM 2014 on large reports

Posted: 18 Oct 2013 2:08AM GMT
Classification: Query
Unfortunately, FTM 2014 64-bit crashes during a file merge after using only 3.5 to 4 GB of RAM (on an i7 machine with 16GB RAM installed). The 64-bit version is better than 2012, though, in its use of RAM.


Regards,

Re: slowness of FTM 2014 on large reports

Posted: 18 Oct 2013 9:57PM GMT
Classification: Query
johndd189 wrote:
"32-bit can only address 4GB of RAM, and typically in 32-bit Win XP, Win 7 etc, about 1GB of RAM is needed to run Windows and other functions, so on your set-up you only have about 2.5GB of RAM to run FTM. I think this is far too low for a 'large file'."

That's crazy. I run instances of MSSQL and MySQL with much larger data sets (100s of tables with 100,000s of rows) and these servers don't eat up memory like that when running queries and functions. Even my video games don't need resources like that. I don't understand why FTM cannot be optimized to handle this data more efficiently.

Re: slowness of FTM 2014 on large reports

Posted: 18 Oct 2013 10:00PM GMT
Classification: Query
FTM is doing too much in RAM (for speed reasons?). I've programmed in FoxPro, and the 1994 version could handle datasets much larger than FTM can. TMG (The Master Genealogist) is based on FoxPro.

Regards,

Re: slowness of FTM 2014 on large reports

Posted: 19 Oct 2013 2:42AM GMT
Classification: Query
Crazy or not, that is how it is

It works best in 64-bit with around 8GB of RAM and a fast processor

It works far better as a Mac version

I don't work for Ancestry.com; this is purely my experience


John D

Re: slowness of FTM 2014 on large reports

Posted: 19 Oct 2013 7:57AM GMT
Classification: Query
Edited: 19 Oct 2013 8:03AM GMT
I think Johndd and Marco are talking about two different things

I think John is talking about what FTM 2014 presently will do [if you upgrade your computer with all the features he describes, but if you don't upgrade it will still be deficient]

I think Marco is talking about what it should be able to do [without all the upgrades], but can't because of "the way it was programmed".

I understand both

I wish it were like what Marco is saying, but unfortunately I think it is still deficient
I think they have made improvements since FTM 2008, but there seems to be a lot of things they still could do

I will probably have to upgrade to a new 64 bit computer, with Windows 8.1, with lots more memory and much faster/better processors.

I am thinking I will also add a solid state drive, either 256 GB or 400+ (?) GB

My understanding is that a solid state drive will significantly decrease the load time for FTM 2014, but from what I have read it won't help the speed of processing once it is loaded. I am not sure whether that is true. Does anybody know for sure (please, no guessing)? I would think that if I got a 400+ GB solid state drive it should also help speed up the processing time

Re: slowness of FTM 2014 on large reports

Posted: 19 Oct 2013 8:51AM GMT
Classification: Query
FTM is a database

Whilst a small database like an Excel file can be fully loaded from the hard drive into memory, large databases like FTM and MS Access etc may well be larger than the amount of memory available.

So sometimes when running a large database, the program occasionally has to load information from the hard drive and save other data whilst operating.

Now conventional platter hard drives, good as they are, are many times slower than solid state drives (SSDs) and RAM.

It follows therefore that an SSD can improve the speed of overall operation where the program is swapping material in and out of the hard drive.

I understand that now you can also get so-called combo drives with a mixture of an SSD and a conventional hard drive. I am sure there are lots of references on Google if you are interested.

The logic is such that data needed for a particular program is moved from the conventional hard drive to the SSD, as the SSD is often much smaller than a regular hard drive.

I guess one day very soon regular hard drives will disappear and we will use large SSDs??

So yes, an SSD can improve overall speed, especially with very large files with lots of media etc.

But RAM is much more critical, followed by 64-bit, followed by a fast processor.
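The hierarchy described here (RAM faster than SSD, SSD faster than a platter drive) is easy to observe on any machine. A minimal Python sketch comparing an in-memory copy against a read from whatever drive holds your temp directory; absolute timings will vary wildly by hardware, and the OS file cache can make the disk read look faster than a cold read really is:

```python
import os
import tempfile
import time

SIZE = 32 * 1024 * 1024  # 32 MiB test payload

def time_memory_copy(data: bytes) -> float:
    """Time copying the payload purely within RAM."""
    start = time.perf_counter()
    _ = bytes(data)  # forces a full pass over the buffer in memory
    return time.perf_counter() - start

def time_disk_read(path: str) -> float:
    """Time reading the payload back from disk (cache effects included)."""
    start = time.perf_counter()
    with open(path, "rb") as f:
        _ = f.read()
    return time.perf_counter() - start

data = os.urandom(SIZE)
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(data)
    path = f.name

mem_s = time_memory_copy(data)
disk_s = time_disk_read(path)
os.unlink(path)
print(f"RAM copy: {mem_s * 1000:.1f} ms, disk read: {disk_s * 1000:.1f} ms")
```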

And the Mac version of FTM is much faster than the Win version so your choice of OS is also part of the route to improved performance.

John D

Re: slowness of FTM 2014 on large reports

Posted: 19 Oct 2013 2:44PM GMT
Classification: Query
Edited: 19 Oct 2013 3:41PM GMT
KathyMarieAnn sized it up correctly. I'm saying that FTM can and should be optimized so that its user base doesn't require better-than-gaming-rigs just to handle trees that contain thousands of people. FTM is dealing with text in tables. It isn't rendering 3D animation.

Access is a baby database. Once rows in its tables approach 100k its performance turns to poo. Most businesses in this situation switch to a real database at that point. If anything, they rewrite their database-driven applications to handle the data piecemeal so that performance remains acceptable.
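The "handle the data piecemeal" approach can be sketched with SQLite; the `person` table and its schema here are made up for illustration (the thread doesn't say what database FTM actually uses). The point is streaming rows in fixed-size chunks with `fetchmany` instead of pulling the whole result set into RAM with `fetchall`:

```python
import sqlite3

# Build a small throwaway database standing in for a genealogy file.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE person (id INTEGER PRIMARY KEY, name TEXT)")
conn.executemany(
    "INSERT INTO person (name) VALUES (?)",
    [(f"Person {i}",) for i in range(10_000)],
)

def iter_people(conn, chunk_size=500):
    """Stream rows in fixed-size chunks; memory use stays bounded by chunk_size."""
    cur = conn.execute("SELECT id, name FROM person ORDER BY id")
    while True:
        rows = cur.fetchmany(chunk_size)
        if not rows:
            break
        yield from rows

count = sum(1 for _ in iter_people(conn))
print(count)  # 10000
```

A report generator written this way only ever holds one chunk of people in memory, which is why piecemeal processing keeps performance acceptable as the tree grows.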

FTM is also a database-driven application. I don't know what database it uses. It doesn't really matter. The point is that it's highly inefficient and does not scale. In fact, I don't believe that scalability even factors into its development. It's hard for me to imagine that it's even tested with anything other than small data sets, because how else would you explain its performance version after version?

I appreciate what you're saying here, John. Your observations are helpful to the community to help people get a handle on dealing with the shortcomings of the software. I'm just saying that someone on that development team should write some functions that enable FTM to create reports without requiring gigs of RAM.

Re: slowness of FTM 2014 on large reports

Posted: 19 Oct 2013 4:11PM GMT
Classification: Query
One more try in understanding speed and solid state drive

Let us suppose this

1- I don’t want to buy a Mac
2- I want to upgrade to a new 64 bit, windows 8.1 PC
3- Assume I can buy a PC with 1TB of solid state drive
4- Assume I can buy a PC with 1TB of additional memory
5- Assume I buy a PC with fastest processors available on the market
6- Assume I buy a hard disk with 1TB storage capacity
7- Assume the only program I want to run on this PC is FTM 2014
8- Assume I have no images loaded on the PC and I won’t be linking to anything on the PC except what is included/entered into the FTM 2014 data base
9-Assume I won’t be linking to any images/documents or anything on the internet

Now I am quite sure the program will load faster than it does on my XP 32 bit machine

I am also sure the program once loaded will run faster than it did when I ran it on my XP 32 bit machine

But here is my question/comment/statement

Since I have 1TB of solid state disk it appears to me that it won’t be swapping out to either the 1TB of additional memory, nor will it be swapping out to the 1TB of Hard disk*. Hence, it will be running as fast as it can on the processors I purchased. [* it won’t be swapping out because it has no need to since it won’t be using all the capacity of the solid state drive (Note this is an assumption on my part)]

So, if the above is true I should be buying the largest solid state disk I can buy to limit/stop the PC from swapping out to the additional memory or even worse swapping out to the rotating disk.

And if I can’t buy a large enough solid state drive I should be buying as much additional memory as I can so it doesn’t swap out to the moving disk

And if I can’t buy a large enough solid state drive and it has to swap out to the additional memory then the processing speed will DECREASE once it has to start swapping to the additional memory [And even worse if it has to start swapping to the rotating disk the speed will further DECREASE]

The question is: Is everything I say in the question/comment/statement true?




