[reportlab-users] Platypus tables with large numbers of rows

Andy Robinson andy at reportlab.com
Fri Apr 13 02:41:30 EDT 2007


On 13/04/07, Mike Kent <mrmakent at cox.net> wrote:

> A Platypus table requires that you give it a list of all of the rows in
> the table. That means that all of the data for all of the rows in the
> table has to be in memory at the same time. From experimentation, it
> appears that a table will not accept an iterator for a sequence of rows,
> which would allow me to do lazy evaluation of the rows; it must be an
> actual list.
>
> Am I right about this? Is there no way of getting a table to use lazy
> evaluation for the rows?


There wouldn't be much point because we construct the entire PDF file
in memory. This allows us to handle cross-references and other things
like "Page 2 of 128". The storage for the table cells probably isn't
the issue. If you want to find out, you could try modifying the same
report to output comma-delimited lines of text and see what happens to
the memory usage.
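
For instance, here is an untested sketch of that comparison; get_rows()
is a hypothetical stand-in for whatever actually produces your report
data:

    import csv

    def get_rows():
        # Hypothetical row source standing in for the real report data.
        for i in range(100000):
            yield [i, "name-%d" % i, "value-%d" % i]

    # Stream the rows straight to disk; the writer holds nothing back,
    # so any memory growth you see comes from your data source, not
    # from building a PDF.
    with open("report.csv", "w", newline="") as f:
        writer = csv.writer(f)
        for row in get_rows():
            writer.writerow(row)

If memory climbs just as fast here, the table isn't your problem.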

I would always recommend that you break such jobs down into a lot of
little tables anyway - one per row, or one per group of rows. A Table
does some work to examine several rows, auto-size the columns and
handle splitting over pages, so it may also go faster if you break it
down.
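
Roughly like this - an untested sketch, where get_rows(), chunked() and
the fixed column widths are all illustrative assumptions rather than
anything the library dictates:

    from itertools import islice

    from reportlab.lib.pagesizes import letter
    from reportlab.platypus import SimpleDocTemplate, Table

    def get_rows():
        # Hypothetical lazy row source; substitute your own query/iterator.
        for i in range(10000):
            yield [str(i), "name-%d" % i, "value-%d" % i]

    def chunked(iterable, size):
        # Yield successive lists of up to `size` rows.
        it = iter(iterable)
        while True:
            chunk = list(islice(it, size))
            if not chunk:
                break
            yield chunk

    story = []
    col_widths = [60, 120, 120]  # fixed widths spare each table the auto-sizing pass
    for chunk in chunked(get_rows(), 50):
        # One small Table per chunk instead of one enormous one.
        story.append(Table(chunk, colWidths=col_widths))

    SimpleDocTemplate("report.pdf", pagesize=letter).build(story)

You still end up with the whole story in memory before build(), for the
reason above, but each little table is cheap to measure and split.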

We used to have a problem with computation time for very large tables,
but I believe that was fixed a while ago. These days I can usually
create a book-sized document in memory on most machines easily enough,
and I have never met a printer who wanted more than about 500 pages in
one PDF file.

- Andy

