[reportlab-users] IOError: Too many open files

Marius Gedminas marius at gedmin.as
Thu Nov 21 02:40:39 EST 2013


On Thu, Nov 21, 2013 at 09:27:19AM +0200, Marius Gedminas wrote:

> On Wed, Nov 20, 2013 at 05:01:48PM -0600, Mike Driscoll wrote:

> > I have a script that calls my report_maker.py script, which uses Reportlab.

> > In that script, I open several JPGs, which are logos that go on the report.

> > This report script is called for each payment in a payment file. We

> > recently ran into an issue with a file of over 650 payments: when we hit

> > payment 507 or 508, we get the following traceback:

> >

> >

> > Traceback (most recent call last):
> >   File "Phaze03_local.py", line 512, in main
> >   File "/home/somebody/report_maker.py", line 836, in CLVCP_clear
> >   File "/home/somebody/report_maker.py", line 598, in createDocument
> >   File "/usr/lib64/python2.6/site-packages/reportlab/platypus/flowables.py", line 329, in __init__
> >   File "/usr/lib64/python2.6/site-packages/reportlab/lib/utils.py", line 452, in open_for_read
> >

> > The section of code in question appears to be:

> >

> > img = utils.ImageReader(img_path)

> >

> > However, I tried doing an "img.fp.close()", which appears to help, but I

> > also use Image from platypus, which has the same issue.

> >

> > I tried changing tactics by reading the file into a StringIO and passing

> > that to Image, but that just moves the issue to where I read the file. For

> > those curious, I do this:

> >

> > with open(self.logo_path, "rb") as logo_fh:
> >     self.logo_path = StringIO(logo_fh.read())

> >

> > Anyway, now I get the same basic error, but at this line instead.

>

> This is the line that hits the limit, but not necessarily the line that

> leaks open file descriptors.

>

> Can you wrap the code in a try/except: import pdb; pdb.post_mortem() and

> then check what files are open by running 'lsof -p $pid' or

> 'ls -l /proc/$pid/fd' in a shell?
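
Something like this is what I had in mind (a rough sketch; process_payments()
is just a stand-in for whatever your per-payment loop actually does):

import pdb

try:
    process_payments()          # placeholder for your real loop
except (IOError, OSError):
    # Drops into the debugger at the frame that raised "Too many open
    # files".  While sitting at the (Pdb) prompt, run 'lsof -p <pid>' or
    # 'ls -l /proc/<pid>/fd' from another shell to see which files are
    # still open.
    pdb.post_mortem()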


Also, objgraph might be helpful to identify what is keeping Python file
object instances in memory (the file descriptors would get closed if
these were garbage-collected): https://pypi.python.org/pypi/objgraph

Something like https://mg.pov.lt/objgraph/#memory-leak-example, i.e.
find a random open file, then see what keeps it in memory.

Unfortunately Python's garbage collector doesn't track file objects, so
objgraph.by_type('file') will not work[1], but you can do

import gc
all_file_instances = [r for o in gc.get_objects()
                      for r in gc.get_referents(o) if type(r) is file]

and then call objgraph.show_backrefs() on a few of them to see if a
pattern becomes apparent.
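
If it helps, something along these lines is what I'd try (assuming you have
graphviz installed so objgraph can render the graphs; the filenames are
made up):

import random
import objgraph

# Look at a few of the leaked file objects and dump the chain of
# references keeping each of them alive.
for f in random.sample(all_file_instances, min(3, len(all_file_instances))):
    objgraph.show_backrefs([f], max_depth=5,
                           filename='file-backrefs-%d.png' % id(f))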

[1] filed as https://github.com/mgedmin/objgraph/issues/2

Marius Gedminas
--
When in trouble or in doubt,
run in circles, scream and shout.