OutOfMemory Error when fetching large set of data (BLOB)


Espoir Gahungere
Hello everyone,

I'm using DBUnit to export database data, in particular tables containing large files, so there is a lot of BLOB data: about 600 MB in total. The table in question (called "Attachments") holds about 50 files, each weighing several MB. I configured DBUnit to use ForwardOnlyResultSetTableFactory, as recommended when exporting very large datasets. My JVM maximum heap size is set to 1600 MB (and I shouldn't exceed that).
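For reference, the setup described above looks roughly like the following sketch. The DatabaseConfig property and the ForwardOnlyResultSetTableFactory class come from the dbunit API; the JDBC URL, credentials, and output file name are placeholders, and running it of course requires a live MySQL database plus the dbunit and Connector/J jars.

```java
import java.io.FileOutputStream;
import java.sql.Connection;
import java.sql.DriverManager;

import org.dbunit.database.DatabaseConfig;
import org.dbunit.database.DatabaseConnection;
import org.dbunit.database.ForwardOnlyResultSetTableFactory;
import org.dbunit.database.IDatabaseConnection;
import org.dbunit.database.QueryDataSet;
import org.dbunit.dataset.xml.FlatXmlDataSet;

public class ExportAttachments {
    public static void main(String[] args) throws Exception {
        // Placeholder connection details
        Connection jdbc = DriverManager.getConnection(
                "jdbc:mysql://localhost/mydb", "user", "password");
        IDatabaseConnection conn = new DatabaseConnection(jdbc);

        // Use the streaming-friendly table factory recommended for
        // exporting very large datasets
        DatabaseConfig config = conn.getConfig();
        config.setProperty(DatabaseConfig.PROPERTY_RESULTSET_TABLE_FACTORY,
                new ForwardOnlyResultSetTableFactory());

        // Export just the BLOB-heavy table to a flat XML file
        QueryDataSet dataSet = new QueryDataSet(conn);
        dataSet.addTable("Attachments");
        FlatXmlDataSet.write(dataSet, new FileOutputStream("database.xml"));

        conn.close();
    }
}
```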

When I run the application, a few files are processed (i.e., written to database.xml using FlatXMLWriter). But soon after, I get an OutOfMemory error.
I ran the application with JProfiler and noticed that most of the byte arrays were referenced by ResultSet instances of the underlying JDBC driver (MySQL); I assume most of these bytes hold BLOB data fetched from the database.
I even configured DBUnit to fetch one row at a time: config.setProperty(DatabaseConfig.PROPERTY_FETCH_SIZE, 1);

But the error keeps occurring. I was wondering whether the FETCH_SIZE property actually does what I expect: that the ResultSet holds only one row at a time (as if the query were SELECT * ... LIMIT 1).
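One relevant detail here: the fetch size set via PROPERTY_FETCH_SIZE ends up in JDBC's Statement.setFetchSize, which is only a hint to the driver, not a LIMIT clause. With MySQL Connector/J the driver buffers the entire result set in memory regardless of the hint, unless the statement is created forward-only/read-only and the fetch size is Integer.MIN_VALUE, which switches it to row-by-row streaming. A plain-JDBC sketch of that behaviour (placeholder connection details, requires a live MySQL database):

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class StreamingQuery {
    public static void main(String[] args) throws Exception {
        // Placeholder connection details
        Connection c = DriverManager.getConnection(
                "jdbc:mysql://localhost/mydb", "user", "password");

        // MySQL Connector/J only streams rows when the statement is
        // forward-only/read-only AND the fetch size is Integer.MIN_VALUE;
        // any other value (including 1) still buffers the whole result set.
        Statement st = c.createStatement(
                ResultSet.TYPE_FORWARD_ONLY, ResultSet.CONCUR_READ_ONLY);
        st.setFetchSize(Integer.MIN_VALUE);

        ResultSet rs = st.executeQuery("SELECT * FROM Attachments");
        while (rs.next()) {
            // process one row at a time without holding all BLOBs in memory
        }
        rs.close();
        st.close();
        c.close();
    }
}
```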

Can you maybe help me out with this?

Thanks :)


dbunit-user mailing list
[hidden email]