export data from a big table

Barnabas Davoti
Hi,

I have used DBUnit with success so far, but now I want to export big
tables (Oracle, over 2 million rows, 21 columns: NUMBER, TIMESTAMP,
VARCHAR2; no binary data, no BLOBs).

The export fails with:
java.lang.OutOfMemoryError: GC overhead limit exceeded

I trigger this from Ant on a Mac. If I give the process 1 GB of memory
(export ANT_OPTS=-Xmx1024m) it fails after 2 minutes; if I give it
more, 2-4 GB, it runs for 30 minutes and then fails the same way.

I use streaming mode:
http://dbunit.sourceforge.net/faq.html#streaming

<target name="export">
  <dbunit driver="oracle.jdbc.OracleDriver"
          url="jdbc:oracle:thin:@.....:1521:..."
          userid="user"
          password="password"
          schema="schema">

    <dbconfig>
      <property name="http://www.dbunit.org/properties/datatypeFactory"
                value="org.dbunit.ext.oracle.Oracle10DataTypeFactory"/>
      <property name="http://www.dbunit.org/properties/resultSetTableFactory"
                value="org.dbunit.database.ForwardOnlyResultSetTableFactory"/>
      <property name="http://www.dbunit.org/properties/fetchSize" value="100"/>
      <feature name="http://www.dbunit.org/features/batchedStatements"
               value="true"/>
    </dbconfig>

    <export dest="TABLE1.xml" format="xml">
      <table name="TABLE1"/>
    </export>
  </dbunit>
</target>

See the error below.
The stack trace shows the ForwardOnlyResultSetTable class, so this
should be running in streaming mode, but the call comes from
CachedResultSetTable. Confusing.

BR.

Barna


build.xml:41: java.lang.OutOfMemoryError: GC overhead limit exceeded
        at oracle.jdbc.driver.CharCommonAccessor.getString(CharCommonAccessor.java:385)
        at oracle.jdbc.driver.T4CVarcharAccessor.getString(T4CVarcharAccessor.java:411)
        at oracle.jdbc.driver.OracleResultSetImpl.getString(OracleResultSetImpl.java:397)
        at org.dbunit.dataset.datatype.StringDataType.getSqlValue(StringDataType.java:145)
        at org.dbunit.database.ForwardOnlyResultSetTable.getValue(ForwardOnlyResultSetTable.java:104)
        at org.dbunit.dataset.DefaultTable.addTableRows(DefaultTable.java:139)
        at org.dbunit.database.CachedResultSetTable.<init>(CachedResultSetTable.java:69)
        at org.dbunit.database.CachedResultSetTableFactory.createTable(CachedResultSetTableFactory.java:52)
        at org.dbunit.database.AbstractDatabaseConnection.createQueryTable(AbstractDatabaseConnection.java:92)
        at org.dbunit.database.AbstractDatabaseConnection.createTable(AbstractDatabaseConnection.java:144)
        at org.dbunit.database.QueryTableIterator.getTable(QueryTableIterator.java:143)
        at org.dbunit.dataset.CompositeDataSet.<init>(CompositeDataSet.java:94)
        at org.dbunit.dataset.CompositeDataSet.<init>(CompositeDataSet.java:66)
        at org.dbunit.dataset.CompositeDataSet.<init>(CompositeDataSet.java:51)
        at org.dbunit.ant.AbstractStep.getDatabaseDataSet(AbstractStep.java:111)
        at org.dbunit.ant.Export.getExportDataSet(Export.java:244)
        at org.dbunit.ant.Export.execute(Export.java:176)
        at org.dbunit.ant.DbUnitTask.execute(DbUnitTask.java:385)
        at org.apache.tools.ant.UnknownElement.execute(UnknownElement.java:293)






Re: export data from a big table

Jeff Jensen-2
Can you profile it to see what is consuming the memory? Perhaps there
is some kind of leak causing the excessive memory usage that we can fix.



Re: export data from a big table

Barnabas Davoti
Hi Jeff,

Maybe it is just not running in streaming mode. I added this property to
the dbconfig:

<property name="http://www.dbunit.org/properties/resultSetTableFactory"
          value="org.dbunit.database.ForwardOnlyResultSetTableFactory"/>

... but it does not seem to make any difference. The process fails
after 2.5 minutes with or without this property.

I did some profiling, and it looks like the process keeps a huge
number of objects in memory.

In streaming mode I should be able to see the output file growing,
right? I can't find any output files.

BR.

Barna


On 2017-01-16 20:59, Jeff Jensen wrote:
> Can you profile it to see what is consuming the memory? Perhaps there
> is some kind of leak causing the excessive memory usage that we can fix.


Re: export data from a big table

mmistroni
In reply to this post by Barnabas Davoti
Can you export in chunks?
Kind regards


Re: export data from a big table

Tom Chiverton

Or, if doubling the RAM gives roughly 10x the run time, just give it
even more. On a 64-bit OS you could easily give it ten times the RAM?

Tom


On 17/01/17 09:59, Marco Mistroni wrote:
> Can you export in chunks?


Re: export data from a big table

Barnabas Davoti
In reply to this post by mmistroni
On 2017-01-17 10:59, Marco Mistroni wrote:
> Can you export in chunks?

Yes, I can export chunks from the big table like this:
<export dest="TABLE1000.xml">
  <query name="idmap" sql="
        SELECT *
        FROM TABLE
        WHERE ROWNUM &lt; 1000
  "/>
</export>

That runs in "0" seconds. Quick.

... or I can also export entire smaller tables; for example, I got a
600 MB XML file quite quickly.

The big table has 2.5 million records. A single record is not that big,
and there are no BLOBs. Should DBUnit work with this?

On 2017-01-17 11:03, Tom Chiverton wrote:
 > Or, if doubling the RAM gives roughly 10x the run time, just give it
 > even more. On a 64-bit OS you could easily give it ten times the RAM?

Yes, if I double the memory it runs much longer, but it still fails
with the same memory issue.

I'm working on a general extraction proof of concept that should work
with many different databases, so I don't want to just increase the
memory and see if that helps in this specific case.

Instead, I want to make sure I am using DBUnit's streaming mode. How can
I check that? That's the *main question*.

Btw. I use the latest DBUnit version.

BR.

Barna


Re: export data from a big table

Jeff Jensen-2
In reply to this post by Barnabas Davoti
> I did some profiling, and it looks like the process keeps a huge
> number of objects in memory.

If you can analyze these objects and their reference chains, perhaps we
can find the reason they aren't garbage collected. Perhaps there is a
bug to fix here.



Re: export data from a big table

Jeff Jensen-2
In reply to this post by Barnabas Davoti
> I added this property to the dbconfig: ... but it does not seem to
> make any difference. The process fails after 2.5 minutes with or
> without this property.

Perhaps setting it via the Ant task does not work correctly(?).  I
suggest reviewing that code to ensure it instantiates the class
represented by the String and does not just leave it as a String.  The
FAQ http://dbunit.sourceforge.net/faq.html#streaming shows setting
streaming mode in code, and we need to ensure the Ant task does the
same thing:
config.setProperty(DatabaseConfig.PROPERTY_RESULTSET_TABLE_FACTORY, new ForwardOnlyResultSetTableFactory());



Re: export data from a big table

Barnabas Davoti
Hi Jeff,

I ran the export using the Java API instead of the DBUnit Ant task.
That worked perfectly: the output file appeared immediately and I saw
it growing during the process. Low memory usage and pretty fast. I'm happy.
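
For reference, this is roughly what my Java version looks like (a
trimmed sketch following the FAQ; the connection details are elided,
and TABLE1 and the property values are the ones from my build file above):

import java.io.FileOutputStream;
import java.sql.Connection;
import java.sql.DriverManager;

import org.dbunit.database.DatabaseConfig;
import org.dbunit.database.DatabaseConnection;
import org.dbunit.database.ForwardOnlyResultSetTableFactory;
import org.dbunit.database.IDatabaseConnection;
import org.dbunit.dataset.IDataSet;
import org.dbunit.dataset.xml.XmlDataSet;
import org.dbunit.ext.oracle.Oracle10DataTypeFactory;

public class StreamingExport {
    public static void main(String[] args) throws Exception {
        Class.forName("oracle.jdbc.OracleDriver");
        Connection jdbc = DriverManager.getConnection(
                "jdbc:oracle:thin:@.....:1521:...", "user", "password");
        IDatabaseConnection connection = new DatabaseConnection(jdbc, "schema");

        // The same settings as in the Ant dbconfig, set programmatically.
        DatabaseConfig config = connection.getConfig();
        config.setProperty(DatabaseConfig.PROPERTY_DATATYPE_FACTORY,
                new Oracle10DataTypeFactory());
        config.setProperty(DatabaseConfig.PROPERTY_RESULTSET_TABLE_FACTORY,
                new ForwardOnlyResultSetTableFactory());
        config.setProperty(DatabaseConfig.PROPERTY_FETCH_SIZE,
                Integer.valueOf(100));

        // With the forward-only factory, rows are read and written one by
        // one as XmlDataSet serializes the table, so the full table is
        // never held in memory.
        IDataSet dataSet = connection.createDataSet(new String[] {"TABLE1"});
        XmlDataSet.write(dataSet, new FileOutputStream("TABLE1.xml"));

        connection.close();
    }
}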

I also suspected the Ant task implementation and looked through the
code, but I haven't found a bug.

The Ant dbconfig tag sets the property by string:
https://sourceforge.net/p/dbunit/code.git/ci/master/tree/dbunit/src/main/java/org/dbunit/ant/DbConfig.java#l108

... which calls
https://sourceforge.net/p/dbunit/code.git/ci/master/tree/dbunit/src/main/java/org/dbunit/database/DatabaseConfig.java#l346

The property type is a class, and the value gets instantiated via an
if/else-if/else statement, where the else branch uses reflection.
According to the debug log, the props are setup:
[main] DEBUG org.dbunit.ant.Export -
execute(connection=org.dbunit.database.DatabaseConnection[schema=FK,
connection=oracle.jdbc.driver.T4CConnection@7d70d1b1,
super=_databaseConfig=org.dbunit.database.DatabaseConfig[,
_propertyMap={http://www.dbunit.org/features/qualifiedTableNames=false,
http://www.dbunit.org/properties/tableType=[Ljava.lang.String;@4b553d26,
http://www.dbunit.org/properties/batchSize=100,
http://www.dbunit.org/properties/statementFactory=org.dbunit.database.statement.PreparedStatementFactory@69a3d1d,
http://www.dbunit.org/features/batchedStatements=false,
http://www.dbunit.org/features/caseSensitiveTableNames=false,
http://www.dbunit.org/features/allowEmptyFields=false,
http://www.dbunit.org/properties/datatypeFactory=org.dbunit.ext.oracle.Oracle10DataTypeFactory[_toleratedDeltaMap=org.dbunit.dataset.datatype.ToleratedDeltaMap@86be70a],
http://www.dbunit.org/properties/metadataHandler=org.dbunit.database.DefaultMetadataHandler@480bdb19,
http://www.dbunit.org/properties/fetchSize=100,
http://www.dbunit.org/properties/resultSetTableFactory=org.dbunit.database.ForwardOnlyResultSetTableFactory@2a556333,
http://www.dbunit.org/features/datatypeWarning=true,
http://www.dbunit.org/properties/escapePattern=null}], _dataSet=null]) -
start


This line from the chunk above is important:
http://www.dbunit.org/properties/statementFactory=org.dbunit.database.statement.PreparedStatementFactory@69a3d1d

... it should mean that the reflective instantiation did work for
PreparedStatementFactory.

So, sorry, I still don't know what's wrong with the Ant triggering...
but at least I have a good workaround.

BR.

Barna


On 2017-01-18 04:18, Jeff Jensen wrote:

>> I added this property to the dbconfig: ... but it does not seem to
>> make any difference. The process fails after 2.5 minutes with or
>> without this property.
>
> Perhaps setting it via the Ant task does not work correctly(?).  I
> suggest reviewing that code to ensure it instantiates the class
> represented by the String and does not just leave it as a String.  The
> FAQ http://dbunit.sourceforge.net/faq.html#streaming shows setting
> streaming mode in code, and we need to ensure the Ant task does the
> same thing:
> config.setProperty(DatabaseConfig.PROPERTY_RESULTSET_TABLE_FACTORY, new
> ForwardOnlyResultSetTableFactory());


Re: export data from a big table

Barnabas Davoti
Hi Jeff,

My coworker found something.

getDatabaseDataSet() is called with forwardonly=false here:
https://sourceforge.net/p/dbunit/code.git/ci/master/tree/dbunit/src/main/java/org/dbunit/ant/Export.java#l244

... which then leads here:
https://sourceforge.net/p/dbunit/code.git/ci/master/tree/dbunit/src/main/java/org/dbunit/ant/AbstractStep.java#l91
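
In effect (my paraphrase of the linked AbstractStep code, not a
verbatim copy), the flag silently overrides whatever <dbconfig>
configured:

// Paraphrased from the linked AbstractStep.getDatabaseDataSet(): the
// forwardonly flag picks the factory and overwrites the
// resultSetTableFactory property from <dbconfig>. Since Export passes
// forwardonly=false, the cached (in-memory) factory always wins, which
// matches the CachedResultSetTableFactory frames in the stack trace.
IResultSetTableFactory factory = forwardonly
        ? new ForwardOnlyResultSetTableFactory()
        : new CachedResultSetTableFactory();
connection.getConfig().setProperty(
        DatabaseConfig.PROPERTY_RESULTSET_TABLE_FACTORY, factory);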

BR.

Barna



------------------------------------------------------------------------------
Check out the vibrant tech community on one of the world's most
engaging tech sites, SlashDot.org! http://sdm.link/slashdot
_______________________________________________
dbunit-user mailing list
[hidden email]
https://lists.sourceforge.net/lists/listinfo/dbunit-user
Reply | Threaded
Open this post in threaded view
|

Re: export data from a big table

Jeff Jensen-2
Excellent find!  Thank you for tracking it down.

Would you mind creating a defect report [0]?  If you can create a pull
request or attach a patch, that's best because then I can apply it
asap.  It would be great if you could correct or add tests that prove
this was broken and is fixed by the change.

Thank you again!




Re: export data from a big table

Barnabas Davoti
Hi Jeff,

On 2017-01-18 17:51, Jeff Jensen wrote:
> Would you mind creating a defect report [0]?

I've filed one.

> If you can create a pull request or attach a patch, that's best
> because then I can apply it asap.  It would be great if you could
> correct or add tests that prove this was broken and is fixed by the
> change.

We'll fix the code, add a unit test, and make a pull request.

BR.

Barna

