Recently, I have had a number of discussions regarding data loads and queries against MySQL-based databases (including Infobright) over a JDBC connection. There are specific memory issues during queries and batch inserts through JDBC drivers that can be addressed. As you might expect, various articles already cover related topics such as slow JDBC batch inserts (insert applies only to IEE in the case of Infobright), and some blogs recommend a specific driver such as the Drizzle JDBC driver, which performs better if you have no alternative to a JDBC driver. If you do have flexibility, the Infobright loader (ICE or IEE), the MySQL loader, DLP (IEE), or a third-party ETL tool will give much faster load and query speeds than JDBC. So in the end, using a JDBC driver for bulk work is highly discouraged.
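If you are stuck with JDBC for batch inserts, Connector/J's documented `rewriteBatchedStatements=true` URL option asks the driver to rewrite a client-side batch into multi-row INSERT statements, cutting round trips. A minimal sketch, assuming a hypothetical host, schema, and table:

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.util.List;

public class BatchInsertSketch {

    // Hypothetical URL; rewriteBatchedStatements=true tells Connector/J to
    // collapse the batch below into multi-row INSERTs.
    static final String URL =
            "jdbc:mysql://dbhost:3306/mydb?rewriteBatchedStatements=true";

    // Queue every row client-side, then send the whole batch in one call.
    static int insertNames(Connection conn, List<String> names) throws SQLException {
        try (PreparedStatement ps =
                     conn.prepareStatement("INSERT INTO t (name) VALUES (?)")) {
            for (String name : names) {
                ps.setString(1, name);
                ps.addBatch();                    // buffered in the driver, not sent yet
            }
            int total = 0;
            for (int count : ps.executeBatch()) { // one round trip for the whole batch
                total += count;
            }
            return total;
        }
    }
}
```

Obtain `conn` via `DriverManager.getConnection(URL, user, password)`; without the URL flag the same code still runs, but the driver sends one INSERT per row.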
The main options for avoiding memory contention between the database and JDBC connections are to either stream query results to the application rather than buffer them, or to isolate the Java service on a separate VM or server. If a query produces a large result set and the driver must fit it all into memory, you will obviously run into problems, because by default Connector/J buffers the entire result set in the JVM heap. In my experience, simply moving the Java service to another server has also worked consistently, so it is well worth trying.
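For the streaming option, Connector/J's documented convention is a forward-only, read-only statement with a fetch size of `Integer.MIN_VALUE`; the driver then streams rows one at a time instead of materializing the whole result set. A minimal sketch, with a hypothetical query and table name:

```java
import java.sql.Connection;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

public class StreamingQuerySketch {

    // Connector/J streams row by row when the statement is forward-only,
    // read-only, and its fetch size is set to Integer.MIN_VALUE.
    static Statement createStreamingStatement(Connection conn) throws SQLException {
        Statement stmt = conn.createStatement(
                ResultSet.TYPE_FORWARD_ONLY,
                ResultSet.CONCUR_READ_ONLY);
        stmt.setFetchSize(Integer.MIN_VALUE);
        return stmt;
    }

    // Iterate without ever holding more than one row in application memory.
    static void scan(Connection conn) throws SQLException {
        try (Statement stmt = createStreamingStatement(conn);
             ResultSet rs = stmt.executeQuery("SELECT id FROM big_table")) {
            while (rs.next()) {
                // process rs.getLong("id") one row at a time
            }
        }
    }
}
```

One caveat worth knowing: while a streaming result set is open, Connector/J will not let you issue other statements on the same connection until that result set is fully read or closed.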
For your convenience, I have also included some additional links on managing large result sets with Connector/J:
MySQL API documentation for Connector/J
MySQL JDBC Streaming results example
DataDirect JDBC Driver (potential alternative to MySQL)
MySQL Presentation on JDBC Performance
Known Bugs with Connector/J
I hope this helps you identify potential issues in your environment; it is good to check all possible sources of a problem rather than just the one that appears obvious. Good luck, and please let us know if you have any other comments that will help everyone.