
Out of Memory Error on JDBC Multitable Consumer despite small fetch size?

asked 2017-10-19 10:39:21 -0600 by andytroiano

updated 2017-10-19 14:17:42 -0600 by metadaddy

I am looking to pull a few different MySQL tables into HDFS.

I have the flow configured, but I keep getting an out of memory error when I run the job against a large table; the exact error is below. I don't see any issues when I configure it to read only a small table with a limited number of rows. I have kept all of the advanced settings at their defaults.

com.google.common.util.concurrent.ExecutionError: java.lang.OutOfMemoryError: Java heap space

Any ideas on why pulling in a large table is blowing up my memory? I thought fetching a small number of rows at a time would solve this.


Comments

Can you paste in the full stack trace? That should give some insight into what's happening.

metadaddy ( 2017-10-19 14:17:29 -0600 )

1 Answer


answered 2017-12-04 15:27:54 -0600 by jeff

There is currently a bug in the JDBC Multitable Consumer when it is used in conjunction with MySQL and large tables. See details here. In essence, please try setting the fetch size explicitly to -2147483648 (which is Integer.MIN_VALUE); this value tells the MySQL driver to stream rows one at a time rather than buffering the entire result set in memory.
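For background, here is a minimal plain-JDBC sketch of why that particular value matters with MySQL Connector/J (the connection URL, credentials, and table name are hypothetical placeholders, and this illustrates the driver behavior rather than the consumer's internals). By default the driver materializes the whole result set in the JVM heap regardless of the fetch size hint; it only streams rows when the fetch size is exactly Integer.MIN_VALUE on a forward-only, read-only statement:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.SQLException;
    import java.sql.Statement;

    public class MySqlStreamingFetch {
        public static void main(String[] args) throws SQLException {
            // Hypothetical connection details; substitute your own.
            String url = "jdbc:mysql://localhost:3306/mydb";
            try (Connection conn = DriverManager.getConnection(url, "user", "password");
                 // Streaming requires a forward-only, read-only statement.
                 Statement stmt = conn.createStatement(ResultSet.TYPE_FORWARD_ONLY,
                                                       ResultSet.CONCUR_READ_ONLY)) {
                // With any other fetch size, Connector/J buffers the entire
                // result set in memory, which is the likely cause of the
                // heap space error above on a large table.
                stmt.setFetchSize(Integer.MIN_VALUE);
                try (ResultSet rs = stmt.executeQuery("SELECT * FROM big_table")) {
                    while (rs.next()) {
                        // Process one row at a time; only the current row
                        // is held in the JVM heap.
                    }
                }
            }
        }
    }

In the pipeline, the equivalent should be entering -2147483648 in the origin's fetch size setting so that the value is passed through to the driver unchanged.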

