Prevent multiple active batches in Spark Streaming
I am running a Spark Streaming job with a streaming context batch interval of 60 seconds. The problem is that the processing time of one batch is long (due to calculations and saving the RDD as Parquet to cloud storage), so a batch cannot finish within the one-minute timeframe. The next batch then keeps coming in and becomes active (status = processing). After a while I have 10 active batches in processing while the first one still hasn't finished. As a result everything slows down and no batch is able to finish. Is there any way to strictly limit the number of active batches to one at a time?
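For reference, here is a minimal sketch of the kind of setup described above, plus the configuration knobs that seem relevant to the question (`spark.streaming.concurrentJobs`, which defaults to 1, and the backpressure/receiver rate settings). The socket source, app name, and the Parquet output path are placeholders, not the real job:

```scala
import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession
import org.apache.spark.streaming.{Seconds, StreamingContext}

object SequentialBatchSketch {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("sequential-batch-sketch")
      // Only one output job at a time (1 is already the default).
      .set("spark.streaming.concurrentJobs", "1")
      // Let Spark throttle ingestion when batches fall behind.
      .set("spark.streaming.backpressure.enabled", "true")
      // Optional hard cap on records per second per receiver.
      .set("spark.streaming.receiver.maxRate", "1000")

    // 60-second batch interval, as in the question.
    val ssc = new StreamingContext(conf, Seconds(60))

    // Hypothetical input; the real job reads from its own source.
    val lines = ssc.socketTextStream("localhost", 9999)

    lines.foreachRDD { (rdd, time) =>
      if (!rdd.isEmpty()) {
        val spark = SparkSession.builder
          .config(rdd.sparkContext.getConf)
          .getOrCreate()
        import spark.implicits._

        // Heavy per-batch work followed by a Parquet write to cloud
        // storage (the bucket/path below is a placeholder).
        val df = rdd.toDF("value")
        df.write
          .mode("append")
          .parquet(s"gs://my-bucket/output/batch-${time.milliseconds}")
      }
    }

    ssc.start()
    ssc.awaitTermination()
  }
}
```

With the default `concurrentJobs` setting, batches that cannot finish within the interval should queue rather than run concurrently, so enabling backpressure or capping the receiver rate is one possible way to keep the queue from growing; whether that fully solves the slowdown described here is not something the question settles.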
Thanks.