Recently, I inherited an Org with heavy customization that enqueues 100+ batch jobs in a few scenarios. Now, don't ask me why!
I remember that a few years back, Salesforce had a limit of five Batch Apex jobs that could execute at a time, but we had expectations and demands! Salesforce then introduced the flex queue, which allows up to 100 Batch Apex jobs to wait for execution, and still we were not happy. After all, the human is a wanting animal.
Once 100 Batch Apex jobs were queued, every additional job failed with the error `System.AsyncException: You have exceeded the limit of 100 jobs in the flex queue`. As I shared my story in a previous post, I had to fix this issue as well.
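One quick mitigation (a sketch of the idea, not necessarily the exact fix used here) is to check how many jobs are already holding in the flex queue before enqueuing another batch. Jobs waiting in the flex queue appear in `AsyncApexJob` with a status of `Holding`; the method name and threshold below are illustrative assumptions:

```apex
// Sketch: guard against the 100-job flex queue limit before enqueuing.
// safeExecuteBatch is a hypothetical helper; adjust the threshold as needed.
public static Id safeExecuteBatch(Database.Batchable<SObject> batch, Integer scopeSize) {
    // Jobs waiting in the flex queue have Status = 'Holding'
    Integer holdingJobs = [
        SELECT COUNT()
        FROM AsyncApexJob
        WHERE JobType = 'BatchApex' AND Status = 'Holding'
    ];
    if (holdingJobs >= 100) {
        // Flex queue is full; the caller can retry later, e.g. from a Schedulable
        return null;
    }
    return Database.executeBatch(batch, scopeSize);
}
```

This only defers the problem rather than solving it, but it turns a hard `System.AsyncException` into a retryable condition.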
The right way to fix it would have been to analyze the existing code, perform a code review, and question why the customization was needed at all. However, time was critical and I had to do something quickly.
The framework below was used to fix the issue.
In this post, I will share my recent experience fixing the "CPU time limit exceeded" error in Batch Apex, and the reason behind it. When I first encountered this error, I thought it would be easy to fix by following some basic rules:
- Remove unnecessary code and loops
- Use collections (Set, Map, or List) efficiently
- Avoid loops within loops
- Make sure SOQL queries are selective (indexed)
- Avoid unnecessary re-initialization of variables
- Use aggregate SOQL where possible (database operations do not count against this limit, so avoid doing the arithmetic in Apex)
- Check how much time workflow rules and Process Builders are taking
- Check whether any managed package is performing heavy operations in the transaction
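To illustrate a couple of these rules, here is a small before/after sketch (the object and field names are assumptions) showing a loop-within-loop replaced by a Map lookup:

```apex
// Slow: nested loop burns CPU time with O(n * m) comparisons
for (Contact c : contacts) {
    for (Account a : accounts) {
        if (c.AccountId == a.Id) {
            c.Description = a.Name;
        }
    }
}

// Faster: build a Map once, then do constant-time lookups
Map<Id, Account> accountsById = new Map<Id, Account>(accounts);
for (Contact c : contacts) {
    Account a = accountsById.get(c.AccountId);
    if (a != null) {
        c.Description = a.Name;
    }
}
```

On large batches, this kind of change alone can bring a transaction back under the CPU limit.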
This piece of code was handed over to me by the previous team, so I was not fully aware of its functionality and decided to check the debug logs. To my surprise, no matter how many times I tried to get a log, every attempt failed. I assumed the problem was in the execute method, yet the Batch Apex was failing with no debug logs at all. I tried all my tricks to capture a debug log, with no success. The Batch Apex was using a query locator, which can fetch up to 50 million records, and so I had overlooked the start method.
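For context, here is a minimal Batch Apex skeleton (the query is illustrative, not the inherited code). The key point: if the start method fails, execute() never runs at all, which is why no execute-phase debug logs ever appear:

```apex
// Minimal Batch Apex skeleton; SampleBatch and its query are illustrative.
public class SampleBatch implements Database.Batchable<SObject> {
    public Database.QueryLocator start(Database.BatchableContext bc) {
        // A QueryLocator can iterate up to 50 million records;
        // a failure here aborts the job before any execute() call
        return Database.getQueryLocator('SELECT Id FROM Account');
    }
    public void execute(Database.BatchableContext bc, List<SObject> scope) {
        // per-chunk processing logic goes here
    }
    public void finish(Database.BatchableContext bc) {
        // post-processing, notifications, chaining, etc.
    }
}
```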