Is there a way to throttle the number of records updated in Salesforce?

asked 2020-04-27 23:52:45 -0500

oberfirer

Hello, I'm trying to implement a pipeline that reads from a web service and updates records in Salesforce. It works fine with a small number of test records (20-30), but I'm running into an issue updating a larger dataset, currently around 400 records (and it will increase). The Salesforce object being updated (Opportunity) has a large number of triggers attached to it, so each record takes a while to update, and eventually I get an "Apex CPU time limit exceeded" error.

I'm aware of the "Rate Limit" setting on the overall pipeline, but it doesn't help here: the origin calls a web service and receives only one record (a file download link), and the pipeline only fans out into 400 records in subsequent stages.

Are there any other methods for throttling how many records get updated in the Salesforce destination from a previous stage? Is there a "Batch Size" setting for the Salesforce destination (I could not find one)? Or could a scripting stage upstream serve as a "loop" that iterates over the 400 records in smaller batches of, say, 20 or 30?
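In case it helps clarify the question, the "loop" idea I have in mind is essentially this chunking pattern (a minimal sketch only; the record shape, the `006…` Opportunity IDs, and the batch size of 25 are placeholders, not anything from the actual StreamSets or Salesforce APIs):

```python
# Sketch of the batching idea: split a large record set into
# fixed-size chunks so each downstream Salesforce update handles
# only a small number of records at a time.

def chunked(records, batch_size):
    """Yield successive batch_size-sized slices of records."""
    for start in range(0, len(records), batch_size):
        yield records[start:start + batch_size]

# Hypothetical stand-in for the 400 records the pipeline produces.
records = [{"Id": f"006{i:04d}", "StageName": "Prospecting"} for i in range(400)]

batches = list(chunked(records, 25))
print(len(batches))     # → 16
print(len(batches[0]))  # → 25
```

So the question is whether a scripting stage can do something like this between the upstream stage and the Salesforce destination, or whether the destination itself can be configured to commit in smaller batches.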

Thank you! Oleg
