How to Deal With Airtable's 30-Second Script Limit in Automations
Airtable's automation scripts have a hard execution limit of 30 seconds. When a script takes longer than that, Airtable cuts it off and marks the automation run as failed.
If you are processing a large table, looping through hundreds of records, or running a complex calculation across many rows, 30 seconds goes faster than you expect.
Here are four ways to handle it, starting with the fastest fix and working toward the more structural solutions.
1. Batch Your Record Updates
Unbatched record updates are the most common cause of scripts timing out, and fixing them is usually the quickest win.
Most people write their first Airtable script by updating records one at a time inside a loop, like this:
for (let record of records) {
  await table.updateRecordAsync(record.id, { Status: "Done" });
}
The problem is that each updateRecordAsync call is a separate API request. If you have 200 records, that is 200 separate calls, each with its own network overhead. It is slow and it will almost certainly time out.
Airtable's API lets you update up to 50 records in a single call using updateRecordsAsync (plural). Rewriting the same logic with batching looks like this:
let updates = records.map((record) => ({
  id: record.id,
  fields: { Status: "Done" },
}));

while (updates.length > 0) {
  await table.updateRecordsAsync(updates.slice(0, 50));
  updates = updates.slice(50);
}
Instead of 200 API calls, you are now making 4. The script runs dramatically faster and is far less likely to hit the 30-second limit.
The same principle applies to creating and deleting records. Use createRecordsAsync and deleteRecordsAsync instead of their singular equivalents whenever you are working with more than one record.
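The 50-record slicing loop can be factored into a small helper and reused across updates, creates, and deletes. Here is a minimal sketch; the `newRecords` and `recordIdsToDelete` variables in the commented usage are hypothetical stand-ins for whatever your script has built:

```javascript
// Generic helper: split an array into batches of up to 50, the maximum
// Airtable accepts per updateRecordsAsync / createRecordsAsync /
// deleteRecordsAsync call.
function chunk(items, size = 50) {
  let batches = [];
  for (let i = 0; i < items.length; i += size) {
    batches.push(items.slice(i, i + size));
  }
  return batches;
}

// Hypothetical usage inside an automation script:
// for (let batch of chunk(newRecords)) {
//   await table.createRecordsAsync(batch);
// }
// for (let batch of chunk(recordIdsToDelete)) {
//   await table.deleteRecordsAsync(batch);
// }
```

Keeping the batching in one helper means you only have to get the 50-record rule right once.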
2. Optimise the Script Logic
If you are already batching and still hitting the limit, the next place to look is the logic itself.
A few patterns that slow scripts down significantly:
Fetching records more than once. Every selectRecordsAsync call takes time. If your script calls it multiple times to get different fields, combine them into a single call with all the fields you need.
// Slow: two separate fetches
let records = await table.selectRecordsAsync({ fields: ["Name"] });
let moreRecords = await table.selectRecordsAsync({ fields: ["Status"] });
// Faster: one fetch with both fields
let records = await table.selectRecordsAsync({ fields: ["Name", "Status"] });
Nested loops over large datasets. If you have a loop inside a loop and both are iterating over large arrays, the number of operations multiplies quickly. Look for whether you can replace the inner loop with a lookup using a JavaScript Map or object, which is much faster than iterating.
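To make the Map pattern concrete, here is a sketch with made-up data: instead of scanning a `customers` array once per order (an inner loop), the lookup is built once and each order resolves its customer in constant time. The field and table names are illustrative, not from any real base:

```javascript
// Illustrative data standing in for records fetched from two tables.
let customers = [
  { id: "c1", name: "Acme" },
  { id: "c2", name: "Globex" },
];
let orders = [
  { id: "o1", customerId: "c2" },
  { id: "o2", customerId: "c1" },
];

// Build the lookup once, rather than scanning customers for every order.
let customerById = new Map(customers.map((c) => [c.id, c]));

let labeled = orders.map((order) => ({
  ...order,
  customerName: customerById.get(order.customerId)?.name ?? "Unknown",
}));
```

With 1,000 orders and 1,000 customers, the nested-loop version does a million comparisons; the Map version does roughly two thousand operations.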
Unnecessary awaits inside loops. If you have async calls inside a loop that do not depend on each other, you can run them in parallel using Promise.all rather than waiting for each one to finish before starting the next.
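As a sketch of the difference, the `slowTask` function below stands in for any independent async call. Awaiting each call in a loop takes the sum of their durations; `Promise.all` takes roughly the longest single one. Note that Airtable still applies its own rate limiting to mutations, so this tends to help most with independent reads:

```javascript
// Stand-in for an independent async operation (e.g. an API request).
async function slowTask(value) {
  await new Promise((resolve) => setTimeout(resolve, 100));
  return value * 2;
}

async function run() {
  // Sequential version (commented out): each await blocks the next,
  // so three 100ms tasks take ~300ms in total.
  // let results = [];
  // for (let v of [1, 2, 3]) results.push(await slowTask(v));

  // Parallel version: all three start immediately, ~100ms in total.
  let results = await Promise.all([1, 2, 3].map(slowTask));
  return results;
}
```

Only do this when the calls genuinely do not depend on each other's results; if one call needs the output of another, they have to stay sequential.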
These changes do not always solve a timeout on their own, but combined with batching they can make a significant difference.
3. Split the Script Across Multiple Automation Steps
If the script is genuinely doing a large amount of work that cannot be compressed further, the solution is to split it into smaller pieces and chain them together inside the same automation.
The idea is simple. Instead of one script that does everything and runs for 90 seconds, you have three scripts that each do one part and each run in under 30 seconds.
The way to pass data between them is through a field in your table. The first script processes a batch of records and writes a flag value to each one (for example, setting a "Batch processed" field to "Step 1 done"). The second script triggers on that flag value and processes the next stage, writing its own flag when complete. The third script does the same.
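One way to structure the first step is to separate the pure payload-building logic from the Airtable calls, which also makes it easy to test. The "Batch processed" field and "Step 1 done" value follow the example above; the plain `{ id, flag }` shape and the commented usage are assumptions for illustration:

```javascript
// Pure helper: given plain { id, flag } summaries of records, build the
// batched update payloads for step one. Records an earlier run already
// flagged are skipped, so re-runs are safe.
function buildStepOneUpdates(records) {
  return records
    .filter((r) => !r.flag)
    .map((r) => ({
      id: r.id,
      fields: { Status: "Done", "Batch processed": "Step 1 done" },
    }));
}

// Hypothetical usage inside the first automation script:
// let query = await table.selectRecordsAsync({ fields: ["Batch processed"] });
// let summaries = query.records.map((rec) => ({
//   id: rec.id,
//   flag: rec.getCellValueAsString("Batch processed"),
// }));
// let updates = buildStepOneUpdates(summaries);
// while (updates.length > 0) {
//   await table.updateRecordsAsync(updates.slice(0, 50));
//   updates = updates.slice(50);
// }
```

The second script then triggers on records whose "Batch processed" value is "Step 1 done" and writes its own flag when it finishes.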
You can also use a dedicated "processing queue" table for more complex pipelines. The first script writes records to the queue table to indicate what needs to be processed next. Subsequent automations trigger off new records appearing in that queue table.
This approach requires more upfront thinking about how to divide the work, but it handles genuinely large datasets that would never fit in a single 30-second window regardless of how optimised the code is.
4. Use the Scripting Extension for Manual Runs
If your script does not need to run automatically and you only trigger it occasionally, move it out of automations entirely and into the Scripting extension instead.
The Scripting extension runs in your browser rather than on Airtable's automation servers, and it has no time limit. It can run for as long as it needs to, process as many records as it needs to, and make as many API calls as required.
To use it, open your base, click the Extensions icon in the toolbar, and add the Scripting extension. Paste your script there and run it manually whenever you need it.
The key limitation is that the Scripting extension cannot be triggered automatically. You cannot kick it off from a form submission, a record change, or a scheduled time. It only runs when a person with base access manually clicks Run inside the extension. If your workflow requires automation, this option does not apply.
For scripts that genuinely only need to run once in a while (bulk data migrations, monthly reports, one-off cleanup operations), the Scripting extension is a much more relaxed environment than automations.
Choosing the Right Approach
Start with batching. It fixes the majority of timeout issues and requires minimal changes to your existing script.
If batching is not enough, review the script logic for unnecessary fetches and nested loops.
If the script is genuinely too large for 30 seconds no matter what, split it across multiple steps.
If it does not need to run automatically at all, move it to the Scripting extension.
Most timeout problems are solved at step one. The chaining approach in step three is more work to set up but is the right solution when you are processing genuinely large volumes of data regularly.
For related automation problems, if you are running into Airtable's limit on the total number of automations per base rather than the script execution time, see Hitting Airtable's 50 Automation Limit? Here's What You Can Do.
If you are concerned about your monthly automation run quota, see How to Limit Automation Runs in Airtable to Avoid Running Out of Quota.