Troubleshooting “Execution Timeout” in Long-Running Apps Scripts
Your Apps Script processes 50,000 rows applying conditional logic. At row 28,000, the script halts with “Exceeded maximum execution time.” Apps Script enforces a hard per-execution runtime limit: 6 minutes for consumer (gmail.com) accounts and 30 minutes for Google Workspace accounts. For long-running tasks, optimizing within a single execution is not enough; you need a multi-pass pattern.
Before You Start: The 60-Second Diagnostic
Three checks:
- Confirm your time limit: consumer (gmail.com) accounts get 6 minutes per execution; Google Workspace accounts get 30 minutes.
- Identify the bottleneck: add console.log(Date.now()) at the start and end of each major phase; the deltas reveal which operations are slow.
- Estimate the full-run time: at 0.05 s per row, 50,000 rows take 2,500 seconds, so you need ~7 passes of 6 minutes each.
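The pass estimate above is simple arithmetic; a minimal helper in plain JavaScript (the function name and inputs are illustrative, and the per-row time is something you measure for your own job):

```javascript
// Estimate how many executions a job needs, given a measured
// per-row processing time and the per-execution time limit.
function estimatePasses(rowCount, secondsPerRow, limitSeconds) {
  const totalSeconds = rowCount * secondsPerRow;
  return Math.ceil(totalSeconds / limitSeconds);
}

// 50,000 rows at 0.05 s each against the 6-minute (360 s) limit:
console.log(estimatePasses(50000, 0.05, 360)); // 7
```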
Step-by-Step Solution
Batch Read/Write Operations
The single biggest performance improvement: replace per-cell getValue/setValue with bulk operations.
Slow (one Sheets API call per cell):
```javascript
const sheet = SpreadsheetApp.getActiveSheet();
for (let i = 2; i <= 50000; i++) {
  const val = sheet.getRange(i, 1).getValue();
  sheet.getRange(i, 2).setValue(val * 2);
}
```
Fast:
```javascript
const sheet = SpreadsheetApp.getActiveSheet();
const range = sheet.getRange(2, 1, 50000, 2);
const data = range.getValues();
for (let i = 0; i < data.length; i++) {
  data[i][1] = data[i][0] * 2;
}
range.setValues(data);
```
Two API calls (one read, one write) instead of 100,000 calls. Can reduce 6-minute timeouts to seconds.
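On very large sheets, even a single bulk getValues can strain memory, so a middle ground is reading and writing in fixed-size chunks. The chunk arithmetic is plain JavaScript and can be sketched on its own (chunkRanges is an illustrative helper, not an Apps Script API):

```javascript
// Split a 1-based row span into [startRow, numRows] pairs,
// suitable for sheet.getRange(startRow, col, numRows, numCols).
function chunkRanges(firstRow, lastRow, chunkSize) {
  const chunks = [];
  for (let start = firstRow; start <= lastRow; start += chunkSize) {
    const numRows = Math.min(chunkSize, lastRow - start + 1);
    chunks.push([start, numRows]);
  }
  return chunks;
}

// 50,000 data rows (rows 2..50001) in 20,000-row chunks:
console.log(chunkRanges(2, 50001, 20000));
// [[2, 20000], [20002, 20000], [40002, 10000]]
```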
Implement Resume-After-Timeout Pattern
For tasks that genuinely need multiple passes:
```javascript
function processInBatches() {
  // Delete any trigger that launched this run so triggers don't
  // accumulate (scripts are limited in how many triggers they can hold).
  ScriptApp.getProjectTriggers().forEach(t => {
    if (t.getHandlerFunction() === 'processInBatches') ScriptApp.deleteTrigger(t);
  });

  const props = PropertiesService.getScriptProperties();
  const startIndex = parseInt(props.getProperty('progressIndex') || '0', 10);
  const sheet = SpreadsheetApp.getActiveSheet();
  const totalRows = sheet.getLastRow() - 1; // Exclude header
  const batchSize = 1000;
  const endIndex = Math.min(startIndex + batchSize, totalRows);

  const range = sheet.getRange(startIndex + 2, 1, endIndex - startIndex, 5);
  const data = range.getValues();

  // Process data
  for (let i = 0; i < data.length; i++) {
    data[i][4] = doProcessing(data[i]); // Custom logic
  }
  range.setValues(data);

  // Save progress
  if (endIndex < totalRows) {
    props.setProperty('progressIndex', endIndex.toString());
    // Trigger next batch automatically (clock triggers fire at
    // roughly minute granularity, not to the second)
    ScriptApp.newTrigger('processInBatches')
      .timeBased()
      .after(60 * 1000) // about 1 minute later
      .create();
  } else {
    // Done: clean up
    props.deleteProperty('progressIndex');
    console.log('Processing complete: ' + totalRows + ' rows');
  }
}
```
The script processes one batch, saves progress, and schedules itself for the next batch. After many invocations, the entire dataset is processed.
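The progress bookkeeping in that pattern reduces to a small pure function. Shown here without the Apps Script services so it can be tested in isolation (nextBatch is an illustrative name):

```javascript
// Given a saved start index, compute the next batch's bounds
// and whether that batch finishes the job.
function nextBatch(startIndex, batchSize, totalRows) {
  const endIndex = Math.min(startIndex + batchSize, totalRows);
  return { startIndex, endIndex, done: endIndex >= totalRows };
}

console.log(nextBatch(0, 1000, 50000));     // { startIndex: 0, endIndex: 1000, done: false }
console.log(nextBatch(49500, 1000, 50000)); // { startIndex: 49500, endIndex: 50000, done: true }
```

Keeping this logic pure means the only state that has to survive between executions is a single integer in Script Properties.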
Use Time-Based Triggers for Background Work
For recurring long tasks, schedule them as time-driven triggers:
- Apps Script editor → Triggers → Add Trigger.
- Function: your batch processor.
- Event type: Time-driven.
- Type: Every hour (or appropriate frequency).
- Save.
Each run handles whatever work fits in its time limit. With 1,000-row batches and roughly 6 batches per run, 24 hourly firings clear about 144 batches (144,000 rows) per day.
Avoid Common Performance Killers
- Loops calling sheet methods: each sheet.getRange() or sheet.getDataRange() call is expensive. Cache results.
- String concatenation in loops: collect pieces in an array and use array.join() instead.
- SpreadsheetApp.flush(): only call when truly needed (before reading after writes).
- UrlFetchApp without caching: cache external API responses in Properties to avoid re-fetching.
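The string-concatenation point is easy to demonstrate in plain JavaScript: collect pieces in an array and join once at the end, rather than growing a string inside the loop (buildReport is an illustrative name):

```javascript
// Build one large string with push + join instead of repeated
// string concatenation, which reallocates on every iteration.
function buildReport(rows) {
  const lines = [];
  for (const row of rows) {
    lines.push(row.join(','));
  }
  return lines.join('\n');
}

console.log(buildReport([[1, 'a'], [2, 'b']])); // "1,a\n2,b"
```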
A profile-everything approach:
```javascript
console.time('readData');
const data = sheet.getDataRange().getValues();
console.timeEnd('readData');

console.time('processData');
const result = process(data);
console.timeEnd('processData');

console.time('writeData');
sheet.getRange(1, 1, result.length, result[0].length).setValues(result);
console.timeEnd('writeData');
```
The console timer reveals which phase dominates. Optimize the biggest first.
Migrate to External Compute for Very Large Tasks
For tasks requiring hours of compute:
- Export data to CSV.
- Process externally (Python on Cloud Run, Google Colab, BigQuery).
- Import results back.
Apps Script is ideal for tasks under 30 minutes total. For multi-hour analytics, dedicated compute platforms are more cost-effective and reliable.
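For the export step, the 2-D array you already have from getValues can be serialized to CSV in a few lines. This sketch handles quoting of commas, quotes, and newlines, but is not a complete RFC 4180 implementation:

```javascript
// Serialize a 2-D array of values into CSV text.
// Fields containing commas, quotes, or newlines get quoted,
// with embedded quotes doubled.
function toCsv(rows) {
  return rows.map(row =>
    row.map(v => {
      const s = String(v);
      return /[",\n]/.test(s) ? '"' + s.replace(/"/g, '""') + '"' : s;
    }).join(',')
  ).join('\n');
}

console.log(toCsv([['id', 'note'], [1, 'hello, world']]));
// id,note
// 1,"hello, world"
```

In Apps Script you would hand the resulting string to DriveApp or an HTTP endpoint; outside Apps Script the same function works on any array of arrays.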
The Hidden Daily Trigger Quota
Here is the quota that catches background-processing teams: Apps Script enforces a total runtime quota across all time-driven triggers of 90 minutes per day for consumer accounts and 6 hours (360 minutes) per day for Google Workspace accounts.
If your batch processor runs 10 times a day at 6 minutes each, that’s 60 minutes — within free quota. Add a few more triggers or longer runs, and quota exhaustion silently disables the trigger for the day.
To monitor:
1. Apps Script editor → Executions shows runtime per execution.
2. Sum across your triggers per 24-hour window.
3. If approaching quota, redesign to use fewer or shorter runs.
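Step 2 can be automated: sum the recent execution durations and compare against the daily quota. A plain-JavaScript sketch; the durations would come from the Executions panel or the Apps Script API, and quotaUsage is an illustrative name:

```javascript
// Check total trigger runtime (durations in seconds) against a
// daily quota expressed in minutes.
function quotaUsage(durationsSeconds, quotaMinutes) {
  const usedMinutes = durationsSeconds.reduce((a, b) => a + b, 0) / 60;
  return { usedMinutes, remainingMinutes: quotaMinutes - usedMinutes };
}

// Ten 6-minute runs against the 90-minute consumer quota:
console.log(quotaUsage(Array(10).fill(360), 90));
// { usedMinutes: 60, remainingMinutes: 30 }
```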
For very heavy automation, business-tier Workspace plans expand quotas significantly. The free tier is for development; production should use a paid tier.
This quota is documented but easy to overlook. Teams scale up triggers and discover the limit only when daily jobs start failing without obvious cause.
Comparison Table: Wrong Way vs. Correct Way
| Issue | Wrong Way | Correct Way |
|---|---|---|
| Per-cell read/write | getValue per cell | Bulk getValues/setValues |
| Long task | Single execution | Batched with resume pattern |
| Recurring long task | Manual trigger | Time-driven trigger every N hours |
| Performance audit | Eyeball runtime | console.time() per operation |
| Memory issues | Process all data in RAM | Stream in batches |
| Hourly quota | Hope it doesn’t hit | Monitor and adjust trigger frequency |
| Very large datasets | Apps Script | External compute (Python, BigQuery) |
Figure Descriptions
Figure 1: Apps Script editor showing the “Exceeded maximum execution time” error after 6 minutes; the Executions panel shows the run lasted 360 seconds while processing a 50,000-row sheet. Annotation: the 6-minute hard limit means a single run cannot finish.
Figure 2: The corrected script using batched processing with the resume pattern; the Executions panel shows multiple successful runs of ~30 seconds each, with the progress flag visible in Script Properties. Annotation: many short runs add up to complete the task.
Frequently Asked Questions
Q: How is the 6-minute limit measured?
A: It’s wall-clock time from when the function starts executing until it returns. Network calls, sleeps, and computation all count. The limit is enforced strictly: at 6 minutes (30 minutes for Workspace accounts), the execution is killed mid-statement.
Q: Can I use multi-threading to speed up Apps Script?
A: No — Apps Script is single-threaded. Each function executes serially. The only “parallelism” is through multiple independent trigger executions, but each still runs single-threaded within itself.
Q: Does the time limit apply to custom functions in cells?
A: Yes, and it’s tighter: 30 seconds for custom functions. For long operations, refactor the work into an installed trigger that writes results to cells, then have the custom function read from those cells. This avoids the 30-second custom-function limit and keeps sheet recalculation fast.