⚙️

Data Processing Optimization in No Code

Jul 11, 2025

Summary

  • The meeting covered advanced techniques for reducing make.com operations by offloading data processing to Airtable automations and scripts.
  • Three methods were compared: direct processing in make.com; sending raw data to Airtable for processing with a script; and, for very large datasets, uploading the JSON to Google Cloud Storage and attaching the file to an Airtable record.
  • Key recommendations were discussed to shift heavy data iteration from make.com (which bills per operation) to Airtable (which bills per automation run), with caveats regarding Airtable's script time limits and text field size.
  • Automation blueprints and further resources were mentioned as available in a community workspace.

Action Items

  • None identified with specific owners or due dates in this transcript.

Optimizing Make.com Operations by Shifting Data Processing

  • Reviewed the standard approach: make.com makes the API calls, then processes and loops over large datasets module by module, incurring high operation counts and costs.
  • Demonstrated that each module and each iteration in make.com increases operation usage, potentially amounting to thousands of operations for complex automations.
  • Outlined three data-processing patterns:
    1. Entire data processing in make.com (costly in operations).
    2. Raw API data uploaded to Airtable, with processing via Airtable script automations (sketched below).
    3. For very large datasets, the JSON is uploaded to cloud storage (Google Cloud Storage) and the file is attached in Airtable; processing is done via an Airtable script that downloads and parses the file.
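
A minimal sketch of pattern 2, written as an Airtable automation script (which provides the `input` and `base` globals): the raw API response is assumed to sit in a long text field on the triggering record, and the script fans it out into one record per item inside a single automation run. The table names, field names, and payload shape below are placeholders, not from the transcript.

```ts
// Pattern 2 sketch: parse raw API JSON stored in a long text field and
// fan it out into individual records, all inside one automation run.
// Table and field names below are assumptions; adjust them to your base.
let { recordId } = input.config(); // input variable mapped to the trigger record

let responses = base.getTable('API Responses');
let posts = base.getTable('Posts');

let query = await responses.selectRecordsAsync({ fields: ['Raw JSON'] });
let record = query.getRecord(recordId);

// The entire payload is parsed here instead of iterating module-by-module in make.com.
let items = JSON.parse(record.getCellValueAsString('Raw JSON'));

let toCreate = items.map((item) => ({
    fields: {
        'Title': item.title,      // assumed shape of the API payload
        'Likes': item.like_count, // assumed shape of the API payload
    },
}));

// createRecordsAsync accepts at most 50 records per call, so write in batches.
while (toCreate.length > 0) {
    await posts.createRecordsAsync(toCreate.splice(0, 50));
}
```

However many items the payload contains, the whole script still counts as a single automation run.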

Using Airtable Automations for Efficient Data Handling

  • Processing data via Airtable automations only incurs one automation run, regardless of the number of records or operations within the script (subject to the 30-second script execution time limit).
  • Post-processing such as capitalization can be performed efficiently inside an Airtable script instead of chaining extra modules in make.com.
  • When storing raw JSON directly in a long text field, Airtable's character limit on that field caps how much JSON a single record can hold.
  • For datasets that exceed this limit, uploading to cloud storage and linking via attachment in Airtable is a viable workaround.
  • Airtable automation triggers are conditioned on the HTTP status code and other fields so that only successful, not-yet-processed responses are handled (see the sketch below).
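
A sketch of pattern 3, assuming the automation triggers only when the status-code field indicates success and a "Processed" checkbox is still unchecked: the script downloads the attached JSON file, parses it, and flips the checkbox so the same record is never handled twice. Table and field names are illustrative.

```ts
// Pattern 3 sketch: the payload was too large for a long text field, so it was
// uploaded to cloud storage and attached to the record. Download the attachment,
// parse it, then mark the record as processed so the trigger does not fire again.
// Table and field names are assumptions.
let { recordId } = input.config();

let responses = base.getTable('API Responses');
let query = await responses.selectRecordsAsync({ fields: ['JSON File', 'Processed'] });
let record = query.getRecord(recordId);

let attachments = record.getCellValue('JSON File') || [];
if (attachments.length === 0) {
    throw new Error('No JSON file attached to this record');
}

// Attachment cell values expose a temporary download URL.
let response = await fetch(attachments[0].url);
let items = await response.json();
console.log(`Parsed ${items.length} items from the attached file`);

// ...create or update records from `items` here, as in the pattern 2 sketch...

// Flip the flag so the automation does not re-trigger on this record.
await responses.updateRecordAsync(recordId, { 'Processed': true });
```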

Scalability and Limitations

  • Airtable scripts have a 30-second execution limit, so large data volumes may need to be split across multiple automation runs (sketched after this list).
  • The file attachment method addresses the limitation of text field size when handling large API payloads.
  • Both Airtable approaches drastically reduce make.com operation usage and, with it, cost.
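
One way to work within the 30-second limit, as a sketch: each automation run processes a bounded batch, stops early when its time budget is spent, and flags finished records so the next run picks up the remainder. The batch size, time budget, and field names are assumptions to tune.

```ts
// Sketch of splitting work across automation runs: process a bounded batch per
// run, stop early if the time budget is spent, and flag finished records so a
// follow-up run handles the remainder. Names and limits below are assumptions.
const BATCH_SIZE = 200;        // assumed safe per-run volume
const TIME_BUDGET_MS = 25000;  // headroom under the 30-second script limit
let started = Date.now();

let posts = base.getTable('Posts');
let query = await posts.selectRecordsAsync({ fields: ['Processed', 'Title'] });
let pending = query.records.filter((r) => !r.getCellValue('Processed'));

let updates = [];
let processed = 0;
for (let record of pending.slice(0, BATCH_SIZE)) {
    if (Date.now() - started > TIME_BUDGET_MS) break;

    // Example post-processing: title-case the Title field.
    let title = record.getCellValueAsString('Title')
        .toLowerCase()
        .replace(/\b\w/g, (c) => c.toUpperCase());

    updates.push({ id: record.id, fields: { 'Title': title, 'Processed': true } });
    processed++;
}

// updateRecordsAsync accepts at most 50 records per call.
while (updates.length > 0) {
    await posts.updateRecordsAsync(updates.splice(0, 50));
}
console.log(`Processed ${processed} records; ${pending.length - processed} remain for the next run`);
```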

Resources and Next Steps

  • An importable make.com blueprint and Airtable base are available in the "No Code Architects" community.
  • Additional support, workshops, and automation libraries are accessible via the community for deeper learning and troubleshooting.

Decisions

  • Shift heavy data processing from make.com to Airtable automations — Rationale: saves thousands of operations (and dollars), as Airtable charges per automation run rather than per operation.

Open Questions / Follow-Ups

  • What is the exact character limit for Airtable's long text field when storing JSON?
  • Are there performance benchmarks for the largest volume of posts that can be processed within Airtable's 30-second script execution window?