
Automate Facebook group data collection to Supabase

This automation allows users to scrape posts, comments, and sub-comments from a Facebook group and automatically save the extracted data to a Supabase database.

This can be used to gather user engagement data for analysis, archive posts and comments for research, or monitor community sentiment by collecting feedback across discussions. The workflow is triggered by a form submission, where the user enters a Facebook group URL and the number of posts to scrape. It then uses Apify to scrape the specified posts, comments, and sub-comments, and stores the data in Supabase tables for further processing and analysis.

Updated Apr 6, 2026 · Est. run: 26s · Est. cost: $0.0703
How Latenode estimates time and cost

Latenode bills workflow runs in credits: 1 credit = 30 seconds of processing. Minimum charge per run depends on your plan. Plug-and-Play (PnP) AI nodes are billed separately—each PnP token is $1 USD, charged pay-as-you-go at vendor cost plus a small processing fee, with no API keys required.

Full pricing — how credits work →
Scraping & data collection


What this template does

  • Extracts posts, comments, and sub-comments from a Facebook group
  • Normalizes the extracted data and stores it in a Supabase database
  • Filters and deduplicates the data to provide clean, analysis-ready datasets
  • Triggers the workflow using a form where users input a Facebook group URL and post count
  • Populates Supabase tables (posts, comments, and replies) with the scraped Facebook data

How it works

1
Trigger

Trigger: User submits form

The automation is triggered when a user submits a form, providing the Facebook group URL and the number of posts to scrape.
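
For reference, the values collected by the form can be modeled roughly like the sketch below; the field names and validation are illustrative rather than part of the template.

```typescript
// Illustrative shape of the form trigger payload. The field names are
// assumptions — use whatever names you give the fields in the Form Trigger node.
interface ScrapeRequest {
  groupUrl: string;  // e.g. "https://www.facebook.com/groups/your-group"
  postCount: number; // how many posts to scrape
}

// Minimal validation before handing the values to the scraping step.
function validateRequest(input: ScrapeRequest): ScrapeRequest {
  if (!/^https:\/\/(www\.)?facebook\.com\/groups\//.test(input.groupUrl)) {
    throw new Error("Expected a Facebook group URL");
  }
  if (!Number.isInteger(input.postCount) || input.postCount < 1) {
    throw new Error("Post count must be a positive integer");
  }
  return input;
}
```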

2
Action

Configure Facebook group URL

The submitted Facebook group URL and post count are configured as input for the scraper, which fetches that number of posts from the group.

3
Action

Scrape Facebook group posts

The automation uses Apify to scrape the specified number of posts from the Facebook group.
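
If you were to call Apify from a JavaScript/TypeScript code node instead of the dedicated Apify node, the equivalent request would look roughly like this. The actor ID, input fields, and environment variable name are assumptions — check the actor you select for its actual input schema.

```typescript
import { ApifyClient } from "apify-client";

// groupUrl and postCount come from the form trigger (step 1).
const groupUrl = "https://www.facebook.com/groups/example";
const postCount = 20;

const client = new ApifyClient({ token: process.env.APIFY_TOKEN });

// The actor ID and input fields below are placeholders — use the Facebook
// posts scraper you selected and its documented input schema.
const run = await client.actor("apify/facebook-posts-scraper").call({
  startUrls: [{ url: groupUrl }],
  resultsLimit: postCount,
});

// The scraped posts land in the run's default dataset.
const { items: rawPosts } = await client
  .dataset(run.defaultDatasetId)
  .listItems();
```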

4
Action

Extract post data

The relevant fields are extracted from the scraped posts and transformed into the row format used by the Supabase 'posts' table.
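
A sketch of that transformation, assuming the column names used by this template's 'posts' table and guessing at the raw field names returned by the actor:

```typescript
// Row shape for the 'posts' table used by this template.
interface PostRow {
  post_id: string;
  group_url: string;
  text: string | null;
  author: string | null;
  posted_at: string | null; // ISO timestamp
  comments_count: number;
  post_url: string | null;
}

// rawPosts and groupUrl come from the previous step. The raw field names
// (postId, text, user, time, url, commentsCount) are assumptions about the
// actor's output — adjust them to whatever your actor actually returns.
const postRows: PostRow[] = rawPosts.map((p: any) => ({
  post_id: String(p.postId ?? p.id),
  group_url: groupUrl,
  text: p.text ?? null,
  author: p.user?.name ?? null,
  posted_at: p.time ?? null,
  comments_count: Number(p.commentsCount ?? 0),
  post_url: p.url ?? null,
}));
```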

5
Action

Save post data to Supabase

The extracted post data is saved to the 'posts' table in the Supabase database.
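
In code, the save step amounts to an upsert with the Supabase client. Deduplication on re-runs works if the table has a unique constraint on the post ID column; the column and key names below are assumptions.

```typescript
import { createClient } from "@supabase/supabase-js";

const supabase = createClient(
  process.env.SUPABASE_URL!,
  process.env.SUPABASE_SERVICE_ROLE_KEY!, // or the anon key, depending on your RLS setup
);

// Upserting on post_id makes re-runs idempotent instead of creating duplicate
// rows — this assumes a unique constraint on posts.post_id.
const { error } = await supabase
  .from("posts")
  .upsert(postRows, { onConflict: "post_id" });

if (error) throw new Error(`Saving posts failed: ${error.message}`);
```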

6
Logic

For each post from previous step

The automation iterates over each post returned by the previous step.

7
Logic

Post has comments?

If the post has comments, the automation proceeds to scrape the comments.

8
Action

Scrape post comments

The automation uses Apify to scrape the comments for the current post.
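
Expressed in code, the per-post loop might look like the sketch below; again, the actor ID and input fields are placeholders for whichever comments scraper you configure.

```typescript
// For each post that has comments, run a comments scraper against the post URL.
// client and postRows come from the earlier sketches.
const commentsByPost: Record<string, unknown[]> = {};

for (const post of postRows.filter((p) => p.comments_count > 0 && p.post_url)) {
  const run = await client.actor("apify/facebook-comments-scraper").call({
    startUrls: [{ url: post.post_url }],
    resultsLimit: 100, // illustrative cap per post
  });
  const { items } = await client.dataset(run.defaultDatasetId).listItems();
  commentsByPost[post.post_id] = items;
}
```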

9
Action

Extract comment data

The relevant fields are extracted from the scraped comments and transformed into the row format used by the Supabase 'comments' table.

10
Action

Save comment data to Supabase

The extracted comment data is saved to the 'comments' table in the Supabase database.
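
A sketch of the comment rows, assuming a 'comments' table that stores the parent post ID next to each comment; the raw field names are guesses.

```typescript
// Each comment row keeps a reference back to its parent post, which is what
// lets the 'comments' table join cleanly onto 'posts'. The raw field names
// (commentId, text, author, date, repliesCount) are assumptions.
const commentRows = Object.entries(commentsByPost).flatMap(([postId, items]) =>
  items.map((c: any) => ({
    comment_id: String(c.commentId ?? c.id),
    post_id: postId,
    text: c.text ?? null,
    author: c.author?.name ?? null,
    commented_at: c.date ?? null,
    replies_count: Number(c.repliesCount ?? 0),
  })),
);

const { error: commentError } = await supabase
  .from("comments")
  .upsert(commentRows, { onConflict: "comment_id" });
if (commentError) throw new Error(`Saving comments failed: ${commentError.message}`);
```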

11
Logic

For each comment from previous step

The automation iterates over each comment from the previous step and checks whether it has any sub-comments.

12
Action

Scrape comment sub-comments

The automation uses Apify to scrape the sub-comments for the current comment.

13
Action

Extract sub-comment data

The relevant fields are extracted from the scraped sub-comments and transformed into the row format used by the Supabase 'replies' table.

14
Action

Save sub-comment data to Supabase

The extracted sub-comment data is saved to the 'replies' table in the Supabase database.
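
The replies step mirrors the comments step one level down. A condensed sketch, with the same caveats about assumed field and column names:

```typescript
// repliesByComment is built the same way as commentsByPost in step 8, just one
// level down (comment URL -> scraped sub-comments). The three tables therefore
// form a posts -> comments -> replies chain.
const replyRows = Object.entries(repliesByComment).flatMap(([commentId, items]) =>
  items.map((r: any) => ({
    reply_id: String(r.commentId ?? r.id), // raw field names are assumptions
    comment_id: commentId,                 // reference to the parent comment
    text: r.text ?? null,
    author: r.author?.name ?? null,
    replied_at: r.date ?? null,
  })),
);

const { error: replyError } = await supabase
  .from("replies")
  .upsert(replyRows, { onConflict: "reply_id" });
if (replyError) throw new Error(`Saving replies failed: ${replyError.message}`);
```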

Setup guide

1

Add Apify credential

1. In the Latenode Credentials panel, add a new credential for Apify.
2. Enter your Apify API key.

2

Configure Apify node

1. In the Latenode visual builder, add an Apify node.
2. Select the Apify credential you created earlier.
3. Configure the Apify actor, the Facebook group URL input, and the number of posts to scrape.

3

Add Supabase credential

1. In the Latenode Credentials panel, add a new credential for Supabase.
2. Enter your Supabase project URL and API key.

4

Configure Supabase nodes

1. In the Latenode visual builder, add a Supabase node.
2. Select the Supabase credential you created earlier.
3. Configure the Supabase database connection and table names for posts, comments, and replies.
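
Once the credential and table names are in place, a quick sanity check from a code node (or a local script) can confirm that all three tables are reachable before the first real run; the environment variable names are assumptions.

```typescript
import { createClient } from "@supabase/supabase-js";

// Environment variable names are assumptions — use whatever your credential
// setup provides.
const supabase = createClient(
  process.env.SUPABASE_URL!,
  process.env.SUPABASE_ANON_KEY!,
);

// Select a single row from each table the workflow writes to; a missing table
// or a bad key shows up here before the first real run.
for (const table of ["posts", "comments", "replies"]) {
  const { error } = await supabase.from(table).select("*").limit(1);
  if (error) console.error(`Table "${table}" check failed: ${error.message}`);
  else console.log(`Table "${table}" is reachable.`);
}
```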

5

Add form trigger

1. In the Latenode visual builder, add a Form Trigger node.
2. Configure the form fields for the Facebook group URL and the number of posts to scrape.

Requirements

Apify account with API key
Supabase account with API key and database access
A target Facebook group URL that the Apify scraper is allowed to crawl
Latenode workspace with Facebook scraper and Supabase integration nodes

FAQ

Common questions about this template

Each run uses credits on your Latenode plan. We charge for processing time (1 credit = 30 seconds). Your actual cost depends on your plan and how long the run takes. See the pricing page for plan details and how credits work.

More templates

You might also like

Browse all templates →
Scraping & data collection

Automatically Sync Google Maps Business Data to a Spreadsheet

This automation workflow allows users to efficiently scrape business data from Google Maps, including names, contact details, and reviews, and export the structured information into a spreadsheet or database. The workflow uses the SerpAPI service to retrieve Google Maps search results, which are then transformed and appended to a Google Sheet. This enables users to generate leads, conduct market analysis, and gain valuable insights from the collected data in a cost-effective and scalable manner.

Est. run: 26s · Est. cost: $0.0703
Scraping & data collection

Scrape Zillow property data and sync to Google Sheets automatically

This Latenode automation extracts real estate listing details from Zillow and automatically populates a Google Sheets spreadsheet with the property data. It leverages the Scrape.do web scraping API to bypass anti-bot protections and fetch the full HTML of Zillow listings, then parses key information like price, address, days on Zillow, and Zestimate, and saves the structured results into a Google Sheet. This solution is designed for real estate professionals, investors, and market analysts who need to collect property data at scale without manual effort, enabling better market research, portfolio tracking, and lead generation.

Est. run: 9s · Est. cost: $0.0006
Scraping & data collection

Automatically scrape and store Booking.com hotel data

This workflow automates the search and extraction of hotel data from Booking.com, triggered by a chat message. It uses a combination of web scraping with Bright Data's Web Scraper and AI-powered data processing with OpenRouter to deliver a concise, human-friendly list of hotels, including the title, address, original price, and final price. The final output is a clean and formatted report, making it a valuable tool for travelers, event planners, and business professionals who need to quickly find accommodation.

Est. run: 26s · Est. cost: $0.0703