Automatically scrape Facebook group content and store in Supabase

This Latenode automation template allows users to scrape posts, comments, and sub-comments from a Facebook group and automatically save the extracted data into a Supabase database.

The workflow leverages the Apify scraper to retrieve the Facebook group content, which is then structured and stored in Supabase, creating records for posts, comments, and sub-comments. This provides a way to gather user engagement data, archive discussions, and monitor community sentiment, making it useful for analysis, research, or customer feedback tracking.

Updated Apr 2, 2026 · Est. run: 9s · Est. cost: $0.0006
How Latenode estimates time and cost

Latenode bills workflow runs in credits: 1 credit = 30 seconds of processing. The minimum charge per run depends on your plan. Plug-and-Play (PnP) AI nodes are billed separately: each PnP token is $1 USD, charged pay-as-you-go at vendor cost plus a small processing fee, with no API keys required.
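
As a back-of-envelope check, the estimate shown above is consistent with fractional credit billing. Here is a minimal sketch, assuming credits are billed fractionally and one credit costs about $0.002; both figures are inferred from this page's 9s / $0.0006 estimate, not confirmed pricing, and real runs are also subject to the plan-dependent minimum charge:

```typescript
// Assumptions: credits bill fractionally and one credit costs ~$0.002.
// Both are inferred from this page's estimate (9 s -> $0.0006), not official pricing.
const SECONDS_PER_CREDIT = 30;

function estimateRunCost(runSeconds: number, pricePerCredit = 0.002): number {
  const credits = runSeconds / SECONDS_PER_CREDIT; // 9 s -> 0.3 credits
  return credits * pricePerCredit;                 // 0.3 x $0.002 = $0.0006
}

console.log(estimateRunCost(9).toFixed(4)); // "0.0006", matching the estimate above
```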

Full pricing — how credits work →
Category: Scraping & data collection

Workflow preview

What this template does

  • Extracts posts, comments, and sub-comments from a Facebook group
  • Normalizes the extracted data into a structured format
  • Stores the normalized data in a Supabase database
  • Enables monitoring of community engagement and sentiment
  • Creates a comprehensive archive of group discussions

How it works

1. Trigger: Fetch Facebook Group
Retrieve posts from the target Facebook group using the Facebook integration.

2. Action: Scrape Page Data
Use a headless browser to extract post, comment, and sub-comment data from the Facebook group.
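
Inside Latenode this step is handled by the Apify scraper node, but for context, here is roughly what the underlying call looks like against Apify's REST API. A minimal sketch: the actor ID and the startUrls input field are assumptions (check your actor's input schema), while run-sync-get-dataset-items is Apify's standard endpoint for running an actor and returning its dataset in one request:

```typescript
// Run an Apify actor synchronously and return its dataset items.
// ACTOR_ID and the input shape are placeholders; consult your actor's docs.
const APIFY_TOKEN = process.env.APIFY_TOKEN!;
const ACTOR_ID = "username~facebook-group-scraper"; // hypothetical actor ID

async function scrapeGroup(groupUrl: string): Promise<unknown[]> {
  const res = await fetch(
    `https://api.apify.com/v2/acts/${ACTOR_ID}/run-sync-get-dataset-items?token=${APIFY_TOKEN}`,
    {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ startUrls: [{ url: groupUrl }] }), // assumed input field
    }
  );
  if (!res.ok) throw new Error(`Apify run failed: HTTP ${res.status}`);
  return res.json(); // item shape depends on the actor you configured
}
```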

3. Logic: Structure Data
Transform the scraped data into a consistent format, preparing it for storage in the Supabase database.
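
The exact transformation depends on your actor's output, but the core of this step is flattening one nested item per post into three row sets. A sketch with illustrative field names (id, text, comments, and replies are assumptions about the scraper output, not a documented shape):

```typescript
// Illustrative input shape; adapt the field names to your actor's real output.
interface ScrapedComment { id: string; text: string; replies?: ScrapedComment[] }
interface ScrapedPost { id: string; text: string; comments?: ScrapedComment[] }

type Row = Record<string, string>;

function normalize(posts: ScrapedPost[]) {
  const postRows: Row[] = [];
  const commentRows: Row[] = [];
  const subCommentRows: Row[] = [];
  for (const post of posts) {
    postRows.push({ post_id: post.id, body: post.text });
    for (const comment of post.comments ?? []) {
      commentRows.push({ comment_id: comment.id, post_id: post.id, body: comment.text });
      for (const reply of comment.replies ?? []) {
        subCommentRows.push({ sub_comment_id: reply.id, comment_id: comment.id, body: reply.text });
      }
    }
  }
  return { postRows, commentRows, subCommentRows };
}
```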

4. Action: Store in Supabase
Save the structured data to the Supabase database, creating records for posts, comments, and sub-comments.
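
Under the hood this amounts to three inserts with the supabase-js client. A minimal sketch, assuming tables named posts, comments, and sub_comments; use whatever schema you actually created in your project:

```typescript
import { createClient } from "@supabase/supabase-js";

// Table names (posts, comments, sub_comments) are assumptions; match them to
// the schema you created in your Supabase project.
const supabase = createClient(process.env.SUPABASE_URL!, process.env.SUPABASE_KEY!);

type Row = Record<string, unknown>;

async function insertOrThrow(table: string, rows: Row[]): Promise<void> {
  if (rows.length === 0) return;
  const { error } = await supabase.from(table).insert(rows);
  if (error) throw new Error(`Insert into ${table} failed: ${error.message}`);
}

async function store(postRows: Row[], commentRows: Row[], subCommentRows: Row[]) {
  // Insert parents first so any foreign keys on child rows resolve.
  await insertOrThrow("posts", postRows);
  await insertOrThrow("comments", commentRows);
  await insertOrThrow("sub_comments", subCommentRows);
}
```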

Setup guide

1. Add Facebook credential in Latenode Credentials
Add your Facebook App ID and App Secret as a credential in the Latenode Credentials panel. This will allow the workflow to authenticate and access the Facebook group.

2. Configure Apify credential in Latenode Credentials
Add your Apify API key as a credential in the Latenode Credentials panel. This will enable the Apify scraper node to fetch data from the Facebook group.

3. Configure Supabase credential in Latenode Credentials
Add your Supabase API key and database URL as a credential in the Latenode Credentials panel. This will allow the workflow to store the scraped data in your Supabase database.

4. Set up Facebook group URL in Latenode node
In the Latenode builder, enter the URL of the Facebook group you want to scrape in the Apify scraper node settings.

5. Customize data mapping in Latenode Supabase node
In the Latenode builder, map the fields from the Apify scraper output to the corresponding tables and columns in your Supabase database using the Supabase node settings.
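
It can help to write down the target row shapes before wiring up the mapping. Here is a sketch of illustrative row types for the three tables; every table and column name is an assumption, so mirror the columns you actually created:

```typescript
// Illustrative Supabase row shapes; all table and column names are assumptions
// and should mirror the schema in your own project.
interface PostRow {
  post_id: string;        // maps from the scraper's post identifier
  author: string | null;
  body: string;
  posted_at: string;      // ISO 8601 timestamp
}

interface CommentRow {
  comment_id: string;
  post_id: string;        // references posts.post_id
  author: string | null;
  body: string;
}

interface SubCommentRow {
  sub_comment_id: string;
  comment_id: string;     // references comments.comment_id
  author: string | null;
  body: string;
}
```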

Requirements

  • Set up a Supabase account and create a new database project
  • Obtain the Supabase connection details (URL, API key) and store them as Latenode workspace secrets
  • Configure a Facebook group you have access to and obtain the group URL
  • Authenticate your Latenode workspace with the Apify platform and create an Apify actor for scraping the Facebook group
  • Configure the Apify actor to scrape posts, comments, and sub-comments from the target Facebook group
  • Map the scraped data to Supabase table schemas for posts, comments, and sub-comments
  • Set up the Latenode workflow to execute the Apify actor, transform the data, and save it to the Supabase database

FAQ

Common questions about this template

How is this template billed?
Each run uses credits on your Latenode plan. We charge for processing time (1 credit = 30 seconds). Your actual cost depends on your plan and how long the run takes. See the pricing page for plan details and how credits work.

More templates

You might also like

Browse all templates →
Scraping & data collection

Scrape and export Google Maps business data to Google Sheets

This automation allows users to efficiently scrape business data from Google Maps, including names, contact details, and reviews, and export the structured information into a spreadsheet or database for lead generation and market analysis. The workflow triggers manually or on a scheduled basis, fetching search results from Google Maps using the SerpAPI service, deduplicating the data, and writing it to a Google Sheet. This streamlines the process of gathering valuable business intelligence from Google Maps, empowering users to leverage this data for strategic decision-making and sales prospecting.

Est. run: 11s · Est. cost: $0.0007
Scraping & data collection

Scrape Zillow property data and sync to Google Sheets automatically

This Latenode automation extracts real estate listing details from Zillow and automatically populates a Google Sheets spreadsheet with the property data. It leverages the Scrape.do web scraping API to bypass anti-bot protections and fetch the full HTML of Zillow listings, then parses key information like price, address, days on Zillow, and Zestimate, and saves the structured results into a Google Sheet. This solution is designed for real estate professionals, investors, and market analysts who need to collect property data at scale without manual effort, enabling better market research, portfolio tracking, and lead generation.

Est. run: 9s · Est. cost: $0.0006
Scraping & data collection

Automate Instagram profile data export to Google Sheets

This automation template allows users to scrape comprehensive Instagram profile data using the Apify platform, and automatically export the results into a Google Sheets spreadsheet for analysis. It is designed to run on a schedule, processing a list of usernames by calling the Apify API, appending the scraped data to a Google Sheet, and marking the usernames as processed. The automation integrates with Google Sheets and the Apify actor, providing a streamlined way to gather and organize Instagram profile information without manual intervention.

Est. run: 15s · Est. cost: $0.0009