Latenode

Automatically scrape and store Booking.com hotel data

This workflow automates the search and extraction of hotel data from Booking.com, triggered by a chat message.

It uses a combination of web scraping with Bright Data's Web Scraper and AI-powered data processing with OpenRouter to deliver a concise, human-friendly list of hotels, including the title, address, original price, and final price. The final output is a clean and formatted report, making it a valuable tool for travelers, event planners, and business professionals who need to quickly find accommodation.

Updated Apr 12, 2026 · Est. run: 26s · Est. cost: $0.0703
How Latenode estimates time and cost

Latenode bills workflow runs in credits: 1 credit = 30 seconds of processing. Minimum charge per run depends on your plan. Plug-and-Play (PnP) AI nodes are billed separately—each PnP token is $1 USD, charged pay-as-you-go at vendor cost plus a small processing fee, with no API keys required.

Full pricing — how credits work →
Scraping & data collection

Workflow preview

What this template does

  • Extracts hotel names and pricing data from Booking.com
  • Normalizes the extracted data into a structured format
  • Stores the processed data in a database or spreadsheet
  • Filters the data based on user-defined criteria
  • Generates a formatted report with the processed hotel information

How it works

1
Trigger

Chat Trigger

The workflow is triggered by a user chat message, initiating the search and extraction of hotel data from Booking.com.

2
Logic

Process Chat Request

The user's chat message is analyzed by OpenRouter's AI to understand the intent behind the request, such as finding hotel options for a specific location.
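Inside a JavaScript step, the model's reply can be parsed defensively before it drives the scrape. A minimal sketch, assuming the prompt asks the model to return JSON with `location`, `check_in`, and `check_out` fields (these names are illustrative, not part of the template):

```javascript
// Parses the model's structured reply into search parameters, with a
// fallback when the output is not valid JSON. The expected shape
// ({ location, check_in, check_out }) is an assumption about the prompt.
function parseIntent(aiReply) {
  try {
    const parsed = JSON.parse(aiReply);
    return {
      location: parsed.location || null,
      checkIn: parsed.check_in || null,
      checkOut: parsed.check_out || null,
    };
  } catch {
    // Model returned free text instead of JSON -- signal "no intent parsed".
    return { location: null, checkIn: null, checkOut: null };
  }
}
```

Downstream steps can then branch on `location === null` and ask the user to rephrase instead of launching a scrape with empty parameters.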

3
Action

Scrape Hotel Listings

Bright Data's Web Scraper is used to initiate a batch scrape on Booking.com, targeting hotel listings for the location specified in the user's request.

4
Logic

Wait for Batch Scrape

The system monitors the status of the batch scrape on Booking.com until the process is completed, ensuring all relevant hotel data has been collected.
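The wait step amounts to a polling loop. A minimal sketch of that pattern, written generically so the status check (for example, a call to Bright Data's snapshot-status endpoint) is passed in as a function; the interval and attempt limits are illustrative defaults, not the template's actual settings:

```javascript
// Generic polling helper: repeatedly calls `checkStatus` until it reports
// 'ready' or 'failed', or the attempt budget runs out.
async function pollUntilReady(checkStatus, { intervalMs = 5000, maxAttempts = 60 } = {}) {
  for (let attempt = 1; attempt <= maxAttempts; attempt++) {
    const status = await checkStatus();
    if (status === 'ready') return { done: true, attempts: attempt };
    if (status === 'failed') throw new Error('Batch scrape failed');
    // Still running -- wait before the next status check.
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
  return { done: false, attempts: maxAttempts };
}
```

Capping attempts matters here: an abandoned scrape would otherwise keep the run alive and burn credits at 1 credit per 30 seconds.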

5
Action

Download Scrape Results

Once the batch scrape is finished, the system fetches the snapshot data containing the hotel listings from Bright Data.

6
Action

Clean & Format Data

JavaScript code is used to extract the key information from the hotel listings, such as the title, address, original price, and final price.
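A sketch of what that JavaScript step might look like, assuming the snapshot records expose `title`, `address`, `original_price`, and `final_price` fields (adjust the names to your dataset's actual schema):

```javascript
// Extracts the fields the report needs from raw hotel records and
// normalizes them. Field names are assumptions about the snapshot schema.
function cleanListings(rawListings) {
  return rawListings
    .filter((item) => item && item.title) // drop malformed records
    .map((item) => ({
      title: item.title.trim(),
      address: (item.address || 'Address not available').trim(),
      originalPrice: item.original_price ?? null,
      finalPrice: item.final_price ?? null,
    }));
}
```

Keeping this step pure (input array in, cleaned array out) makes it easy to test outside the workflow and to swap in a different scrape source later.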

7
AI

Summarize Hotel Data

OpenRouter's AI is leveraged to generate a human-friendly summary of the hotel listings, presenting the information in a clear and concise format.

8
Action

Store Hotel Data

The formatted and summarized hotel listings are saved to a database using Supabase, making the information easily accessible for future reference.
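The mapping into table rows can be sketched as a pure function. The column names below (`search_location`, `scraped_at`, etc.) are illustrative; align them with your own Supabase schema:

```javascript
// Maps cleaned listings to insert-ready rows for a hypothetical `hotels`
// table. Column names here are placeholders, not the template's schema.
function toSupabaseRows(listings, searchLocation) {
  const scrapedAt = new Date().toISOString();
  return listings.map((listing) => ({
    title: listing.title,
    address: listing.address,
    original_price: listing.originalPrice,
    final_price: listing.finalPrice,
    search_location: searchLocation,
    scraped_at: scrapedAt, // same timestamp for the whole batch
  }));
}
```

Stamping every row with the search location and scrape time lets you keep multiple searches in one table and query the most recent results per location.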

Setup guide

1

Add Bright Data credential

1. In the Latenode Credentials panel, add a new credential for Bright Data. Enter your Bright Data API key.

2

Add OpenRouter credential

1. In the Latenode Credentials panel, add a new credential for OpenRouter. Enter your OpenRouter API key.

3

Configure Bright Data Web Scraper node

1. Add a Bright Data Web Scraper node to your workflow.
2. In the node settings, select the Bright Data credential you created earlier.
3. Configure the web scraping parameters, such as the Booking.com URL, search filters, and other details.

4

Configure OpenRouter LLM Chat node

1. Add an OpenRouter LLM Chat node to your workflow.
2. In the node settings, select the OpenRouter credential you created earlier.
3. Map the input parameters, such as the chat message, to the node's input fields.

5

Configure Supabase node (optional)

1. If you want to store the extracted hotel data in a Supabase database, add a Supabase node to your workflow.
2. In the node settings, add your Supabase connection details, including the project URL and API key from the Latenode Secrets panel.
3. Map the hotel data to the appropriate Supabase table columns.

Requirements

  • Bright Data Web Scraper account with API access
  • OpenRouter account with API access
  • Supabase account with database access (optional; only needed to store results)
  • Ability to configure the Bright Data web scraper to crawl Booking.com hotel listings
  • For providers without a native connector in Latenode, use the JavaScript step with that service's API credentials (stored in Latenode Keys / Secrets).
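For example, such a JavaScript step typically builds an authorized request and sends it with `fetch`. A minimal sketch, where the endpoint, payload shape, and secret name are all placeholders for whichever provider you integrate:

```javascript
// Builds request options for an arbitrary provider's REST API.
// Keeping this pure makes it easy to inspect and reuse; the payload
// shape and bearer-token auth scheme are assumptions about the provider.
function buildAuthorizedRequest(apiKey, payload) {
  return {
    method: 'POST',
    headers: {
      Authorization: `Bearer ${apiKey}`,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify(payload),
  };
}

// In the workflow step you would then send it, e.g.:
// const res = await fetch('https://api.example.com/v1/search',
//   buildAuthorizedRequest(secrets.PROVIDER_API_KEY, { query: 'hotels in Paris' }));
```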

FAQ

Common questions about this template

How much does a run cost?

Each run uses credits on your Latenode plan. We charge for processing time (1 credit = 30 seconds), so your actual cost depends on your plan and how long the run takes. See pricing plans for details on how credits work.

More templates

You might also like

Browse all templates →
Scraping & data collection

Automatically Sync Google Maps Business Data to a Spreadsheet

This automation workflow allows users to efficiently scrape business data from Google Maps, including names, contact details, and reviews, and export the structured information into a spreadsheet or database. The workflow uses the SerpAPI service to retrieve Google Maps search results, which are then transformed and appended to a Google Sheet. This enables users to generate leads, conduct market analysis, and gain valuable insights from the collected data in a cost-effective and scalable manner.

26s · $0.0703
Scraping & data collection

Scrape Zillow property data and sync to Google Sheets automatically

This Latenode automation extracts real estate listing details from Zillow and automatically populates a Google Sheets spreadsheet with the property data. It leverages the Scrape.do web scraping API to bypass anti-bot protections and fetch the full HTML of Zillow listings, then parses key information like price, address, days on Zillow, and Zestimate, and saves the structured results into a Google Sheet. This solution is designed for real estate professionals, investors, and market analysts who need to collect property data at scale without manual effort, enabling better market research, portfolio tracking, and lead generation.

9s · $0.0006
Scraping & data collection

Scrape and sync Instagram profiles to Google Sheets

This automation template allows users to scrape comprehensive data from Instagram profiles and export the results into a Google Sheets spreadsheet for analysis. It utilizes the Apify web scraping tool to fetch the full profile details for a list of usernames, which are stored in a Google Sheet. The workflow runs on a scheduled basis, retrieving the unscrapped usernames, processing them in batches, and appending the scraped data to the Google Sheet. This streamlines the data collection process, enabling users to easily analyze and gain insights from their Instagram audience.

26s · $0.0703