Creating a scraper for expireddomains.net is a bit more complex than for a standard site because the URL you provided sits behind a member login (/member/). A simple file_get_contents() call will not work: the server will redirect the request to a login page.
To make this work in PHP, we need to use cURL to handle session cookies and headers, and DOMDocument/XPath to parse the data.
Important Considerations:
- Terms of Service: ExpiredDomains.net generally prohibits automated scraping. Use this responsibly and check their API if you plan to do this frequently.
- Authentication: You will need to export your session cookies from your browser and provide them to the script, or the script will only see the login page.
Here is a functional PHP template for this scraper.
<?php
/**
 * Web Scraper for Expired Domains
 * Note: Requires your session cookie to access member-only pages.
 */
// 1. The URL you provided
$url = 'https://member.expireddomains.net/domains/combinedexpired/?savedsearch_id=588461&flast12=1&flimit=200&fbl=5&fblm=44444&fnocctlds=1&ftldsallow=.xyz+.club+.website+.shop+.store+.site%2C+.online%2C+.tech%2C+.space%2C+.fun&fwhois=22&o=bl&r=d&fsa=1767624765';
// 2. Setup cURL
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
// It is essential to mimic a real browser header
curl_setopt($ch, CURLOPT_USERAGENT, 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0.0.0 Safari/537.36');
// PASTE YOUR SESSION COOKIE HERE
// You can find this in your browser: F12 -> Application -> Cookies -> sessionid
$cookie = 'sessionid=YOUR_ACTUAL_SESSION_ID_HERE;';
curl_setopt($ch, CURLOPT_COOKIE, $cookie);
$html = curl_exec($ch);
if (curl_errno($ch)) {
echo 'Error: ' . curl_error($ch);
die();
}
curl_close($ch);
// 3. Parse the HTML
$dom = new DOMDocument();
@$dom->loadHTML($html);
$xpath = new DOMXPath($dom);
// ExpiredDomains typically uses a table with class 'base1'
// We look for the domain names, which are usually in an <a> tag inside the table
$domains = $xpath->query("//table[contains(@class, 'base1')]//tr/td[1]/a");
$list = [];
foreach ($domains as $node) {
// Clean up the text
$domainName = trim($node->nodeValue);
if (!empty($domainName)) {
$list[] = $domainName;
}
}
// 4. Output the results
// Following your preference: no bullets or numbers, just new lines.
if (!empty($list)) {
echo implode("\n", $list);
} else {
echo "No domains found. Check your session cookie/login status.";
}
?>
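Before pointing the scraper at the live site, you can verify the XPath offline against a small hypothetical sample of the table markup. The `base1` class and first-column layout here are assumptions mirroring what the script above targets, not the site's guaranteed structure:

```php
<?php
// Hypothetical sample of the table markup the scraper expects.
$sampleHtml = '<table class="base1">'
            . '<tr><td><a>example.xyz</a></td><td>5</td></tr>'
            . '<tr><td><a>another.club</a></td><td>3</td></tr>'
            . '</table>';

$dom = new DOMDocument();
@$dom->loadHTML($sampleHtml); // suppress warnings for fragment HTML

$xpath = new DOMXPath($dom);
// Same query as the live scraper: first cell's link in each row.
$nodes = $xpath->query("//table[contains(@class, 'base1')]//tr/td[1]/a");

$list = [];
foreach ($nodes as $node) {
    $list[] = trim($node->nodeValue);
}

echo implode("\n", $list);
```

If this prints the two sample domains one per line, the parsing half of the scraper is sound; any empty result on the live site then points to the cookie/login step rather than the XPath.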
How to use this for your PHP web pages:
- Get your Session ID: Login to ExpiredDomains in Chrome/Edge. Press F12, go to the Application tab, click Cookies, and copy the value of the sessionid. Paste it into the $cookie variable in the script.
- HTML/PHP Integration: Since you create informational web pages, you can wrap this logic in a simple container.
- Formatting: I have set the output to implode("\n", $list), which ensures that the domain names are listed one per line without any bullet points or numbers, as per your formatting preference.
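As a sketch of that integration, assuming the scraper logic above has already populated $list, you could render the results inside a simple page like this (the title and heading are placeholders):

```php
<?php
// Placeholder data; in practice $list comes from the scraper above.
$list = ['example.xyz', 'another.club'];
?>
<!DOCTYPE html>
<html>
<head><title>Expired Domains</title></head>
<body>
  <h1>Latest Expired Domains</h1>
  <!-- <pre> preserves the one-domain-per-line formatting in HTML -->
  <pre><?php echo htmlspecialchars(implode("\n", $list)); ?></pre>
</body>
</html>
```

Escaping with htmlspecialchars() matters even for domain names, since scraped text should never be echoed into a page raw.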
If they block cURL:
ExpiredDomains often uses Cloudflare protection. If the script above returns a 403 error, you may need to use a tool like Puppeteer or Selenium (via a PHP wrapper like symfony/panther) because they can handle the JavaScript challenges that simple cURL cannot.
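A minimal sketch of the Panther route, assuming symfony/panther has been installed via Composer and a chromedriver binary is available locally; the CSS selector mirrors the same assumed table structure as the cURL version:

```php
<?php
// Sketch only: requires `composer require symfony/panther` and chromedriver.
require __DIR__ . '/vendor/autoload.php';

use Symfony\Component\Panther\Client;

// Panther drives a real Chrome instance, so Cloudflare's JavaScript
// challenge executes as it would in a normal browser session.
$client  = Client::createChromeClient();
$crawler = $client->request('GET', 'https://member.expireddomains.net/domains/combinedexpired/');

// CSS equivalent of the XPath used earlier: first cell's link per row.
$domains = $crawler->filter('table.base1 tr td:first-child a')
                   ->each(fn ($node) => trim($node->text()));

echo implode("\n", array_filter($domains));
```

Note that you would still need to be logged in; Panther keeps cookies within its browser session, so performing the login via $client->request() and a form submit first is the usual approach.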