For most websites, your first request will almost always succeed, but it is inevitable that some requests will fail. For those failed requests, the API returns a 500 status code and does not charge you for them.
In that case, we can make our code retry the request until it succeeds or reaches a maximum number of retries that we set:
<?php
// Get cURL resource
$ch = curl_init();
// Set base URL & API key
$BASE_URL = "https://app.scrapingbee.com/api/v1/?";
$API_KEY = "YOUR-API-KEY";
// Set the maximum number of retries
$MAX_RETRIES = 5;
// Set request parameters
$parameters = array(
    'api_key' => $API_KEY,
    'url' => 'https://www.scrapingbee.com' // The URL to scrape
);
// Build the URL query string
$query = http_build_query($parameters);
// Set the URL for cURL
curl_setopt($ch, CURLOPT_URL, $BASE_URL . $query);
// Set the HTTP method
curl_setopt($ch, CURLOPT_CUSTOMREQUEST, 'GET');
// Return the response as a string instead of printing it
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
for ($i = 0; $i < $MAX_RETRIES; $i++) {
    // Send the request and save the response to $response
    $response = curl_exec($ch);
    // Stop on a transport-level error (DNS failure, timeout, etc.);
    // compare strictly against false so an empty body is not mistaken for an error
    if ($response === false) {
        die('Error: "' . curl_error($ch) . '" - Code: ' . curl_errno($ch));
    }
    $status_code = curl_getinfo($ch, CURLINFO_HTTP_CODE);
    echo 'HTTP Status Code: ' . $status_code . PHP_EOL;
    // A 200 (success) or 404 (page not found) is final: stop retrying
    if (in_array($status_code, array(200, 404))) {
        echo 'Response Body: ' . $response . PHP_EOL;
        break;
    } else {
        echo 'Retrying...' . PHP_EOL;
    }
}
// Close the cURL resource to free up system resources
curl_close($ch);
?>
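If the API keeps failing, retrying immediately can waste attempts while the upstream site is still unhappy. A common refinement (not part of the snippet above) is to wait a little longer between each retry. Here is a minimal sketch of exponential backoff; `backoff_delay` is a hypothetical helper name, not part of the ScrapingBee API:

```php
<?php
// Hypothetical helper: compute an exponential backoff delay in seconds.
// Attempt 0 waits $base seconds, attempt 1 waits 2 * $base, attempt 2
// waits 4 * $base, and so on.
function backoff_delay(int $attempt, int $base = 1): int {
    return $base * (2 ** $attempt);
}

// Inside the retry loop above, you would sleep before the next attempt:
//
//     } else {
//         echo 'Retrying...' . PHP_EOL;
//         sleep(backoff_delay($i));
//     }

// The schedule for the first three attempts:
echo backoff_delay(0) . ' ' . backoff_delay(1) . ' ' . backoff_delay(2) . PHP_EOL; // 1 2 4
?>
```

With `$MAX_RETRIES = 5` and a base of 1 second, the loop would wait at most 1 + 2 + 4 + 8 = 15 seconds in total before giving up, which keeps the worst case bounded while still giving slow pages a chance to recover.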