In this tutorial, we will see how to use ScrapingBee’s API with Ruby to scrape web pages. We will cover the following topics:
- General structure of an API request
- Create your first API request
Let’s get started!
1. General structure of an API request
The general structure of an API request made in Ruby will always look like this:
require 'net/http'
require 'net/https'

# Classic (GET)
def send_request
  api_key = "YOUR-API-KEY"
  user_url = "YOUR-URL"
  uri = URI('https://app.scrapingbee.com/api/v1/?api_key=' + api_key + '&url=' + user_url)

  # Create client
  http = Net::HTTP.new(uri.host, uri.port)
  http.use_ssl = true
  http.verify_mode = OpenSSL::SSL::VERIFY_PEER

  # Create request
  req = Net::HTTP::Get.new(uri)

  # Fetch request
  res = http.request(req)

  # Return response
  return res
rescue StandardError => e
  puts "HTTP Request failed (#{e.message})"
end
And you can do whatever you want with the response variable! For example:
response = send_request
puts "Response HTTP Status Code: #{response.code}"
puts "Response HTTP Body: #{response.body}"
2. Create your first API request
Let’s create a tool that saves the HTML code of ScrapingBee’s blog:
require 'net/http'
require 'net/https'
require 'addressable/uri'

# Classic (GET)
def send_request(user_url)
  uri = Addressable::URI.parse("https://app.scrapingbee.com/api/v1/")
  api_key = "YOUR-API-KEY"
  uri.query_values = {
    'api_key' => api_key,
    'url' => user_url
  }
  uri = URI(uri)

  # Create client
  http = Net::HTTP.new(uri.host, uri.port)
  http.use_ssl = true
  http.verify_mode = OpenSSL::SSL::VERIFY_PEER

  # Create request
  req = Net::HTTP::Get.new(uri)

  # Fetch request
  res = http.request(req)

  # Return response
  return res
rescue StandardError => e
  puts "HTTP Request failed (#{e.message})"
end
response = send_request("https://scrapingbee.com/blog")
puts "Response HTTP Status Code: #{response.code}"
File.open("blog.html", 'w') { |file| file.write(response.body) }