JavaScript doesn't run curl commands directly, but converting so-called cURL JavaScript snippets into real code is easier than it looks. This guide walks you through the whole process: how cURL works, how to translate its flags into fetch or Axios, how to grab curl commands from your browser, and how to turn them into clean, modern JavaScript you can drop straight into your project.
We'll keep everything simple and practical: short examples, clear steps, and tooling you can use right away.

Quick answer (TL;DR)
To convert a curl command into JavaScript, you take the same parts (URL, method, headers, and body) and drop them into a fetch call or an Axios request. Modern JavaScript covers everything curl does; you just write it in code instead of a terminal command.
cURL example
curl -X POST https://example.com/api \
-H "Content-Type: application/json" \
-d '{"msg":"hello"}'
JavaScript example (Fetch, ESM)
const res = await fetch("https://example.com/api", {
  method: "POST",
  headers: {
    "Content-Type": "application/json"
  },
  body: JSON.stringify({ msg: "hello" })
});
const data = await res.json();
console.log(data);
If you want the fastest possible way to generate JavaScript automatically, use our tool: JavaScript Fetch cURL Converter
How cURL works in JavaScript
JavaScript can't run native curl commands — it's not a shell. But it can do everything curl does by using HTTP libraries.
In the browser you use fetch. In Node.js you also have fetch in modern versions, or you can use tools like Axios. You make the same GET/POST requests, send headers, work with JSON, handle forms: all the usual curl behavior, just written in JavaScript instead of a terminal command.
If you want to easily convert curl commands to JavaScript, check out our cURL converter.
The idea is simple: curl is a CLI tool, JavaScript is a runtime. Different environments, same HTTP requests.
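To make the "same HTTP requests" point concrete, here's a small sketch of how curl's form posting (curl -d "user=alice&mode=login", which sends application/x-www-form-urlencoded) maps to fetch. The endpoint and field names are made up for illustration:

```javascript
// Build fetch options for a curl-style form POST (curl -d "user=alice&mode=login").
// URLSearchParams serializes to application/x-www-form-urlencoded, and fetch
// sets the matching Content-Type header automatically when given one as the body.
function formPostOptions(fields) {
  return {
    method: "POST",
    body: new URLSearchParams(fields),
  };
}

const opts = formPostOptions({ user: "alice", mode: "login" });
console.log(opts.body.toString()); // user=alice&mode=login

// Usage (hypothetical endpoint):
// const res = await fetch("https://example.com/api/form", opts);
```

Keeping the option-building in a helper like this also makes the curl-to-fetch mapping easy to unit test without touching the network.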
Using Fetch API in the browser
If you're working in the browser, converting a simple curl command to JavaScript usually means writing a fetch call. The idea is the same: define the URL, method, headers, and body.
fetch("https://example.com/api", {
method: "GET",
})
.then(res => res.json())
.then(data => console.log(data));
// async-await:
// const res = await fetch("https://example.com/api", {
// method: "GET",
// });
// const data = await res.json();
// console.log(data);
That's the browser-friendly equivalent of a basic curl request.
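One difference worth flagging: curl with -f/--fail exits non-zero on HTTP errors, while fetch only rejects on network failures and happily resolves for 404 or 500 responses. A hedged sketch of checking res.ok yourself (the URL is illustrative):

```javascript
// fetch resolves even for 404/500 responses; only network failures reject.
// Mirror curl --fail by checking res.ok, which is true for status 200-299.
async function getJson(url) {
  const res = await fetch(url);
  if (!res.ok) {
    throw new Error(`HTTP ${res.status} for ${url}`);
  }
  return res.json();
}

// Usage (hypothetical endpoint):
// const data = await getJson("https://example.com/api");
```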
Using Fetch in Node.js
Node.js also supports fetch in modern versions (Node 18+), so turning a curl command into JavaScript looks almost the same as in the browser. If you're on an older Node version, you can install a fetch implementation such as node-fetch:
npm install node-fetch
And then:
// This line is not required on Node 18+
// Note that we're using ESM in all samples
import fetch from "node-fetch";
const res = await fetch("https://example.com/api");
const data = await res.json();
console.log(data);
Same flow as curl, just written in JavaScript.
Learn how to use proxy with Node fetch in our tutorial.
Using popular cURL alternatives in JavaScript
JavaScript doesn't run curl directly, so you lean on libraries that handle HTTP for you. This is completely normal in modern development. Tools like Fetch, Axios, and even higher-level APIs let you send requests with the same control you'd have in curl.
Services such as ScrapingBee also plug right into these clients, so you can call their API with Fetch, Axios, or anything else you prefer.
Using Axios to replace cURL
Axios is a good drop-in replacement when you want something more structured than raw fetch. You set the URL, method, headers, and body the same way you would with curl.
// Install with:
// npm install axios
import axios from "axios";
const res = await axios.post(
  "https://example.com/api",
  { message: "hello" },
  {
    headers: {
      "Content-Type": "application/json"
    }
  }
);
console.log(res.data);
Same idea as curl -X POST -H "Content-Type: application/json" -d '{"message":"hello"}' https://example.com/api, just written in JavaScript.
Using Ky as a lightweight cURL alternative
Ky is a small, modern HTTP client built on top of fetch. It keeps things minimal but gives you a nicer API and good defaults. If you want something lighter than Axios but still more comfortable than raw fetch, Ky fits well.
// Install with:
// npm install ky
import ky from "ky";
const data = await ky.post("https://example.com/api", {
  json: { message: "hello" },
}).json();
console.log(data);
This is the same idea as curl -X POST -H "Content-Type: application/json" -d '{"message":"hello"}' https://example.com/api, but written as a clean, compact Ky call.
How to extract cURL commands from your browser
Modern browsers let you copy any network request as a curl command. This is super handy when you want to debug an API call, recreate a request outside the app, or understand what headers and payloads the frontend is actually sending.
If you want a deeper dive, here's a guide on how to do it in Chrome:
Extract cURL from Chrome
Chrome, Safari, and Firefox steps
Chrome
- Open DevTools → Network tab.
- Trigger the request.
- Right-click the entry → Copy → Copy as cURL.
Safari
Safari hides DevTools by default, so enable the Develop menu first.
Full walkthrough here: Extract cURL from Safari
- Open Develop → Show Web Inspector → Network.
- Reload the page.
- Right-click the request → Copy as cURL.
Firefox
Firefox also supports this, and the process is almost identical.
More details: Extract cURL from Firefox
- Open DevTools → Network.
- Run the request.
- Right-click → Copy → Copy as cURL.
Convert browser cURL to JavaScript
Once you have the copied curl command, turning it into JavaScript is straightforward. You look at the method, headers, and body, then map them into a fetch call.
Before (cURL):
curl -X POST https://example.com/api \
-H "Content-Type: application/json" \
-d '{"msg":"hello"}'
After (JavaScript):
const res = await fetch("https://example.com/api", {
  method: "POST",
  headers: {
    "Content-Type": "application/json"
  },
  body: JSON.stringify({ msg: "hello" })
});
const data = await res.json();
console.log(data);
This is the typical workflow: extract the request in curl, convert it, and drop it into your JavaScript code. Alternatively, you can take advantage of our cURL converter online tool.
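Browser-copied curl commands usually come with a pile of -H flags (cookies, user agent, accept headers). Mapping them is mechanical: each -H "Name: value" becomes one entry in the headers object. A sketch with invented values — note that Cookie is a forbidden request header in browser fetch (and some browsers also restrict User-Agent), so setting those only takes effect server-side, e.g. in Node:

```javascript
// Turn the -H "Name: value" flags from a copied curl command into a
// fetch-style headers object (header values here are invented).
function headersFromCurlFlags(flags) {
  const headers = {};
  for (const flag of flags) {
    const i = flag.indexOf(":");
    headers[flag.slice(0, i).trim()] = flag.slice(i + 1).trim();
  }
  return headers;
}

const headers = headersFromCurlFlags([
  "Accept: application/json",
  "User-Agent: Mozilla/5.0",
  "Cookie: session=abc123",
]);

console.log(headers.Accept); // application/json

// Usage in Node (browsers drop the Cookie header from fetch requests):
// const res = await fetch("https://example.com/api", { headers });
```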
Running real cURL from Node.js with subprocesses
Most of the time you'll translate curl into fetch or Axios. But if you really want to run native curl from JavaScript, you can do it in Node.js using subprocesses.
The idea is simple:
- Node.js calls the operating system.
- The OS runs curl.
- You read the output back in your Node code.
For this to work you need:
- curl installed on the machine.
- curl available in PATH (so curl ... works in the terminal).
- A Node.js environment (this will never work in the browser).
Here's a small example using child_process.exec in ESM:
import { exec } from "node:child_process";
import { promisify } from "node:util";
const execAsync = promisify(exec);
try {
  const { stdout } = await execAsync(
    'curl -s -X GET "https://example.com/api" -H "Accept: application/json"'
  );
  console.log("Raw cURL output:", stdout);
} catch (err) {
  console.error("cURL failed:", err);
}
This runs real curl under the hood and gives you the CLI output as a string. From there you can parse JSON if the API returns it:
import { exec } from "node:child_process";
import { promisify } from "node:util";
const execAsync = promisify(exec);
try {
  const { stdout } = await execAsync(
    'curl -s -X GET "https://example.com/api" -H "Accept: application/json"'
  );
  const json = JSON.parse(stdout);
  console.log(json);
} catch (err) {
  console.error("cURL failed:", err);
}
This approach is handy when:
- You already have a curl command that works and just want to reuse it.
- You are writing quick scripts or debugging.
For regular application code, it's usually better to stick with fetch or Axios:
- No dependency on curl being installed.
- More portable across environments.
- Easier error handling and testing.
Be very careful when running scripts containing such child_process calls, especially if you don't understand how they work. These functions let Node.js execute real OS commands (manage files, run programs, access system utilities). A single unsafe command or untrusted input can lead to data loss or full system compromise.
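If untrusted input could end up in the command string, a safer sketch is child_process.execFile, which hands arguments directly to the curl binary without going through a shell, so there is nothing for shell metacharacters to inject into. The URL below is illustrative:

```javascript
import { execFile } from "node:child_process";
import { promisify } from "node:util";

const execFileAsync = promisify(execFile);

// Build curl's argument list explicitly: each array element reaches the binary
// as a single argument, so quotes, semicolons, or $() in a URL can't break out
// the way they could with exec and string interpolation.
function curlArgs(url) {
  return ["-s", "-H", "Accept: application/json", url];
}

console.log(curlArgs("https://example.com/api").join(" "));

// Usage (requires curl in PATH):
// const { stdout } = await execFileAsync("curl", curlArgs("https://example.com/api"));
```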
Using node-libcurl for cURL-style requests in Node.js
If you want something very close to real curl behavior in Node.js, you can use node-libcurl. It's a Node binding around the native libcurl library, so you get a lot of the same features and options that curl exposes.
Installation
node-libcurl isn't a pure JavaScript library — it uses native bindings. That means it talks directly to the underlying libcurl C library through compiled code. Because of this, there are a few requirements:
- You must be running Node.js, not the browser.
- Your system needs the tools required to compile native addons (Python, make, C/C++ toolchain).
- On some platforms, libcurl or SSL libraries may need to be present or will be bundled automatically during installation.
You install it like any other package:
npm install node-libcurl
Once installed, everything works normally: you import it and call it just like any other library, but the heavy lifting happens in compiled code underneath.
Simple GET request with curly
The easiest way to start is with the high-level curly API. In pure ESM you can pull it in via createRequire:
import { createRequire } from "node:module";
const require = createRequire(import.meta.url);
const { curly } = require("node-libcurl");
const { statusCode, data, headers } = await curly.get(
  "https://example.com/api",
  {
    httpHeader: [
      "Accept: application/json"
    ]
  }
);
console.log("Status:", statusCode);
console.log("Headers:", headers);
console.log("Body:", data.toString());
This is roughly the same as:
curl -H "Accept: application/json" https://example.com/api
but you stay inside Node and get structured data back.
JSON POST request with node-libcurl
Here's a small JSON POST example, again using curly:
import { createRequire } from "node:module";
const require = createRequire(import.meta.url);
const { curly } = require("node-libcurl");
const payload = { message: "hello from node-libcurl" };
const { statusCode, data } = await curly.post(
  "https://example.com/api",
  {
    postFields: JSON.stringify(payload),
    httpHeader: [
      "Content-Type: application/json",
      "Accept: application/json"
    ]
  }
);
console.log("Status:", statusCode);
console.log("Response:", data.toString());
Here you explicitly set headers and body, just like you would with a curl -X POST -H ... -d ... command. The difference is that you're talking to libcurl through JavaScript instead of invoking the curl binary, which can be useful when you want curl-level power without spawning subprocesses.
Conclusion
Turning curl commands into JavaScript isn't complicated once you know how the pieces map over. fetch, Axios, Ky — they all give you the same control curl does, just in code form. Whether you're converting a cURL JavaScript snippet, debugging a network call, or building a client from scratch, the process stays simple.
And if you really need the actual CLI version, Node.js can run native curl through subprocesses as long as it's installed on the system. Alternatively, you can take advantage of node-libcurl.
Use whatever fits your workflow: copy a curl command from DevTools, convert it to JavaScript, drop it into your project, and you're good to go. Modern tooling makes the whole process smooth, whether you're debugging, learning how a request works, or building a production-ready client.
Before you go, check out these related reads:
- Mastering the Python curl request: A practical guide for developers
- Best 10 Java Web Scraping Libraries
Frequently asked questions (FAQs)
Can JavaScript run native cURL?
JavaScript itself can't run "real" curl because the language doesn't execute shell commands. But it can fully replicate what curl does by sending HTTP requests, setting headers, posting JSON or form data, handling cookies, and so on.
- In the browser you use fetch.
- In Node.js you normally use fetch, Axios, or another HTTP client like Ky.
There is an exception though: Node.js is able to call the operating system, so you can run actual curl through child_process.exec or spawn. This works, but it's not common for production code because it depends on the OS, requires curl to be installed, and returns raw CLI output instead of structured data. It's mostly useful for quick scripts or debugging.
On top of that, you can use node-libcurl but it requires native bindings, runs only on Node.js, and the installation can be more involved.
If you don't want to deal with low-level headers or proxy handling, you can also call a service like ScrapingBee's Web Scraping API from JavaScript using Fetch or Axios and let it handle the heavy lifting.
How do I convert cURL commands to Fetch code?
You read the curl command and map its pieces to a fetch call:
- URL → first argument to fetch
- -X or --request → method
- -H or --header → headers
- -d or --data → body
Once you understand this mapping, turning curl into JavaScript fetch is mostly a matter of copying values over. You can also use an online cURL → JavaScript converter tool to generate fetch or Axios code from a command when you don't want to do it by hand.
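That mapping can be sketched as a tiny helper — the function name and defaults here are illustrative, not a real library:

```javascript
// Assemble fetch arguments from the pieces read off a curl command:
// -X → method, -H → headers, -d → body, URL → first argument.
function curlPartsToFetchArgs({ url, method = "GET", headers = {}, data }) {
  const init = { method, headers };
  if (data !== undefined) init.body = data; // curl -d payload, passed through as-is
  return [url, init];
}

const [url, init] = curlPartsToFetchArgs({
  url: "https://example.com/api",
  method: "POST",
  headers: { "Content-Type": "application/json" },
  data: JSON.stringify({ msg: "hello" }),
});

console.log(url, init.method); // https://example.com/api POST

// const res = await fetch(url, init);
```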
What is the best way to copy cURL from my browser?
Use DevTools:
- Open DevTools and go to the Network tab.
- Trigger the request (reload the page or click the button that sends it).
- Right-click the request → Copy → Copy as cURL.
This gives you a ready-made curl command you can then convert into JavaScript. It's the fastest way to see what the browser is actually sending.
Should I use Axios, Fetch, or Request?
- Fetch is built into browsers and modern Node.js — great default, minimal, no deps.
- Axios adds nicer defaults, interceptors, and a richer API.
- Request is deprecated and only makes sense in old projects.
- Ky is a lightweight wrapper around fetch with a cleaner interface.
For new code, pick Fetch for simplicity or Axios/Ky if you want a more ergonomic client.


