Zero-dependency, stream-first file upload handler for Node.js. A plugin-based architecture lets you compose validators, transformers, and storage adapters to fit your pipeline.
```js
const express = require('express');
const FluxUpload = require('fluxupload');

const app = express();
const uploader = new FluxUpload({
  storage: 'local',
  destination: './uploads',
  limits: { fileSize: 50 * 1024 * 1024 } // 50 MB
});

app.post('/upload', async (req, res) => {
  const result = await uploader.handle(req);
  res.json({ files: result.files });
});
```
Built for modern Node.js applications with performance and security in mind.
Pure Node.js implementation using only native modules: no third-party supply chain to audit, and faster installs.
Never buffers entire files in memory; memory usage stays constant (O(1)) regardless of file size.
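The principle is easy to see in plain Node.js. A minimal sketch (the streaming idea, not FluxUpload internals): pipe the request body straight to disk, so only one chunk is in flight at a time.

```js
const { pipeline } = require('node:stream/promises');
const fs = require('node:fs');

// Stream the raw request body to disk without ever holding the whole
// file in memory; pipeline() propagates backpressure, so a slow disk
// pauses the socket instead of letting chunks accumulate.
async function saveRawBody(req, destPath) {
  await pipeline(req, fs.createWriteStream(destPath));
}
```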
Micro-kernel design with validators, transformers, and storage adapters. Extend as needed.
Magic byte verification, CSRF protection, path traversal prevention, atomic writes.
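Magic byte verification, for instance, comes down to comparing the first bytes of the upload against known file signatures instead of trusting the client-supplied MIME type. A minimal sketch (the helper is hypothetical, not FluxUpload's API):

```js
// Known file signatures ("magic bytes") for a few common types.
const SIGNATURES = [
  { type: 'image/png',       magic: Buffer.from([0x89, 0x50, 0x4e, 0x47]) },
  { type: 'image/jpeg',      magic: Buffer.from([0xff, 0xd8, 0xff]) },
  { type: 'application/pdf', magic: Buffer.from('%PDF') },
];

// Hypothetical helper: sniff the real type from the first chunk.
function sniffType(firstChunk) {
  const match = SIGNATURES.find(({ magic }) =>
    firstChunk.subarray(0, magic.length).equals(magic));
  return match ? match.type : null; // null: unknown or disguised file
}
```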
Local filesystem, AWS S3, or both simultaneously, with native AWS Signature Version 4 signing and no aws-sdk dependency.
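For context, the core of Signature V4 is a chain of HMAC-SHA256 operations that node:crypto covers on its own. A sketch of the key derivation (the full protocol also canonicalizes the request, which is omitted here):

```js
const crypto = require('node:crypto');

const hmac = (key, data) =>
  crypto.createHmac('sha256', key).update(data).digest();

// Derive the SigV4 signing key: date -> region -> service -> terminator.
function signingKey(secretKey, dateStamp, region, service) {
  const kDate = hmac('AWS4' + secretKey, dateStamp); // e.g. '20240115'
  const kRegion = hmac(kDate, region);               // e.g. 'us-east-1'
  const kService = hmac(kRegion, service);           // 's3'
  return hmac(kService, 'aws4_request');
}

// The request signature is HMAC-SHA256(signingKey, stringToSign), hex-encoded.
const sign = (key, stringToSign) =>
  crypto.createHmac('sha256', key).update(stringToSign).digest('hex');
```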
Built-in logging, metrics collection, progress tracking, and health checks.
A minimal core that orchestrates plugins. Each plugin does one thing well.
Validators run first and check files before processing: size limits, file types, rate limiting, CSRF tokens.
Transformers process the stream in flight: compute checksums, compress, or resize images on the fly.
Storage adapters save the result: local disk with atomic writes, or S3 with presigned URLs.
```js
// Plugin classes are assumed here to be exported from the package root.
const {
  FluxUpload, QuotaLimiter, MagicByteDetector,
  RateLimiter, StreamHasher, LocalStorage
} = require('fluxupload');

const MB = 1024 * 1024;

const uploader = new FluxUpload({
  // Validators run first
  validators: [
    new QuotaLimiter({ maxFileSize: 100 * MB }),
    new MagicByteDetector(),
    new RateLimiter({ limit: 100 })
  ],
  // Transformers process the stream
  transformers: [
    new StreamHasher({ algorithm: 'sha256' })
  ],
  // Storage saves the file
  storage: new LocalStorage({
    destination: './uploads',
    naming: 'uuid'
  })
});
```
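Writing your own transformer needs nothing exotic. Assuming a transformer wraps a standard Node Transform stream (a sketch of the idea, not the library's actual source), a checksum stage like StreamHasher could look roughly like this:

```js
const { Transform } = require('node:stream');
const crypto = require('node:crypto');

// Pass-through stream that hashes every chunk it forwards.
class HashingStream extends Transform {
  constructor(algorithm = 'sha256') {
    super();
    this.hash = crypto.createHash(algorithm);
  }

  _transform(chunk, _encoding, callback) {
    this.hash.update(chunk);   // fold the chunk into the digest...
    callback(null, chunk);     // ...and forward it unchanged
  }

  digest() {
    return this.hash.digest('hex');
  }
}
```

Because each chunk is forwarded as-is, hashing adds no buffering to the pipeline.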
Everything you need out of the box. Mix and match to build your pipeline.
Install FluxUpload and handle file uploads the right way.
```bash
npm install fluxupload
```