Responsive Drag-and-Drop File Uploader — Features, How it Works & FAQ
Introduction
This drag-and-drop file uploader provides a modern, responsive and accessible experience for uploading multiple files from the browser. Built with plain HTML, CSS and JavaScript, it contains advanced features you expect in production: per-file progress bars, chunked uploads with automatic retry and pause/resume controls, client-side image compression and previews, file type and size validation, cancel support, and a responsive UI suitable for Blogger posts. Use this component to give your visitors a fast and reliable upload experience with clear visual feedback.
Description
The uploader exposes a clear drop zone and fallback browse control so users can drag & drop or pick files. Each selected file becomes a managed upload item showing an image preview (if applicable), filename, size, a progress bar, and control buttons to pause/resume, retry, or cancel the upload. Files are uploaded in chunks to reduce the impact of network interruptions — each chunk can be retried a configurable number of times before reporting a failure. Image files are automatically compressed client-side using the HTML canvas when the file exceeds the desired dimensions or quality threshold, reducing upload time and bandwidth.
Key behaviors include:
- Multiple file support: Upload many files at once with per-file state.
- Chunked upload: Large files are split into smaller chunks, enabling resume from the last successful chunk if an interruption occurs.
- Pause/resume & cancel: Users can pause or resume individual uploads or cancel them entirely.
- Retries: Network errors automatically trigger a few retries for transient faults, improving reliability.
- Client-side compression: Images are resized and compressed on the client before upload to save bandwidth.
- Validation: File type and size validation prevents unwanted uploads and gives immediate feedback.
- Responsive UI: Designed mobile-first and responsive for Blogger templates.
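
The validation behavior above can be sketched as a small helper. The `ALLOWED_TYPES` and `MAX_FILE_SIZE` names match the settings described later in this post, but the values and the `validateFile` function itself are illustrative, not the script's exact code:

```javascript
// Illustrative defaults mirroring the post's settings: 50 MB cap,
// images and common documents allowed.
const MAX_FILE_SIZE = 50 * 1024 * 1024; // bytes
const ALLOWED_TYPES = [
  "image/png", "image/jpeg", "image/webp",
  "application/pdf", "text/plain",
];

// Returns null when the file is acceptable, or an error message for
// immediate feedback. `file` only needs `type` and `size`, matching
// the browser File API shape.
function validateFile(file) {
  if (!ALLOWED_TYPES.includes(file.type)) {
    return `Type ${file.type} is not allowed`;
  }
  if (file.size > MAX_FILE_SIZE) {
    return `File exceeds the ${MAX_FILE_SIZE / (1024 * 1024)} MB limit`;
  }
  return null;
}
```

Running this check before any network activity is what lets the uploader reject unwanted files instantly, with no wasted bandwidth.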
Implementation notes: the code includes a simple chunk-upload stub so you can integrate it with your backend by setting `UPLOAD_ENDPOINT`. The uploader uses the Fetch API and `AbortController` to allow pause/resume and cancellation. Image compression leverages an offscreen canvas and `HTMLCanvasElement.toBlob()` with adjustable quality. Validation settings are customizable at the top of the script.
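
The compression path can be sketched as follows. `targetDimensions` is a hypothetical helper that computes a scaled size preserving aspect ratio; `compressImage` shows the browser-only canvas and `toBlob()` usage described above (neither is the script's exact code):

```javascript
// Compute a size that fits within maxW x maxH while keeping the
// aspect ratio; never upscale. (Illustrative helper.)
function targetDimensions(width, height, maxW, maxH) {
  const scale = Math.min(maxW / width, maxH / height, 1);
  return {
    width: Math.round(width * scale),
    height: Math.round(height * scale),
  };
}

// Browser-only sketch: draw the image onto an offscreen canvas at the
// reduced size, then export a compressed JPEG blob at the given quality.
function compressImage(img, maxW, maxH, quality) {
  const { width, height } =
    targetDimensions(img.naturalWidth, img.naturalHeight, maxW, maxH);
  const canvas = document.createElement("canvas");
  canvas.width = width;
  canvas.height = height;
  canvas.getContext("2d").drawImage(img, 0, 0, width, height);
  return new Promise((resolve) =>
    canvas.toBlob(resolve, "image/jpeg", quality));
}
```

Because `canvas.toBlob()` re-encodes the image, lowering `quality` (a 0–1 value for JPEG/WebP) directly trades fidelity for upload size.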
How to use in Blogger
- Create a new post and switch to HTML mode. Paste the entire block of code into the post content. Blogger allows inline scripts; if your Blogger theme blocks inline scripts, host the JS externally and reference it in the post.
- Customize `UPLOAD_ENDPOINT` and any validation variables for your server.
- Test uploads on desktop and mobile. The UI is responsive and uses progressive enhancement, so it still functions as a simple file input where JavaScript is blocked.
FAQ
- Q: Does this uploader require a specific backend?
- A: No. The frontend code uses a chunked upload protocol that posts FormData chunks to an endpoint and sends metadata (file id, chunk index). You will need to implement server-side handling to assemble chunks or use a third-party service that accepts chunked uploads.
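
Server-side handling is outside the scope of the frontend code, but the core of it is storing each (file id, chunk index) pair and concatenating the chunks in order once all have arrived. A minimal Node-style sketch of that assembly step (the function name and data shapes are illustrative):

```javascript
// Assemble received chunks into the original file payload.
// `chunks` maps chunk index -> Buffer; `totalChunks` comes from the
// upload metadata sent with each chunk. Returns null while any chunk
// is still missing, so the server knows the upload is incomplete.
function assembleChunks(chunks, totalChunks) {
  const parts = [];
  for (let i = 0; i < totalChunks; i++) {
    const part = chunks.get(i);
    if (!part) return null; // still incomplete
    parts.push(part);
  }
  return Buffer.concat(parts);
}
```

A real endpoint would also persist chunks to disk or object storage rather than memory, and verify sizes or checksums before assembling.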
- Q: How do I change allowed file types and max size?
- A: At the top of the script there are `ALLOWED_TYPES` and `MAX_FILE_SIZE` variables. Adjust them to match your requirements.
- Q: Is client-side compression lossy?
- A: Yes — compression reduces quality to decrease size. You can tune width/height and quality to control the trade-off between size and fidelity.
- Q: What happens if a chunk fails?
- A: The uploader retries the chunk up to a configurable number of times. If it still fails, the file's state becomes "failed" and you can retry manually.
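
That retry behavior can be modeled as a small wrapper around the chunk-sending function. This is a sketch, not the script's exact API; `withRetries` and `send` are illustrative names:

```javascript
// Retry an async chunk upload up to `maxRetries` additional attempts.
// `send` is any async function that resolves on success and rejects
// on a (possibly transient) network error.
async function withRetries(send, maxRetries) {
  let lastError;
  for (let attempt = 0; attempt <= maxRetries; attempt++) {
    try {
      return await send();
    } catch (err) {
      lastError = err; // transient fault: try again
    }
  }
  throw lastError; // caller marks the file's state as "failed"
}
```

Because only the failing chunk is retried, a single dropped connection never forces the whole file to restart.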
- Q: Can users resume uploads after closing the page?
- A: This demo stores no persistent state. To enable cross-session resume you must store uploaded chunk metadata on the server and client (e.g., localStorage + server file id) so the client can query which chunks are already received and continue from the last successful chunk.
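
A sketch of that resume handshake, assuming a server endpoint that can report which chunk indices it already holds (all names here are hypothetical additions, not part of the demo):

```javascript
// Given the chunk indices the server reports as already received,
// find the first index the client still needs to upload.
function nextChunkIndex(receivedIndices, totalChunks) {
  const received = new Set(receivedIndices);
  for (let i = 0; i < totalChunks; i++) {
    if (!received.has(i)) return i;
  }
  return totalChunks; // everything already uploaded
}

// The file id would be remembered across sessions in localStorage
// (browser-only, so shown as comments to keep this sketch runnable):
//   localStorage.setItem("upload:" + file.name, fileId);
//   const fileId = localStorage.getItem("upload:" + file.name);
```

On page load the client would look up the stored file id, ask the server which chunks it has, and continue from `nextChunkIndex(...)`.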