4 Ways to Download All Files From a Folder on a Website or FTP
There are times when you will end up on a web page that looks like a folder, where only files are listed. This happens when the web server's directory index file (index.html, default.asp, etc.) is not present in the folder and the directory listing option on the web server is turned on. One reason to offer directory listings is to provide a convenient way for visitors to quickly browse the files in the folders and easily download them to their computer. Sometimes directory listings are accidental, caused by careless webmasters who forgot to include an .htaccess rule or a blank index file to prevent all the files from being listed.
To download a file, you can either click on it or right click and select “Save link as” in Firefox/Chrome or “Save target as” in Internet Explorer. However, if you need to automatically download multiple or even all of the files from the directory, including the subfolders, you will need third party tools to help you achieve that. Here are 4 different methods that you can use to download all files from a folder on a website.
1. Download Managers
If you are a frequent downloader, you probably already have a download manager installed. It can be a browser add-on such as the popular FlashGot and DownThemAll! extensions for Firefox, or standalone software such as Internet Download Manager (IDM) and Free Download Manager (FDM). To download all of the files in a web directory with the Firefox download manager extensions, right click on an empty space on the page and select DownThemAll! or FlashGot All from the context menu.
The download manager will then list all the files that it manages to find and let you pick the ones that you want to download to your computer.
Do take note that FlashGot and DownThemAll! can only download all the files in a single directory and cannot download folders recursively. As for Internet Download Manager, select “Download all links with IDM” from the right click context menu, check the files that you want to download from the list and click OK.
2. Wget
Wget is a free and very powerful file downloader that comes with a lot of useful features, including resume support, recursive download and FTP/HTTPS support. In “The Social Network” movie, Mark Zuckerberg is seen using Wget to download all the student photos from his university to create Facemash. Wget is a command line tool, which can be a bit difficult for some basic users.
Thankfully there are free front-end GUIs for Wget, such as VisualWget, that make it much easier to use by simply clicking on check boxes rather than manually typing the command line arguments. Download VisualWget, extract it and run VisualWget.exe. Click on the New icon to open a New Download window, then enter the URL that you want to download and select the location where you want to save the files.
If you need to download multiple folders including subfolders, go to Advanced, click on Recursive Retrieval and tick the first checkbox, “--recursive”. Finally, click OK to start downloading.
Although there are a few GUIs for Wget, we recommend VisualWget because it is free, portable and comes bundled with Wget, so there is no need to download Wget separately, manually unpack it into the program's folder or configure the path.
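If you are comfortable with the command line, the equivalent recursive download can be run with Wget directly. A minimal sketch, where the URL is just a placeholder for the directory listing you want to grab:
$ wget --recursive --no-parent http://example.com/files/
The --no-parent option stops Wget from climbing up into the parent directory, so only the files inside /files/ and its subfolders are fetched.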
3. Offline Browsers
Offline browsers are tools that download a whole website for offline viewing. These programs are capable of crawling into subfolders, downloading all or filtered files and then converting the live hyperlinks into offline versions that point to the downloaded HTML files on your hard drive. Some of the popular offline browsers, such as Offline Explorer and Teleport, come with powerful parsing capabilities but are shareware.
HTTrack is an excellent free alternative to the paid offline browsers. You can download the portable version of HTTrack, extract it and run WinHTTrack.exe. Click Next, give the project a name and click Next.
Now enter or paste the URL(s) that you want to download into the Web Addresses box and click Next. If the URL requires authentication, click the Add URL button and you can enter the login and password for the URL. Click Next and Finish. HTTrack will now start crawling the given URL and downloading the files that it finds.
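HTTrack also ships with a command line executable if you prefer scripting the download over clicking through the wizard. A minimal sketch, with a placeholder URL and output folder:
$ httrack http://example.com/files/ -O ./mirror
The -O option sets the local path where the mirrored files and logs are saved.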
4. Downloading from FTP
The best program for downloading all files and subfolders from an FTP server is obviously an FTP client such as FileZilla Client, because FTP clients understand FTP commands and are able to crawl recursively into subfolders without problems. In FileZilla Client, all you need to do is enter the FTP address in the Host box, enter the username and password if the server requires authentication or leave them blank if not, and click the Quickconnect button.
Once you're logged in, simply right click the folder in the right pane and select Download, which will start downloading all files and folders from the selected directory.
Please take note that FTP clients can only handle the FTP protocol and cannot download files from HTTP web pages.
Download FileZilla Client Portable
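Incidentally, Wget from method 2 can also recurse into FTP folders, which is handy if you want to script the transfer instead of using a GUI client. A minimal sketch, where the host and credentials are placeholders:
$ wget --recursive --ftp-user=yourname --ftp-password=yourpass ftp://ftp.example.com/pub/folder/
For servers that allow anonymous access, the --ftp-user and --ftp-password options can simply be left out.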
Thank you a LOT. In the past I used FlashGet mainly, but now it is not compatible with the latest Firefox, and I don’t want to risk continuing to use an older version of Firefox.
vwget did what I needed exactly, I would not have known about the recursive feature without your help. THANK YOU!
DownThemAll is a Firefox add-on which works fine for me.
The folks at the subreddit /r/opendirectories are using Felistar (www.moonstarsky.com), a tool built by another redditor. You should check it out
Thanks for the share, please keep it updated for GNU/Linux users.
I am trying to download multiple files from a facebook group. I only need the svg files. What is my best option?
very helpful thanks bro
thanks very much for your info…
I like vwget personally…
really small and effective tool
Thanks a lot. Personally I found VisualWget to be the best option.
Super helpful!
Very Useful. Thank you.
Sweet…
FileZilla worked for me! :)
thanks mate. This was helpful!
Really appreciate this!
Good on ya!
I tried no.4 (FileZilla) and it worked like a charm. Thanks.
really helpful…..and helps save time
Superb ..it did the trick !!!
Thanks this is very useful. You are a good man.
Raymond you’re the best ! Martin C. you are great too for the --no-parent tricks ;)
You Sir are a legend :)
thanks man
$ wget --no-parent --wait=10 --recursive --accept=zip --no-directories yoursite.com
--wait can be reduced to 1 (second), provided the server you are downloading from doesn’t kick you out.
Remove --no-directories to completely crawl and download everything matching your criteria (zip files here) starting from the root directory.
thank you sir, it works, thank you so much
good
Thanks contributor. This is very much appreciated.
thanks man, very helpful
Very helpful post
Your forum is helping me a lot..
Great!!
thanks mate you helped me a lot
Thank you Ray – very useful bits of kit.
cool! thanks ray!
thanks Raymond always good to get the latest news from you
cool. thanks. Raymond rocks!
Thanks bro, love the forum and your article
thanks ray.