There are times when you will end up on a web page that looks like a folder, with nothing but a list of files. This happens because the web server's directory index file (index.html, default.asp, etc.) is not present in the folder and the directory listings option is turned on in the web server. One reason to offer directory listings is to give visitors a convenient way to quickly browse the files in a folder and easily download them to their computer. Sometimes directory listings are accidental, left behind by careless webmasters who forgot to include an .htaccess rule or a blank index file to prevent the files from being listed.
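A directory index is just an ordinary HTML page made up of links, which is why tools can collect every file on it automatically. A minimal sketch of that idea in Python's standard library (the listing below is a made-up example, not a real server):

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collect file links from a directory-listing page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href", "")
            # skip parent-directory links, query links and subfolders
            if href and not href.startswith(("?", "/")) and not href.endswith("/"):
                self.links.append(href)

# placeholder listing standing in for a real index page
listing = """
<html><body><h1>Index of /files</h1>
<a href="../">Parent Directory</a>
<a href="report.pdf">report.pdf</a>
<a href="photo.jpg">photo.jpg</a>
<a href="archive/">archive/</a>
</body></html>
"""

parser = LinkCollector()
parser.feed(listing)
print(parser.links)  # each entry could then be fetched with urllib.request.urlretrieve
```

This prints `['report.pdf', 'photo.jpg']`; the download managers described below do essentially this, plus the actual fetching.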
To download a file, you can either click on it or right click and select “Save link as” in Firefox/Chrome or “Save target as” in Internet Explorer. However, if you need to automatically download multiple or even all of the files from the directory, including the subfolders, you will need third party tools to help you achieve that. Here are 4 different methods that you can use to download all files from a folder on a website.

1. Download Managers
If you are a frequent downloader, you probably already have a download manager installed. It can be an add-on such as the popular FlashGot and DownThemAll! for Firefox, or standalone software such as Internet Download Manager (IDM) and Free Download Manager (FDM). To download all of the files in a web directory with the Firefox download manager extensions, right click on an empty space on the page and select DownThemAll! or FlashGot All from the context menu.
The download manager will then list all the files it manages to find and let you pick the ones that you want to download to your computer.
Do take note that FlashGot and DownThemAll! can only download all the files in a directory but cannot download folders recursively. As for Internet Download Manager, select “Download all links with IDM” from the right click context menu, check the files that you want to download from the list and click OK.
2. Wget

Wget is a free and very powerful file downloader that comes with a lot of useful features, including resume support, recursive download and FTP/HTTPS support. In “The Social Network” movie, Mark Zuckerberg is seen using the Wget tool to download all the student photos from his university to create Facemash. Wget is a command line tool, which can make it a bit difficult for less experienced users.
Thankfully there are free front-end GUIs for Wget, such as VisualWget, that make it much easier to use by simply clicking on check boxes rather than manually typing the command line arguments. Download VisualWget, extract and run VisualWget.exe. Click on the New icon to open a New Download window. Enter the URL that you want to download and select the location that you want to save the files to.
If you need to download multiple folders including subfolders, go to Advanced, click on Recursive Retrieval and tick the first checkbox “--recursive”. Finally, click OK to start downloading.
Although there are a few GUIs for Wget, we recommend VisualWget because it is free, portable and comes bundled with Wget, so there is no need to download Wget separately, manually unpack it into the program’s folder or configure the path.
3. Offline Browsers
Offline Browsers are tools that download a whole website for offline viewing. They are capable of crawling into subfolders, downloading all or filtered files and then converting the live hyperlinks into offline versions pointing to the downloaded HTML files on your hard drive. Popular offline browsers such as Offline Explorer and Teleport come with powerful parsing capabilities, but they are shareware.
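The link conversion step mentioned above is conceptually simple: absolute URLs under the mirrored site are rewritten into relative local paths. A bare-bones sketch of the idea (real offline browsers handle far more cases; the site and page here are placeholders):

```python
import re

def localize(html, site):
    """Rewrite absolute links under `site` into relative local paths."""
    return re.sub(re.escape(site.rstrip("/")) + "/", "", html)

page = '<a href="http://www.example.com/docs/guide.html">Guide</a>'
print(localize(page, "http://www.example.com/"))
```

After rewriting, the link points to `docs/guide.html`, a file path that works from the mirror folder on your hard drive.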
HTTrack is an excellent free alternative to the paid offline browsers. You can download the portable version of HTTrack, extract and run WinHTTrack.exe. Click Next, give the project a name and click Next.
Now enter or paste the URL(s) that you want to download into the Web Addresses box and click Next. If the URL requires authentication, click the Add URL button and you can enter the login and password for the URL. Click Next and Finish. HTTrack will now start crawling the given URL and downloading the files that it finds.
4. Downloading from FTP
The best program to download all files and subfolders from an FTP server is obviously an FTP client such as FileZilla Client, because it understands FTP commands and can crawl recursively into subfolders without problems. In FileZilla Client, all you need to do is enter the FTP address in the Host box, enter the username and password if the server requires authentication or leave them blank if not, and click the Quickconnect button.
Once you’re logged in, simply right click on the folder in the right pane and select Download, which will start downloading all files and folders from the selected directory.
Please take note that FTP clients can only handle the FTP protocol and cannot download files from HTTP web pages.
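What an FTP client does when you pick Download on a folder can be sketched with Python's standard ftplib module. This is a simplified illustration, not FileZilla's actual logic: the host is a placeholder, and note that some servers return full paths rather than bare names from a directory listing, which this sketch does not handle.

```python
import os
from ftplib import FTP, error_perm

def remote_join(parent, name):
    """Join remote FTP paths with a single forward slash."""
    return parent.rstrip("/") + "/" + name

def download_tree(ftp, remote_dir, local_dir):
    """Recursively mirror remote_dir into local_dir."""
    os.makedirs(local_dir, exist_ok=True)
    ftp.cwd(remote_dir)
    for name in ftp.nlst():
        remote = remote_join(remote_dir, name)
        local = os.path.join(local_dir, name)
        try:
            ftp.cwd(remote)                  # only succeeds for directories
            download_tree(ftp, remote, local)
            ftp.cwd(remote_dir)              # restore position after recursion
        except error_perm:                   # not a directory: fetch as a file
            with open(local, "wb") as f:
                ftp.retrbinary("RETR " + remote, f.write)

# usage (placeholder host; uncomment to run against a real server):
# ftp = FTP("ftp.example.com")
# ftp.login()                                # anonymous login
# download_tree(ftp, "/pub", "mirror")
```

The try/except around `cwd` is the usual ftplib trick for telling directories apart from files when the server does not support richer listing commands.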