Useful Wget commands
Collection of wget commands
wget [URL]
: At its simplest, wget followed by the URL downloads the file to your current directory, preserving the original filename.

-O, --output-document=[FILE]
: Lets you specify a custom filename for the downloaded file instead of using the one from the URL.
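A minimal sketch of both forms, using a hypothetical URL and filename:

```shell
# Download with the original filename (saves as archive.tar.gz)
wget https://example.com/files/archive.tar.gz

# Download the same file, but save it under a custom name
wget -O backup.tar.gz https://example.com/files/archive.tar.gz
```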
-b, --background
: Runs wget in the background, freeing your terminal for other tasks. Progress is logged to wget-log, or to a file of your choice with -o [LOGFILE].
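A quick sketch of a background download (the URL is hypothetical):

```shell
# Run in the background; progress goes to wget-log in the current directory
wget -b https://example.com/files/large.iso

# Check on the download later
tail -f wget-log
```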
-r, --recursive
: Perfect for downloading entire websites or parts of websites, mirroring the whole directory structure.

-l, --level=[NUMBER]
: Limits the recursion depth. Useful for controlling how deep you go into a website's link hierarchy.
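Combining the two flags, a recursive fetch capped at a shallow depth might look like this (site path is hypothetical):

```shell
# Mirror a section of a site, following links at most 2 levels deep
wget -r -l 2 https://example.com/docs/
```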
--limit-rate=[RATE]
: Slows down the download speed to a specified rate, ensuring it doesn’t hog all your bandwidth.
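For example, capping a large download's bandwidth (the rate accepts k and m suffixes for kilobytes and megabytes per second):

```shell
# Limit the transfer to 500 KB/s
wget --limit-rate=500k https://example.com/files/large.iso
```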
-c, --continue
: Resumes a partially downloaded file, picking up where it left off without starting from scratch.
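If a transfer was interrupted, rerunning the same command with -c picks it up from the partial file on disk:

```shell
# Resume an interrupted download of large.iso from where it stopped
wget -c https://example.com/files/large.iso
```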
Scheduled downloads
: wget has no built-in option to delay a download until a given time. To initiate a download during off-peak hours, schedule the wget command with at or cron instead.
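One portable way to schedule a run is to hand the command to the POSIX at utility (the time and URL here are hypothetical):

```shell
# Queue the download to start at 2 AM local time
echo "wget https://example.com/files/large.iso" | at 02:00
```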
-t, --tries=[NUMBER]
: Sets the number of retries for a failed download. Use -t 0 for infinite retries.

--retry-connrefused
: Retries even if the connection is refused.
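The two retry options compose naturally for flaky servers (URL is hypothetical):

```shell
# Retry up to 10 times, and keep retrying even on "connection refused"
wget -t 10 --retry-connrefused https://example.com/files/flaky.zip
```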
-P, --directory-prefix=[PREFIX]
: Directs wget to save the file(s) to a specified directory, organizing your downloads better.
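For instance, keeping ISOs out of the current directory (the target path is just an example):

```shell
# Save the file under ~/Downloads/isos instead of the current directory
wget -P ~/Downloads/isos https://example.com/files/large.iso
```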
--http-user=[USER] and --http-password=[PASSWORD]
: Supply credentials for downloading files from password-protected websites. Note that a password passed on the command line can be visible to other local users via the process list and shell history; --ask-password prompts for it interactively instead.
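A sketch of an authenticated fetch (the username, host, and path are hypothetical):

```shell
# Authenticate against an HTTP-protected file; --ask-password keeps the
# password out of the shell history and process list
wget --http-user=alice --ask-password https://example.com/private/report.pdf
```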
--no-check-certificate
: Disables SSL/TLS certificate verification, useful for sites with self-signed certificates. Use it only when you trust the server, since it removes protection against man-in-the-middle attacks.
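A minimal example against a host with a self-signed certificate (hostname is hypothetical):

```shell
# Fetch over HTTPS without verifying the server's certificate
wget --no-check-certificate https://self-signed.example.com/file.txt
```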
-U, --user-agent=[AGENT]
: Mimics a specific browser or bot, which can be necessary for downloading from sites that restrict by user agent.
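For example, presenting a browser-like identity (the User-Agent string below is a hypothetical Firefox-style value, not a recommendation for any particular site):

```shell
# Identify as a desktop Firefox browser instead of "Wget/x.y"
wget -U "Mozilla/5.0 (X11; Linux x86_64; rv:109.0) Gecko/20100101 Firefox/115.0" \
     https://example.com/page.html
```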
-e, --execute="http_proxy=[URL]"
: Executes a .wgetrc-style command on the fly; setting http_proxy this way configures wget to use a proxy for the download, essential for navigating network restrictions.
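A sketch of a proxied download, assuming a hypothetical proxy at proxy.example.com:3128; the extra use_proxy=yes command ensures proxying is enabled even if a local wgetrc turns it off:

```shell
# Route the download through an HTTP proxy
wget -e use_proxy=yes -e http_proxy=http://proxy.example.com:3128 \
     https://example.com/file.txt
```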