What is wget command and how to use it?

#1
Can you guys tell me what the wget command is and how to use it in Linux? I read some information about this command, but I don't really know when I should use it or what it is used for. Any guide?
 

BillEssley

#2
Best answer
In a UNIX/Linux environment, you can move between directories quickly with the cd command in the terminal. If you want to fetch a file from the internet into the current directory, using a web browser to download it and then picking a folder to store it takes a while. With the wget tool available in UNIX/Linux, you can download a file straight into the current directory with a single command.

1. Syntax

Wget is a command-line utility for downloading files and content from the internet, whether from a website or an FTP site. Wget is very flexible and has many options for many different purposes.

The general syntax for wget is:
Code:
wget [options] [URL]...
For example:
Code:
[root@server ~]# wget forumweb.hosting
Code:
--2018-04-18 17:10:46--  http://forumweb.hosting/
Resolving forumweb.hosting... 104.18.49.112, 104.18.48.112, 2400:cb00:2048:1::6812:3170, ...
Connecting to forumweb.hosting|104.18.49.112|:80... connected.
HTTP request sent, awaiting response... 302 Found
Location: https://forumweb.hosting/ [following]
--2018-04-18 17:10:47--  https://forumweb.hosting/
Connecting to forumweb.hosting|104.18.49.112|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: unspecified [text/html]
Saving to: “index.html”

    [    <=>                                ] 161,895      223K/s   in 0.7s

2018-04-18 17:10:49 (223 KB/s) - “index.html” saved [161895]

[root@server ~]#
You can also pass multiple URLs to wget at once:
Code:
[root@server ~]# wget URL1 URL2 URL3
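If you have a lot of URLs, a related trick from wget's standard options is to put them in a text file, one per line, and pass the file with -i (--input-file). A minimal sketch; the file name and URLs below are placeholders:

```shell
# Put the download targets in a file, one URL per line (placeholder URLs)
cat > urls.txt <<'EOF'
http://example.com/file1.iso
http://example.com/file2.iso
EOF

# -i tells wget to read the URL list from the file instead of the
# command line. Commented out here so the sketch does not hit the network:
#   wget -i urls.txt
```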
2. Save file with another name

Typically, the downloaded file keeps the same name as the file in the URL, and information about the download progress is displayed on the screen.

You can save the downloaded file under a different name using the -O option. If a file with the specified name already exists, the downloaded contents overwrite it.

Instead of displaying the download progress on the screen, you can write it to a file using the -o option.
Code:
[root@server ~]# wget ftp://example_domain.com/somefile.img -O dloaded_file.img -o log
With this command, nothing is printed on the screen: the progress log is written to the file log, and the downloaded file is saved as dloaded_file.img.

3. Automatically retry failed downloads

If the connection is unstable, a download may be interrupted and fail. Normally you would have to re-run the command by hand, but wget provides an option to retry the download automatically each time the connection is lost.

To do this, you use the -t argument in wget as follows:
Code:
[root@server ~]# wget -t 5 URL
In the above command, 5 is the number of times wget will retry the download when the connection is lost; replace it with whatever number of attempts you want.

If you do not want to cap the number of retries and want wget to keep trying until the download succeeds, set the retry count to 0, which wget treats as unlimited:
Code:
[root@server ~]# wget -t 0 URL

4. Limit download speed

When your internet bandwidth is limited and shared by several applications, downloading a large file can consume most of the bandwidth and starve the other applications.

To limit the download speed in wget, use the --limit-rate parameter as follows:
Code:
[root@server ~]# wget --limit-rate 20k http://example.com/file.iso
Explain:
k => kilobytes (KB)
m => megabytes (MB)

You can also specify a maximum download quota; the download stops once the quota is exceeded. Note that the quota does not abort a single file that is already downloading; it takes effect when downloading recursively or from a list of URLs.
To specify the quota, use --quota or -Q as follows:
Code:
[root@server ~]# wget -Q 100m http://example.com/file1 http://example.com/file2
5. Resume an interrupted download

If a download is interrupted before it completes, you can resume it from the point of interruption by using the -c option as follows:
Code:
[root@server ~]# wget -c URL
6. Copy whole site

Wget has an option to download an entire website by recursively following all the URL links in its pages and downloading them. This way you can download every page of a website.

To mirror a website, use the --mirror option as follows:
Code:
[root@server ~]# wget --mirror --convert-links example.com
Or use the following command:
Code:
[root@server ~]# wget -r -N -k -l DEPTH URL
Explain:
-r (recursive) => download recursively; used together with -l.
-l => the depth of the web pages, in levels; wget will only descend the number of levels you specify.
DEPTH => the depth of the site.
-N => turn on timestamping, so files are only downloaded when they are newer than the local copies.
-k or --convert-links => instructs wget to convert the links in downloaded pages to point to the local copies of those pages.
URL => the base URL of the website where the download should start.
In addition, if you only need a page as plain text, you can use the lynx command as follows:
Code:
[root@server ~]# lynx -dump URL > webpage_as_text.txt
For example:
Code:
[root@server ~]# lynx -dump http://google.com > plain_text_page.txt
7. HTTP or FTP authentication

Some sites require authentication for HTTP or FTP links. To perform this authentication, use the --user and --password arguments (USERNAME, PASSWORD, and URL are placeholders) as follows:
Code:
[root@server ~]# wget --user USERNAME --password PASSWORD URL
Hope it helps!
 

24x7serverman

#3
@BillEssley has already provided you the best answer.

Basically, it's used to download files from the internet over the HTTP, HTTPS, or FTP protocols.

Let's say you have to migrate a site from a source server to a destination server: you can create a backup zip/tar archive on the source server and then download it onto the destination server with wget.
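A minimal sketch of that workflow, using scratch directories in place of the real web roots (all paths and the server name are hypothetical):

```shell
# Stand-in for the site's document root on the source server
mkdir -p srcdir
echo "hello" > srcdir/index.html

# On the source server: bundle the content into one archive
tar -czf site-backup.tar.gz -C srcdir .

# On the destination server you would fetch the archive with wget,
# e.g. (commented out so the sketch does not hit the network):
#   wget http://source.example.com/site-backup.tar.gz

# Then unpack it into the destination document root
mkdir -p dstdir
tar -xzf site-backup.tar.gz -C dstdir
```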

Hope it helps. :)
 