
How to download files from the Linux console

1 Answer

 
Best answer

Downloading files from the Internet is a need that sooner or later comes up, whether in everyday use of a Linux desktop or in server administration tasks, so in this answer I explain how to download files from the Linux console.


We will not always have a graphical interface at our disposal, especially when managing servers, which is why here we will only cover downloading from the console.

To download files we need some kind of HTTP client, that is, a program that sends the appropriate HTTP requests to start the downloads. In Linux there are several programs of this type, but I will only cover the most common ones: Wget, cURL and Axel. The first two come installed by default in the vast majority of Linux distributions and are the best known. I decided to include Axel because, although it is less well known, it has features that in certain situations can help us speed up downloads.

We will also make a brief comparison when downloading large files to see which one gives the fastest download speeds.

Programs to download files from the Linux console

As mentioned, we will cover three programs (Wget, cURL and Axel) for downloading from the Linux terminal, and I will briefly describe the options each one offers so you can choose the one that suits you best.

GNU Wget

Known among users simply as Wget, it is a tool that works from the Linux terminal (a Windows version is also available) and lets us download any type of file. Its main features are:

  • It supports several protocols: HTTP, HTTPS and FTP.
  • It allows resuming interrupted downloads.
  • It supports HTTP cookies.
  • It supports the use of HTTP proxies.
  • Included by default in virtually all Linux distributions.
  • It is free and open source.

Its use is quite simple, and a complete manual in English makes it easy to learn. A basic example of a command to download a file with Wget is:
wget https://www.vozidea.com/archivo.zip
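To show the resume feature in a runnable way, here is a small sketch. It serves a local directory with Python's built-in HTTP server so it works offline; the file names and port are illustrative, and against a real URL the same flags apply.

```shell
# Minimal Wget sketch: download a file quietly (-q). Against a real
# server you would add -c to resume an interrupted download, e.g.:
#   wget -c https://example.com/archivo.zip
mkdir -p srv && printf 'contenido de prueba' > srv/archivo.txt

# Serve the directory locally so the example runs offline (port is arbitrary).
python3 -m http.server 8037 --directory srv >/dev/null 2>&1 &
SRV=$!
sleep 1

wget -q http://127.0.0.1:8037/archivo.txt

kill $SRV
cat archivo.txt
```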

cURL

cURL is the most complete terminal tool for working with URLs and HTTP requests: it is not only focused on downloading, since with cURL you can also perform other tasks such as sending emails, handling Telnet connections, and so on. Its functionality is very extensive, but focusing on the features that matter when downloading files, we have:

  • It supports a multitude of protocols: HTTP, HTTPS, FTP, IMAP, IMAPS, LDAP, LDAPS, POP3, POP3S, RTMP, RTSP, SCP, SFTP, SMTP, SMTPS, Telnet and TFTP.
  • It supports the use of HTTP cookies.
  • It supports the use of HTTP proxies and SOCKS proxies.
  • It supports resuming interrupted downloads.
  • Included by default in many Linux distributions.
  • It is completely free and open source.

Given such wide functionality, its use is somewhat more complicated than Wget's, but not by much, and there is also an English manual full of examples. The most basic use of cURL to download a file is:
curl https://www.vozidea.com/archivo.zip
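One detail worth knowing: by default cURL writes the downloaded content to standard output rather than to a file, so in practice you usually add -o (choose a name) or -O (keep the remote name). A minimal sketch, using a file:// URL so it runs offline; the file names are illustrative, and with http(s) URLs the flags are the same:

```shell
# Create a local file to act as the "remote" resource.
printf 'hola' > archivo_src.txt

# -s silences the progress meter; -o saves the response to descarga.txt
# instead of printing it to stdout.
curl -s -o descarga.txt "file://$PWD/archivo_src.txt"

cat descarga.txt
```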

Axel

Since this is about downloading files from the console, I decided to include this application, which helps us download large files at higher speeds in a simple way.

First of all, I should explain one of the options available when downloading files over HTTP: downloading by segments, or byte ranges. The same file can be divided into several segments, each downloaded individually, which can increase the overall download speed. Segmented downloading is possible with both Wget and cURL if we know the HTTP protocol and the headers to send well. With Wget you have to set the HTTP headers directly, while cURL offers the --range option to define a range more easily.
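The byte-range idea can be sketched locally, without a server. A server that supports ranges returns exactly the requested byte spans; here dd on a local file simulates fetching two segments and reassembling them (file names are illustrative):

```shell
# The "remote" file: 26 bytes.
printf 'abcdefghijklmnopqrstuvwxyz' > original.bin

# "Download" two segments: bytes 0-12 and bytes 13-25.
dd if=original.bin of=part1.bin bs=1 skip=0  count=13 2>/dev/null
dd if=original.bin of=part2.bin bs=1 skip=13 count=13 2>/dev/null

# Reassemble and verify the segments rebuild the original file.
cat part1.bin part2.bin > rebuilt.bin
cmp -s original.bin rebuilt.bin && echo "segments match"
```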

The problem with cURL and Wget is that there is no simple, single-command way to use segmented downloading, and this is where Axel comes into play: it downloads in segments with one simple command. By default Axel splits the file into 4 segments, but this can be adjusted with the option -n x or --num-connections=x, where x is the number of segments.

The simplest example of use is:
axel https://www.vozidea.com/archivo.zip
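To make the segment splitting concrete, here is a sketch of the arithmetic a segmented downloader performs internally: given a file size and a number of connections, compute the byte range each connection requests. The numbers are illustrative, not taken from Axel's source:

```shell
# Split a 1000-byte file into 4 byte ranges (one per connection).
size=1000; n=4
step=$(( (size + n - 1) / n ))   # ceiling division: bytes per segment

: > segments.txt
for i in 0 1 2 3; do
  start=$(( i * step ))
  end=$(( start + step - 1 ))
  [ "$end" -ge "$size" ] && end=$(( size - 1 ))   # clamp the last range
  echo "segment $i: bytes=$start-$end" >> segments.txt
done
cat segments.txt
```

Each of these ranges would be sent as an HTTP Range header, and the pieces concatenated in order once all connections finish.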

Installation on Debian-based distributions such as Ubuntu, Mint and the like is very simple:
sudo apt-get install axel

Axel does not have a good online documentation page, but its man page (man axel) is enough to get an idea of how it works.

Speed comparison when downloading files

We ran several tests with the three programs and observed that Wget and cURL behave similarly, although cURL almost always gave us slightly higher speeds, with no big differences.

Axel is the fastest when downloading from servers that limit the speed of each connection, cutting the download time to less than half. It must also be said that on servers without that limitation, the results were similar to those of cURL and Wget.


...