My Internet service limits data. Between midnight and 7 am, I have twice as much available as during the rest of the day. Naturally enough, I'd like to make use of that data. I'd also like to get some sleep.
I'd rather not leave my desktop running overnight. An old Raspberry Pi Model B should be able to do the job. For this project, the Pi is running Raspbian Buster Lite.
I couldn't find a download scheduler with a web interface that works in a way that I'm comfortable with. There seems to be more interest in making downloads faster than in making them run unattended on a schedule. My fall-back is to schedule a command-line download accelerator.
First, the scheduler. File downloads are one-off tasks. The "at" command will do the job.
Install "at" and set daemon to autostart:
sudo apt-get install at
sudo service atd start
sudo update-rc.d atd defaults
To subsequently disable and re-enable autostart, use:
sudo update-rc.d atd disable|enable
update-rc.d has a 'remove' option, but that's only used after a program has been uninstalled and all scripts deleted.
To check what services are running:
sudo service --status-all
The "wget" command is installed by default, but it has some nasty habits. I prefer "axel".
Install "axel" download accelerator:
sudo apt-get install axel
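The point of an accelerator is that it can split a download across several simultaneous connections. For example (the URL is just a placeholder), this fetches a file over four connections:
axel -n 4 https://example.com/file.iso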
By default, "at" mails output to the user's local account. Check mail regularly (use the "mail" command, then "d" to delete messages and "q" to exit), to prevent a buildup.
Create a directory for downloads (I named mine "downloads") :).
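For example, assuming the default "pi" user:
mkdir /home/pi/downloads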
Sample command sequence (modify according to your download time, directory and the file to be downloaded):
sudo at 4 AM tomorrow
Response:
warning: commands will be executed using /bin/sh
at>
At the at> prompt:
cd /home/pi/downloads
then
axel https://report.ipcc.ch/sr15/pdf/sr15_spm_final.pdf
Finish with Control-D.
The above is an example only. Replace https: and the rest of that line with the URL of the file you want to download. Finding the URL can be difficult. Some sites lead you through pages of advertising or begging for donations before getting to the actual file. I use Free Download Manager (set to manual download), a browser extension that traps download requests and records them for future download. I copy the URL from FDM, then delete the download from the list.
It actually gets quite easy after you've done it a few times.
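As an aside, "at" will also read the job from standard input, so once you're comfortable with it, the whole thing can be done in one line (same example URL as above):
echo "cd /home/pi/downloads && axel https://report.ipcc.ch/sr15/pdf/sr15_spm_final.pdf" | sudo at 4 AM tomorrow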
The go-to BitTorrent client for Linux is Transmission. It's light on resources and has a handy web interface, so it's easy to access from machines on the same network. It does take a bit of setting up, though. I hope I've remembered everything.
Install Transmission daemon:
sudo apt-get install transmission-daemon
Add user to transmission group (I'm using the default user, "pi"):
sudo usermod -a -G debian-transmission pi
From the user's home directory (again, I'm using "pi"):
mkdir transmission
cd transmission
mkdir complete
mkdir incomplete
Change ownership of newly created directories:
sudo chown debian-transmission:debian-transmission /home/pi/transmission/
sudo chown debian-transmission:debian-transmission /home/pi/transmission/incomplete
sudo chown debian-transmission:debian-transmission /home/pi/transmission/complete
or, if you're feeling brave:
sudo chown -R debian-transmission:debian-transmission /home/pi/transmission/
Even when upload and download speeds are set to zero, Transmission seems to use data. I therefore stop the daemon when not in use. Starting and stopping during off-peak hours is handled by cron.
Outside those hours, to start Transmission:
sudo transtart.sh
and to stop it:
sudo transtop.sh
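I haven't reproduced the scripts here; a minimal version, assuming they do nothing more than wrap the service commands, would be:
transtart.sh:
#!/bin/sh
# Start the Transmission daemon
service transmission-daemon start
transtop.sh:
#!/bin/sh
# Stop the Transmission daemon
service transmission-daemon stop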
Copy the scripts to /usr/bin/. For that, I used FileZilla.
Make sure scripts are owned by root and executable:
sudo chown root: /usr/bin/transtart.sh
sudo chown root: /usr/bin/transtop.sh
sudo chmod +x /usr/bin/transtart.sh
sudo chmod +x /usr/bin/transtop.sh
Modern versions of Transmission seem to like a fair amount of memory. Running:
service transmission-daemon status
will sometimes show something like:
UDP Failed to set receive buffer: requested 4194304, got 327680 (tr-udp.c:84)
and:
UDP Failed to set send buffer: requested 1048576, got 327680 (tr-udp.c:95)
Transmission should still work, but adding these lines to /etc/sysctl.conf or to a file with the .conf extension under /etc/sysctl.d/ seems to make it faster:
net.core.rmem_max = 16777216
net.core.wmem_max = 4194304
At the very least, it should stop the messages in the status log.
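The new values are read at boot; to apply them without rebooting:
sudo sysctl -p
or, for files under /etc/sysctl.d/:
sudo sysctl --system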
By default, Transmission is set up to start on bootup. To prevent that, first make sure the daemon isn't running, then:
sudo update-rc.d transmission-daemon disable
Edit cron to start and stop Transmission during off-peak hours:
sudo crontab -e
The first time that command is run, it will ask which editor to use. The easiest is "nano". Scroll down to the end of the file. This is what I added:
30 01 * * * /usr/bin/transtart.sh
55 06 * * * /usr/bin/transtop.sh
Those lines will start Transmission at 1:30 am and stop it at 6:55 am, every morning. For different times, just remember that the first figure is the minutes and the second is the hours. Press Control-O to write the file out, then Control-X to exit.
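To confirm the entries were saved:
sudo crontab -l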
Within Transmission's web interface ("Alternative Speed Limits" on the "Speed" tab, under the spanner icon at bottom left), I have speeds set to zero between 6:45 am and 2:00 am (more than 19 hours). That means that, when I add entries during peak hours, downloading doesn't begin immediately. The cron job starts Transmission, which comes out of zero speeds 30 minutes later, then goes back 10 minutes before cron stops the daemon. I could fine-tune it, but that works well enough.
Settings are held in /etc/transmission-daemon/settings.json. Before running the daemon, you'll need to edit it. Make sure the daemon is stopped while you edit; it rewrites settings.json on shutdown, so changes made while it's running will be lost.
The first seven lines, beginning "alt-speed-", relate to the speed tab mentioned above.
The "download-dir" and "incomplete-dir" lines should be set to the directories created above.
The "incomplete-dir-enabled" line should be set to "true". That segregates fully-downloaded files from those that haven't been completely downloaded.
Setting "rpc-enabled" to "true" allows access to the web interface.
Set "rpc-password" to your plain-text RPC password. The first time the daemon runs, the password will be run through a hash function, and the hash will replace the plain text in the file.
The "rpc-port" can be set to any valid port (the default is 9091). Just try to avoid conflicts with any other software running on the Pi.
The "rpc-url" line is the last element in the URL to the web interface.
The "rpc-username" can be changed, but I prefer to leave it.
Next, the "rpc-whitelist", which lists the IP addresses permitted to access the web interface. Mine limits access to the Pi itself (127.0.0.1) and machines on my home network (192.168.1.*).
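For illustration, here is roughly what the relevant entries look like. The values are examples rather than my actual configuration; note that the alt-speed times are minutes after midnight, so 405 is 6:45 am and 120 is 2:00 am, and 127 in "alt-speed-time-day" means every day:
"alt-speed-down": 0,
"alt-speed-enabled": false,
"alt-speed-time-begin": 405,
"alt-speed-time-day": 127,
"alt-speed-time-enabled": true,
"alt-speed-time-end": 120,
"alt-speed-up": 0,
"download-dir": "/home/pi/transmission/complete",
"incomplete-dir": "/home/pi/transmission/incomplete",
"incomplete-dir-enabled": true,
"rpc-enabled": true,
"rpc-password": "[plain-text password]",
"rpc-port": 9091,
"rpc-url": "/transmission/",
"rpc-username": "transmission",
"rpc-whitelist": "127.0.0.1,192.168.1.*",
"rpc-whitelist-enabled": true,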
To access the web interface, type:
[machine address]:[rpc-port]/transmission/web/
into the address bar of a web browser. You should then be asked for your RPC credentials.
To get it to work reliably, I found that I had to give my Pi a static IP address.
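On Raspbian Buster, that's done in /etc/dhcpcd.conf. Something like the following, adjusted for your own interface and network (the addresses here are examples):
interface eth0
static ip_address=192.168.1.10/24
static routers=192.168.1.1
static domain_name_servers=192.168.1.1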
Video streaming consumes a substantial part of my download data. Fortunately, many streams can be downloaded. Scheduling downloads during off-peak hours saves on peak data.
[Edit 18 February 2023]
The previously recommended tool youtube-dl is evidently no longer maintained. Its successor is yt-dlp. It's a command line tool, so it can be automated through at.
Video streaming sites don't particularly like people downloading streams, so there's a constant arms race between them and the creators of downloading software. Sadly, anything in the Raspbian repositories is likely to be ancient, which means it probably won't work. The best course is to install directly from the project's own releases. Installed this way, yt-dlp will not update with the rest of your system, so along with your regular system updates, run:
sudo /usr/local/bin/yt-dlp -U
There are many ways to install yt-dlp. I chose:
sudo wget https://github.com/yt-dlp/yt-dlp/releases/latest/download/yt-dlp -O /usr/local/bin/yt-dlp
sudo chmod a+rx /usr/local/bin/yt-dlp
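To confirm that the install worked:
/usr/local/bin/yt-dlp --version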
[End edit]
yt-dlp calls ffmpeg for some operations:
sudo apt-get install ffmpeg
The method for finding video URLs varies between sites. If it's possible at all, Google will generally point you in the right direction.
This is an example of a line from one of my at jobs:
/usr/local/bin/yt-dlp -o "[output file].mp4" --restrict-filenames -i -c -R 10 --skip-unavailable-fragments --fragment-retries 10 --all-subs -w --recode-video mp4 --postprocessor-args "-codec copy" --keep-video "https://[full URL]"