What to do when your Linux web server says it's out of space--but it isn't (inodes and linux headers, oh my!)
Posted on January 18, 2017 | By Matt Stauffer
Warning: This post is over a year old. I don't always update old posts with new information, so some of this information may be out of date.
The story (skip if you're stressing)
The default configuration for many Linux server setups—including that for Laravel Forge-created servers—leaves a lot of old Linux headers sitting around every time your system downloads upgrades. Folders like
linux-headers-3.13.0-53-generic (3.13.0-53.89), just full of hundreds and thousands of files, slowly taking over your server.
Normally this is no problem. The files are tiny. The server I'm working on right now has 30GB of disk space and 20GB free.
But this morning I started getting a series of tweets about my site being down. Thankfully, this isn't the first time I've hit this error, so it was an easy fix. But still, these tweets are no fun:
@stauffermatt heads up! your blog fails to write cache file. Permissions or full disk issue, maybe?— Damiano Venturin (@damko) January 18, 2017
These and a dozen more. Ouch. But I'm not out of disk space. What's going on?
Turns out, there's something most folks never run into: your server doesn't just have limited space; it also has a limited number of "inodes", which are essentially the objects that represent a file or a directory. Most people never run into this limit, because it's an absurdly high number. But there's something fun about the Linux headers I mentioned before: while they're tiny, there are thousands and thousands and thousands of them.
Is this your problem?
Here's how you know this is your problem: you're constantly seeing errors on your server that the server is out of disk space and can't do simple things like tab-autocomplete your typing, but when you check, you have plenty of space:
$ df -h
Filesystem      Size  Used Avail Use% Mounted on
/dev/vda1        30G  3.4G   25G  12% /
What's your next step to verify this is really your problem? Check your inodes:
$ df -i
Filesystem      Inodes   IUsed IFree IUse% Mounted on
/dev/vda1      1966080 1966080     0  100% /
If you see an IUse% at or near 100%, then this is indeed your problem: you have too many files and folders (inodes) on your machine.
There are two ways to fix this. First, you could manually look through your whole server to try to find the offending directories and figure out where they're coming from, and then manage them. Here's a great article describing how to do that, and you should try this option if the second option doesn't work.
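If you do go the manual route, a quick way to spot inode-heavy directories is just to count the entries under each one. Here's a generic sketch (the starting path /usr/src is only an example; point it at whatever directory you suspect):

```shell
# Count inodes (files + directories) under each subdirectory,
# then show the ten heaviest. Adjust the starting path as needed.
for d in /usr/src/*/; do
  printf '%8d  %s\n' "$(find "$d" 2>/dev/null | wc -l)" "$d"
done | sort -rn | head -10
```

The heaviest directories bubble to the top, which tells you where to dig next.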
Second, you could take my word that if you're on my blog, there's a really good chance you're hitting this error because
apt-get doesn't auto-remove old, unused packages, and this issue is likely happening because of extra, unused Linux header packages.
If your system lets you run this command, you're good to go:
sudo apt-get autoremove -y
This will tell
apt-get to remove anything it has installed that's currently not in use. That means all those old Linux headers, plus plenty of other no-longer-needed dependencies.
However, you might not be able to run this command—because
apt-get needs to be able to write to your filesystem in order to do its work. If you get an error about "not enough drive space" here, don't fret. It's still pretty likely your issue is with those Linux headers, so let's go find them.
Manually removing the Linux headers
$ uname -r
3.13.0-43-generic
The output of the
uname -r command shows which version of the Linux kernel you're currently running. Remember this, because you don't want to delete this one.
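If you'd rather not trust your memory, you can capture the running kernel release in a variable and keep it visible while you work (a small sketch; the variable name is my own):

```shell
# The headers matching the running kernel must survive any cleanup.
current="$(uname -r)"
echo "Do not delete: linux-headers-${current}"
```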
List out the files in
/usr/src, and find a good chunk of headers which aren't yours:
$ cd /usr/src
$ ls -al
drwxr-xr-x 24 root root 4096 Jan 11 06:32 linux-headers-3.13.0-107
drwxr-xr-x  7 root root 4096 Jan 11 06:32 linux-headers-3.13.0-107-generic
...
drwxr-xr-x 24 root root 4096 Jan 11 06:32 linux-headers-3.13.0-43
drwxr-xr-x  7 root root 4096 Jan 11 06:32 linux-headers-3.13.0-43-generic
All we need to do right now is delete a good chunk of them so that we can let
apt-get handle the rest. Here's what I ran (be cautious; this is running
sudo rm -rf on system files. Screw this up and you tank your server.)
I noticed that I have a bunch of headers that begin with
linux-headers-3.13.0-9, so I'll delete all of them:
$ cd /usr/src
$ sudo rm -rf linux-headers-3.13.0-9*
Good. We just dumped thousands and thousands of files, and we can now rely on
sudo apt-get autoremove -y to clean up the rest of the system for us. Boom.
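If you'd rather let the package manager identify candidates than glob around in /usr/src, an alternative sketch is to list the installed header packages and filter out the running kernel's. Review the output yourself before purging anything; this is illustrative, not a one-liner to pipe into a removal command:

```shell
# List installed linux-headers packages, excluding the running kernel's.
current="$(uname -r)"
dpkg -l 'linux-headers-*' | awk '/^ii/ {print $2}' | grep -v "$current"
```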
How to make this not happen again
The simplest answer is "just run
sudo apt-get autoremove -y every once in a while".
You can try to automate it, but because it requires
sudo access, it's going to be tough and possibly dangerous. Here's one guy who tried.
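One lower-risk option, if you're on Ubuntu and already run the unattended-upgrades package, is to let it remove unused dependencies for you. I believe the relevant setting (in /etc/apt/apt.conf.d/50unattended-upgrades) looks like the following, but check your distribution's docs before relying on it:

```
// /etc/apt/apt.conf.d/50unattended-upgrades
Unattended-Upgrade::Remove-Unused-Dependencies "true";
```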
Comments? I'm @stauffermatt on Twitter
Tags: linux • laravel forge • inodes • hosting • dev-ops