This is a quick entry: I’m going to share the script I used to remove more than 10 million files on a Linux system.
I had this problem: a directory contained so many files that I couldn’t list them, count them, or even get the size of the directory. Since that directory is written to by some processes on every execution, it was impossible to get any information out of it.
On Stack Exchange I found some advice for this, but nothing worked for me. I tried the rsync trick and find with recursive actions, but the server either took far too long to list the files or showed an error.
So I had to create this script, which deletes in fixed-size batches (10,000 files at a time):
#!/bin/bash
TARGET_PATH=/path/to/dir

cd "$TARGET_PATH" || exit 1

# ls -U lists entries without sorting, which is what makes a huge
# directory listable at all; loop while any entry remains
while [ "$(ls -U | head -1)" ]
do
    # delete a batch of 10000 entries, reading names line by line
    # so filenames with spaces are handled correctly
    ls -U | head -10000 | while IFS= read -r f
    do
        rm -rf -- "$f"
    done
    echo "executed deletion on: $(date)"
done