Linux: find files older than a given date

Linux: using find to locate files older than <date>

find has good support for finding files modified less than X days ago, but how can I use find to locate all files modified after a certain date? I can't find anything in the find man page to do this, only how to compare against another file's time or to check for differences between created time and now. Is making a file with the desired time and comparing against that the only way to do this?

The command is part of a backup script that grabs everything in /etc that was changed post-installation for our nightly backups.

Nine years old, and I just noticed while moderating a new answer: the title and the body of this question do not say the same thing. The title asks for 'files older than <date>', but the body asks for files 'modified after a certain date'. I interpret 'after' as newer than a specific date, not older.

10 Answers

If your find has only '-newer file', then you can use this workaround:

# create 'some_file' having a creation date of 16 Mar 2010:
touch -t 201003160120 some_file
# find all files created after this date
find . -newer some_file
 -t STAMP use [[CC]YY]MMDDhhmm[.ss] instead of current time 

Assuming that your touch has this option (mine is touch 5.97).

If you're looking for the -older counterpart, which does not exist: just negate the expression with find . ! -newer some_file ! -name some_file . The second condition is needed so that some_file itself is excluded, if you really want only files older than the specified some_file.
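Putting the two pieces together, a minimal sketch of the "older than" variant (the timestamp is the one from the example above; substitute your own cutoff date):

# reference file carrying the cutoff date (16 Mar 2010, 01:20)
touch -t 201003160120 some_file
# files modified before that date, excluding the reference file itself
find . ! -newer some_file ! -name some_file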

No, you can use a date/time string.

-newerXY reference
Compares the timestamp of the current file with reference. The reference argument is normally the name of a file (and one of its timestamps is used for the comparison) but it may also be a string describing an absolute time. X and Y are placeholders for other letters, and these letters select which time belonging to reference is used for the comparison.

a   The access time of the file reference
B   The birth time of the file reference
c   The inode status change time of reference
m   The modification time of the file reference
t   reference is interpreted directly as a time
find -newermt "mar 03, 2010" -ls
find -newermt yesterday -ls
find -newermt "mar 03, 2010 09:00" -not -newermt "mar 11, 2010" -ls
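For the "older than" case this question actually asks about, the same test can simply be negated; a small sketch (the date is arbitrary):

# files whose modification time is not newer than 16 Mar 2010, i.e. older than that date
find . -type f ! -newermt "mar 16, 2010" -ls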


Find files older than X days in bash and delete

I have a directory with a few TB of files. I'd like to delete every file in it that is older than 14 days. I thought I would use find . -mtime +13 -delete . To make sure the command works as expected I ran find . -mtime +13 -exec /bin/ls -lh '{}' \; | grep "<today's date>" . The latter should return nothing, since files that were created/modified today should not be found by find using -mtime +13 . To my surprise, however, find just spewed out a list of all the files modified/created today!


See the -daystart option of find. Your find counts exact periods of 24 hours backwards from now, leaving out files that might be, say, 24*13 hours minus 1 minute old, and your other find will then pick those up.

Just figured it out! The reason is ls: find matches a directory with mtime +13, and ls simply lists all of its contents, no matter what mtime the files inside have (facepalm!).

Always test your find command first by replacing "-delete" with "-print". Note that the result list may also include the current directory (.), which may or may not be what you want.
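A minimal illustration of that advice, assuming the files live under /path/to/dir (a placeholder): restrict the match to regular files and do a dry run with -print before switching to -delete.

# dry run: list what would be deleted (only regular files, not directories)
find /path/to/dir -type f -mtime +13 -print
# once the list looks right, actually delete
find /path/to/dir -type f -mtime +13 -delete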

3 Answers

find your/folder -type f -mtime +13 -exec rm {} \; 

Doesn’t work for filenames containing spaces. Either (GNU specific) find -delete or find -print0 | xargs -0 rm

@grebneke: can you back up your statement with examples or facts? find's {} is well known to be safe regarding spaces and funny symbols in file names.

$ find ./folder_name/* -type f -mtime +13 -print | xargs rm -rf 

The -r switch is useless here. Moreover, you'll run into problems if you have filenames containing spaces or other funny symbols. Use -print0 and xargs -0 if your utilities support them; otherwise, use @Mindx's answer. Or, if your find supports it, use find's -delete action, like so: find ./folder_name -type f -mtime +13 -delete .
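For reference, a sketch of the -print0 / -0 variant mentioned above (folder name as in the answer):

# NUL-separated output handles spaces and other unusual characters in file names
find ./folder_name -type f -mtime +13 -print0 | xargs -0 rm -f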

While this code snippet may solve the question, including an explanation really helps to improve the quality of your post. Remember that you are answering the question for readers in the future, and those people might not know the reasons for your code suggestion.

Using the asterisk in find ./folder_name/* is a mistake, because the shell will expand '*' into every item in that folder. When the folder holds an extremely large number of items (files or directories), the expansion will exceed the maximum number of arguments or the maximum command-line length. Better to drop the '*': find ./folder_name does the same thing without any shell expansion. If you need to match specific names, use the -name 'some*thing' option and let find handle the expansion internally.
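To illustrate the difference described in that comment (folder and pattern names are placeholders):

# the shell expands the glob first; this can fail with "Argument list too long" on huge directories
find ./folder_name/* -type f -mtime +13
# let find do the traversal and the name matching itself
find ./folder_name -type f -mtime +13 -name 'some*thing'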


Find files older than X days excluding some other files

I'm trying to write a shell script, for Linux and Solaris, that finds some specific files older than X days and then deletes them. The trick is that during this process there are a couple of files that must not be deleted. For example, from the following list of files I need to delete *.zip and keep *.log and *.something.*:
1.zip
2.zip
3.log
prefix.something.suffix

Finding the files and feeding them to rm was easy, but I'm having difficulty excluding the files from the deletion list.

@Grove, I would create a script to which I would feed the result of find, and in that script I would keep a list of files to be excluded. Very simple, something like: if the file is not in the list, then rm -rf it; else skip it.

The problem is that the files to be excluded are not the same every day. To be more specific, they are also logfiles generated by certain systems.


@Grove, I still don't see what the problem is. You want to exclude files that end with .log and .something (whatever this might be): use find to locate those files and save them to an array, then do a find of all files and compare the files in the array with the files find locates when searching for all files.

You're right, I was missing your point :) However, it seems that I can do something more elegant from within find itself: find -L path -type f \( -name '*.log' \) -a ! \( -name '*.zip' -o -name '*something*' \) -mtime +3. Thanks nonetheless.

3 Answers 3

Experimenting around, I discovered that one can combine multiple complex expressions grouped with logical operators, like this:

find -L path -type f \( -name '*.log' \) -a ! \( -name '*.zip' -o -name '*something*' \) -mtime +3 
find /appl/ftp -type f -mtime +30 | grep -vf [exclude_file] | xargs rm -rf

find /appl/ftp -type f -mtime +30 | grep -v [exclude_file] | xargs rm -rf ; no need to add the "f" to grep.
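As a sketch of how the grep -vf approach fits together, assuming an exclude file named exclude.txt with one pattern per line (both paths are placeholders):

# exclude.txt might contain, e.g.:
#   important.log
#   prefix.something
# list candidates, drop anything matching a pattern in exclude.txt, remove the rest
find /appl/ftp -type f -mtime +30 | grep -vf exclude.txt | xargs rm -f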

I needed a way to provide a hard-coded list of files to keep, while removing everything else that was older than 30 days. Here is a little script that removes all files older than 30 days, except files that are listed in the [exclude_file].

EXCL_FILES=`/bin/cat [exclude_file]`
RM_FILES=`/usr/bin/find [path] -type f -mtime +30`
for I in $RM_FILES; do
    SKIP=0
    # skip removal if the file name matches an entry from the exclude list
    for J in $EXCL_FILES; do
        echo "$I" | grep -q "$J" && SKIP=1
    done
    if [[ $SKIP == 0 ]]; then
        /bin/rm "$I"
        if [[ $? != 0 ]]; then
            echo "PROBLEM: Could not remove $I"
            exit 1
        fi
    fi
done


How to find and delete files older than specific days in unix?

I have got one folder for logs with 7 folders in it. Those seven folders have subfolders in them, and those subfolders have subfolders too. I want to delete all the files older than 15 days in all folders, including subfolders, without touching the folder structure; that means only files.

mahesh@inl00720:/var/dtpdev/tmp/ > ls
A1 A2 A3 A4 A5 A6 A7
mahesh@inl00720:/var/dtpdev/tmp/A1/ > ls
B1 B2 B3 B4 file1.txt file2.csv

5 Answers

You could start by saying find /var/dtpdev/tmp/ -type f -mtime +15 . This will find all files older than 15 days and print their names. Optionally, you can specify -print at the end of the command, but that is the default action. It is advisable to run the above command first, to see what files are selected.

After you verify that the find command is listing the files that you want to delete (and no others), you can add an «action» to delete the files. The typical actions to do this are:

    -exec rm -f {} \; (or, equivalently, -exec rm -f {} ';' )
    This will run rm -f once per file; e.g.,

rm -f /var/dtpdev/tmp/A1/B1; rm -f /var/dtpdev/tmp/A1/B2; rm -f /var/dtpdev/tmp/A1/B3; … 

    -exec rm -f {} +
    This will run rm -f on batches of files at once; e.g.,

rm -f /var/dtpdev/tmp/A1/B1 /var/dtpdev/tmp/A1/B2 /var/dtpdev/tmp/A1/B3 … 

So, if you use option 2, the whole command would be:

find /var/dtpdev/tmp/ -type f -mtime +15 -exec rm -f {} + 
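For comparison, the equivalent command using option 1, which runs rm once for every file found, would be:

find /var/dtpdev/tmp/ -type f -mtime +15 -exec rm -f {} \;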


How to find and delete files older than x minutes in Linux/Unix

Carlos Delgado

Learn how to search for files whose last modification time is older than n minutes using bash in Linux.


For the latest project I've been working on (removeimagebg.io), I had to implement a feature where, basically, every uploaded file on the server should be deleted once 10 minutes have passed between its upload time and the current date and time. This is basically because we can't afford too much storage, and keeping the files could become trouble sooner or later due to privacy and all the related stuff. So the solution is to program a cronjob that runs every 15 minutes, searches for files where 10 minutes have passed since the upload time, and deletes them. That simple.

In this article, I will share with you a very simple command to search for files older than x minutes from the current time and delete them using bash.

Finding the files

Instead of working with the programming language of the project, I decided to simply write a very small script that implements the mentioned algorithm, so I first had to research how to find the files that are older than 10 minutes from the current time. To do this, I simply used the find command:

find ./your-directory -daystart -maxdepth 1 -mmin +10 -type f

The command is described as follows:

  • ./your-directory : the absolute or relative directory where the search should be executed. Personally, I recommend using absolute paths; it will prevent possible headaches in the future.
  • -daystart : measure times from the beginning of today.
  • -maxdepth 1 : limiting search to the specific directory given as first argument. You can remove this if you have subdirectories where the search should be executed as well.
  • -mmin +<minutes> : the -mmin option matches files/directories by their last modification time in minutes (replace <minutes> with an integer). In our case, we want to search for files older than 10 minutes. If you need to search for files older than 20 minutes, you would simply use -mmin +20 .
  • -type f : Limit search results to files.

It's safe to run the previous command, as it will only display the list of files that match the search; they will not be deleted or modified. For example, a possible output would be:

./your-directory/file1.xd
./your-directory/file2.xd
./your-directory/file87.xd
./your-directory/file12.xd

Filtering files

If you need to filter the files by extension, filename or something like that, don’t forget that you can add a filter making use of the -iname parameter and use the asterisk as a wildcard placeholder:

# Search for jpeg files older than 10 minutes
find ./your-directory -daystart -maxdepth 1 -mmin +10 -type f -iname "*.jpeg"

Deleting the files

Now, as I mentioned, the last step of my script was simply to remove the files that match the search. Fortunately, the most problematic part was already done. To remove the results of your search, all you need to do is add the -delete argument to the search command:

# WARNING: THIS WILL DELETE ALL THE FILES FROM THE GIVEN DIRECTORY
# WHOSE MODIFICATION DATE IS OLDER THAN 10 MINUTES FROM THE CURRENT TIME
find ./your-directory -daystart -maxdepth 1 -mmin +10 -type f -delete
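To tie this back to the cronjob mentioned at the start, a crontab entry along these lines would run the cleanup every 15 minutes (the upload directory path is a placeholder):

# crontab -e entry: run the cleanup every 15 minutes
*/15 * * * * find /path/to/uploads -daystart -maxdepth 1 -mmin +10 -type f -delete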

