How to find the most recently created file in the current directory on a Linux machine?
How do I find the last created file in the current directory on a Linux machine? Note: I don’t know the mtime.
4 Answers
A solution that is safe for files with spaces in the filename: with -print0, the strings are NUL-terminated.
$ touch "file with spaces" $ find . -maxdepth 1 -type f -print0 | xargs -0r ls -ltr | tail -1 -rw-rw-r-- 1 jris jris 0 jun 3 15:35 ./file with spaces
-p adds a trailing / to directories and then grep removes them.
This gets the most recently changed file. If you really need the creation time: ext4 does store it, see unix.stackexchange.com/a/50184/8250
Why are you even suggesting xargs? The other answers, piping ls into head or tail, handle filenames with spaces, and your xargs solution doesn’t handle filenames with newlines.
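For what it’s worth, filenames with newlines can be handled as well by keeping the records NUL-terminated end to end; a sketch assuming a GNU userland where sort, tail and cut all support -z:

# NUL-terminated "mtime path" records, sorted numerically; keep the newest, strip the timestamp
find . -maxdepth 1 -type f -printf '%T@ %p\0' | sort -z -n | tail -z -n 1 | cut -z -d' ' -f2-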
Linux doesn’t store a timestamp for the birth of a file, but if no other files have been changed in the directory since its creation, you can sort the files by their modification time and return the first.
(1) You don’t have to say -1; ls automatically goes into one-file-per-line mode when the output is redirected. (2) You do need to say -a, in case the file’s name begins with a period.
Thanks, changed. Be warned, though, that with -a the result might be . or .., which could be trouble for certain scripts.
@Pablo, @Vortico: Good point; the user probably would want to exclude . and .. (Then again, since the question says “last created file”, the right answer might be ls -atl | grep '^-' | head -1.)
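Putting those comments together, a minimal sketch that includes dotfiles but skips . and ..:

ls -At | head -n 1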
Strictly speaking, the creation time is stored by certain file systems but the kernel API does not provide a call for it.
If you are on an ext filesystem, you can use debugfs to get the creation date of an inode. So, you could collect the inodes for each file in the current directory and then sort by creation time:
#!/usr/bin/env bash

## This will hold the newest timestamp
newestT=0
## Get the partition we are running on
fs=$(df --output=source "$@" | tail -1)
## Iterate through the files in the directory given as a target
for file in "$@"/*; do
    ## Only process files
    if [ -f "$file" ]; then
        ## Get this file's inode number
        inode=$(ls -i "$file" | awk '{print $1}')
        ## Get its creation time from debugfs (the inode number goes in literal angle brackets)
        crtime=$(sudo debugfs -R "stat <$inode>" "$fs" 2>/dev/null | grep -oP 'crtime.*-- \K.*')
        ## Convert it to a Unix timestamp
        timestamp=$(date -d "$crtime" +%s)
        ## Is this newer than the newest so far?
        if [[ $timestamp -gt $newestT ]]; then
            newestT=$timestamp
            newest="$file"
        fi
    fi
done
## Print the newest file
echo "$newest"
Save the script above as ~/bin/get_newest.sh, make it executable (chmod 744 get_newest.sh) and run it like this:
~/bin/get_newest.sh /target/directory
NOTES
- Unlike the other answers, this one will actually return the newest file in terms of its creation date, not the one that was modified most recently.
- It will only work on ext4 (perhaps 3, not sure) filesystems.
- It can deal with any file names; spaces, newlines, etc. are not a problem.
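For a one-off check of a single file, the same debugfs trick looks roughly like this (the file name is a placeholder; root privileges and an ext filesystem are assumed):

f="somefile.txt"                           ## placeholder file to inspect
fs=$(df --output=source "$f" | tail -1)    ## the partition the file lives on
## ask debugfs for the inode's stat block; the crtime line holds the creation time
sudo debugfs -R "stat <$(stat -c %i "$f")>" "$fs" 2>/dev/null | grep crtime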
How do I get the latest file from a list of files in a particular directory? [closed]
I need to get the latest file from a list of files in a particular directory. The first time the script runs, it copies all the files in that directory to another directory. On subsequent runs it should copy only the newest files from the same directory into the other directory. On the first run I capture the last file’s creation date in a variable; when the script runs again, files newer than that date need to be copied to the other directory. Could anyone please help me get the latest files?
You should clarify your intent. If all you want to do is sync the two folders, then the steps you’re asking for don’t represent the best approach.
4 Answers
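Presumably the answer’s pipeline was something like:

ls -tu | head -n 1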
The first command lists all the files ordered by access time; the second selects the first line.
In zsh, the latest file in the current directory is
a=(*(om[1]))
latest_file=$a[1]
cp -p $latest_file /other/directory
(Glob qualifiers rock, but unfortunately the result is still a list even if you only request one element, hence the temporary array a .)
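If you only need the copy and not the variable, the glob qualifier can also be passed to cp directly (a sketch):

cp -p ./*(om[1]) /other/directory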
In other shells, if you know that your file names cannot contain newlines or unprintable characters, you can use ls:
latest_file=$(ls -t | head -n 1)
cp -p "$latest_file" /other/directory
However, it seems that what you’re really asking for is the list of files that are newer than a particular file. For this, you can use find’s -cnewer test to detect files whose content or metadata has changed more recently than a given reference file. Note that, the way I invoke it here, find also collects files in subdirectories.
find . -cnewer "$previous_latest_files" -exec sh -c 'cp -p "$@" "$0"' /other/directory <> +
The problem remains to detect the previous latest file… In fact, what matters is the date of the copy. So create a marker file when you do the copy.
find . -cnewer marker_file -exec sh -c 'cp -p "$@" "$0"' /other/directory {} +
touch marker_file
If the files remain in the target directory, then you can simply synchronize the two directories. Note that if you delete some old files from the target directory but they remain in the source directory, they’ll be copied again with this method.
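A minimal sketch of that synchronization, assuming rsync is available and using placeholder directory names (the trailing slashes make rsync copy the contents rather than the directories themselves):

rsync -a /source/directory/ /other/directory/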
Find the latest file by modified date
If I want to find the latest file (by mtime) in a (big) directory containing subdirectories, how would I do it? Lots of posts I’ve found suggest some variation of ls -lt | head (amusingly, many suggest ls -ltr | tail, which is the same but less efficient), which is fine unless you have subdirectories (I do). Then again, you could
find . -type f -exec ls -lt {} + | head
which will definitely do the trick for as many files as can be specified by one command; i.e. if you have a big directory, -exec … + will issue separate commands, so each group will be sorted by ls within itself but not across the total set, and head will therefore pick up only the latest entry of the first batch. Any answers?
6 Answers
You do not need to resort to external commands (such as ls) because find can do all you need through the -printf action:
find /path -printf '%T+ %p\n' | sort -r | head
Yeah, I came up with find . -type f -exec stat --format=%y {} + | sort -r | head -n1, but your solution is far cleaner!
You can also cull the output of head to include a certain number of lines. I only needed the first line, so I used head -n 1
@qwr wrote “Append | cut -d' ' -f2 to get the filename only”. Thanks! Although it is better to append | cut -d' ' -f2- to avoid problems with filenames that contain spaces.
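Putting those suggestions together, the whole pipeline becomes:

find /path -printf '%T+ %p\n' | sort -r | head -n 1 | cut -d' ' -f2-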
I had a similar problem today, but I attacked it without find . I needed something short I could run over ssh to return the most recently edited file in my home directory. This is roughly what I came up with:
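Presumably it was something like:

ls -tp | grep -v '/$' | head -1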
The -p option to ls adds a trailing slash to directories, the grep -v removes lines ending in a slash (i.e., all directories), and the head -1 limits the output to a single file.
This is much less verbose than using find if all you want to return is the file name.
On my system this is faster than the printf approach, though I don’t understand why:
find /path -type f -exec stat -c "%y %n" {} + | sort -r | head
EDIT: I guess this post is not as ‘not particularly useful’ as I thought it was. This is a really fast solution that just keeps track of the most recently modified file (instead of sorting the entire list of files). Spread over multiple lines for clarity, it looks as follows:
find . -type f -printf '%T@ %p\n' | awk '
    BEGIN {
        mostrecenttime = 0;
        mostrecentline = "nothing";
    }
    {
        if ($1 > mostrecenttime) {
            mostrecenttime = $1;
            mostrecentline = $0;
        }
    }
    END {
        print mostrecentline;
    }' | cut -f2- -d ' '
Not a particularly useful post but since ‘arrange’ was discussing speed, I thought I’d share this.
arrange’s and enzotib’s solutions involve listing all files inside the directory with their mtimes and then sorting. As you know, sorting is not necessary to find the maximum: finding the maximum can be done in linear time, whereas sorting takes n log(n) time [I know the difference isn’t much, but still ;)]. I can’t think of a neat way of implementing this. [EDIT: A neat (albeit dirty-looking) and fast implementation provided above.]
Next best thing: to find the most recently edited file in a directory, recursively find the most recently edited file in each level-1 subdirectory. Let this file represent the subdirectory. Now sort the level-1 files along with the representatives of the level-1 subdirectories. If the number of level-1 files and subdirectories of each directory is nearly constant, this process should scale linearly with the total number of files.
This is what I came up with to implement this:
findrecent() {
    { find "$1" -maxdepth 1 -type f -exec stat -c "%y %n" {} + | sort -r | head -1 &&
      find "$1" -mindepth 1 -maxdepth 1 -type d -exec findrecent {} \; ; } | sort -r | head -1;
}
findrecent .
I ran this and got a bunch of find: findrecent: No such file or directory errors. Reason: find’s -exec does not run in the current shell, so it cannot see shell functions. I tried defining findrecent in .bashrc and .xsessionrc, but these didn’t help [I’d appreciate help here]. In the end I resorted to putting
#!/bin/bash
{ find "$1" -maxdepth 1 -type f -exec stat -c "%y %n" {} + | sort -r | head -1 &&
  find "$1" -mindepth 1 -maxdepth 1 -type d -exec findrecent {} \; ; } | sort -r | head -1;
in a script called findrecent in my PATH and then running it.
I ran this, kept waiting and waiting with no output. Just to be sure I wasn’t dealing with any infinite loops I modified the file to
#!/bin/bash
echo "$1" >&2
{ find "$1" -maxdepth 1 -type f -exec stat -c "%y %n" {} + | sort -r | head -1 &&
  find "$1" -mindepth 1 -maxdepth 1 -type d -exec findrecent {} \; ; } | sort -r | head -1;
and tried again. It did work, but it took 1 minute 35 seconds on my home folder, while arrange’s and enzotib’s solutions took 1.69 and 1.95 seconds respectively!
So much for O(n)’s superiority over O(n log n)! Damn you, function call overhead! [Or rather, script call overhead.]
But this script does scale better than the earlier solutions and I bet it’ll run faster than them on google’s memory bank ;D
Linux find command, find 10 latest files recursively regardless of time span
This output is OK, but it doesn’t work well if I use a wider time span. (Notice I use -ctime and not -mtime, because some uploaded files were modified a few years ago.) The problem is that files can be uploaded once a month or once a year, and I still need to get the 10 latest files regardless of time span. If that can’t be done: does tail only limit the output, or does it somehow fetch just the specified number without a huge performance impact on a large number of files? Using a command from one answer on SO, I was able to get the files, but some files were missing.
find . -type f -printf '%T@ %p\n' | sort -n | tail -10 | cut -f2- -d" "
./Mobilni Telefoni/11. Samsung/1. FLASH FILES/1. SRPSKI HRVATSKI JEZICI/E/E2330/E2330_OXFKE2.rar
./Mobilni Telefoni/11. Samsung/1. FLASH FILES/1. SRPSKI HRVATSKI JEZICI/E/E2330/FlashTool_E2_R6.zip
./Mobilni Telefoni/11. Samsung/1. FLASH FILES/1. SRPSKI HRVATSKI JEZICI/E/E210/E210_XFGH2.rar
./Mobilni Telefoni/05. iPhone/07. iFaith/iFaith-v1.4.1_windows-final.zip
./Mobilni Telefoni/05. iPhone/09. iPhone Browser/SetupiPhoneBrowser.1.93.exe
./Mobilni Telefoni/05. iPhone/10. iPhone_PC_Suite/iPhone_PC_Suite_Eng_v0.2.1.rar
./Mobilni Telefoni/05. iPhone/10. iPhone_PC_Suite/iPhone_PC_Suite_Ok.rar
./test
./Mobilni Telefoni/11. Samsung/1. FLASH FILES/1. SRPSKI HRVATSKI JEZICI/E/E2152/E2152_XXJH4_OXFJI2.zip.filepart
./GPS Navigacije/01. Garmin/03. Garmin Other/test.txt
The file garmin_kgen_15.exe is missing because it was created in 2008, even though it was uploaded within the last 24 hours.
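Since the upload time shows up in the ctime rather than the mtime here, sorting on the change time should catch such files; a sketch using find’s %C@ (ctime in seconds since the epoch):

find . -type f -printf '%C@ %p\n' | sort -n | tail -10 | cut -f2- -d' '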