how to find first files in a directory and combine them as a single file?
Post 302663525 by max_hammer on Thursday 28th of June 2012 08:04:55 AM
Try this:

Code:
  
ls -ltr /root/admin/files/file* | head -1 | awk -F "/"  '{print $NF}'

Here I have set the field separator to "/" with awk -F "/", and $NF means the last field, i.e. the bare file name with the leading path stripped off. The -tr flags make ls sort by modification time in reverse, oldest first, so head -1 picks the first file to arrive. For example, if that file is /root/admin/files/file2, the pipeline prints just file2. (The awk step does the same job as the basename command.)
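
For the other half of the question, combining those files into a single file in age order, here is a minimal sketch in the same spirit; the directory /root/admin/files and the output name combined.txt are only placeholders carried over from the example above:

Code:
  
#!/bin/sh
# ls -tr sorts by modification time, oldest first,
# so the files are appended in the order they arrived.
for f in $(ls -tr /root/admin/files/file*); do
    cat "$f" >> /root/admin/files/combined.txt
done

Note this relies on word-splitting the ls output, which is fine for simple names like file1, file2, ... but breaks on file names containing whitespace.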

Last edited by max_hammer; 06-28-2012 at 09:15 AM..