Shell Programming and Scripting: Copying files with a specific pattern. Post 302902371 by RudiC, Tuesday 20th of May 2014, 08:06:22 AM
Try [0-9]{8}, i.e. an extended regular expression that matches eight consecutive digits.
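A minimal sketch of how that pattern could be used, assuming the goal is to copy files whose names contain an eight-digit sequence (the /backup destination and GNU find's -regextype option are assumptions, not details from the thread):

# Copy files whose names contain eight consecutive digits (e.g. 20140520).
# GNU find assumed; /backup is a hypothetical destination directory.
find . -maxdepth 1 -type f -regextype posix-extended -regex '.*[0-9]{8}.*' \
    -exec cp {} /backup/ \;

# Alternative without find's regex support: filter a listing with grep -E.
ls | grep -E '[0-9]{8}' | while IFS= read -r f; do cp "$f" /backup/; done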
 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

Finding a specific pattern in thousands of files?

Hi All, I want to find a specific pattern in approximately 400000 files on the Solaris platform. It is too heavy for me to grep that pattern in each file individually. Can anybody suggest a way to search for a specific (alphanumeric) pattern across these files? Please note that... (6 Replies)
Discussion started by: aarora_98
6 Replies
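A hedged sketch of one common answer to the question above, searching a large tree without building an oversized command line (the path and pattern are placeholders, and a find that supports -exec ... {} + is assumed):

# List every file under /path/to/files that contains the pattern.
# -exec ... {} + batches file names, so hundreds of thousands of files
# do not overflow the argument list the way a plain "grep PATTERN *" would.
find /path/to/files -type f -exec grep -l 'PATTERN' {} +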

2. UNIX for Dummies Questions & Answers

Copying specific files

I wanted to see if someone could confirm the proper command and format for copying specific files, i.e., ones that contain a certain character string in the file name. I would like to copy all files that contain a numeric sequence in the file name, i.e., "922371". The files are compressed - *.gz. Would... (3 Replies)
Discussion started by: faaron3
3 Replies
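For the thread above, a minimal sketch (the source and destination directories are hypothetical):

# Copy every compressed file whose name contains the sequence 922371.
cp /source/*922371*.gz /destination/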

3. Shell Programming and Scripting

Copying specific files from remote m/c to specific folders

Hi All, I am trying to rsync some of the latest files from a remote m/c to my local Linux box. The folder structure on my remote m/c looks like this: /pub/Nightly/Package/ROLL/WIN /pub/Nightly/Package/SOLL/sol /pub/Nightly/Package/SOLL/linux Each of the folders contains gzip files which on daily... (0 Replies)
Discussion started by: jhoomsharabi
0 Replies
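A sketch of one way the rsync question above could be approached; the remote host name and the local destination are assumptions, while the remote paths come from the post:

# Pull only the .gz files under the remote package tree, preserving the
# ROLL/WIN, SOLL/sol and SOLL/linux subdirectory layout.
rsync -av --include='*/' --include='*.gz' --exclude='*' \
    user@remotehost:/pub/Nightly/Package/ /local/Package/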

4. UNIX for Dummies Questions & Answers

Copying a pattern of files from one directory into another with new pattern names...

Hi, I have to copy a set of files abc* in /path/ to /path1/ as abc*_bkp. The files in /path/ appear as follows: abc1 xyszd abc2 re2345 abcx ... abcxyz I have to copy them (abc* files only) into /path1/ as: abc1_bkp abc2_bkp abcx_bkp ... (6 Replies)
Discussion started by: new_learner
6 Replies
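A minimal sketch for the request above, using the paths given in the post:

# Copy each abc* file from /path to /path1, appending _bkp to its name.
for f in /path/abc*; do
    cp "$f" "/path1/$(basename "$f")_bkp"
done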

5. Shell Programming and Scripting

Copying specific files from one dir to another

Hi Folks, I have a curious case. There is a list of files placed in one directory, updated each month, such as: files.JAN09.csv files.FEB09.csv files.MAR09.csv ..... Now, I need to move specific files; i.e., for this month, I need to move only OCT09, NOV09, DEC09,... (1 Reply)
Discussion started by: Jerald Nathan
1 Replies
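A hedged sketch of the month-specific move described above; the source and destination directories are hypothetical, and the month list would change each run:

# Move only the files for the months of interest.
for m in OCT09 NOV09 DEC09; do
    mv /source/files."$m".csv /destination/
done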

6. Shell Programming and Scripting

Copying columns whose headers match a specific pattern

Hi friends, I have data in a tab-separated file with headers like this: *sml1 *sml3 *smln7 smfk9 smllf56... Which shell command should I use if I want to extract the entire columns whose header names begin with "*"? I want to copy these columns into another file. Thanks, (14 Replies)
Discussion started by: jacks
14 Replies
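One way the column question above could be answered, sketched with awk (input.tsv and output.tsv are placeholder names):

# Keep only the tab-separated columns whose header begins with "*",
# headers included, and write them to a new file.
awk -F'\t' -v OFS='\t' '
    NR == 1 { for (i = 1; i <= NF; i++) if ($i ~ /^\*/) keep[++n] = i }
    { for (j = 1; j <= n; j++) printf "%s%s", $(keep[j]), (j < n ? OFS : "\n") }
' input.tsv > output.tsv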

7. Shell Programming and Scripting

list files with a specific pattern

How can I list files with the following specific criteria? I am trying this: $> ls *.log or $> ls *.log? --> but it only gives me fsaffa.log1, rwerw.log2. How can I get all three files with a simple selection criterion? (5 Replies)
Discussion started by: kchinnam
5 Replies
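A hedged sketch for the listing question above, assuming the goal is to match names ending in .log with or without a trailing digit:

# Plain globs: .log plus .log followed by a single digit.
ls *.log *.log[0-9]

# Or, with bash extended globbing, one pattern covers both cases.
shopt -s extglob
ls *.log?([0-9])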

8. Shell Programming and Scripting

Copying files based on a pattern

Hi All, I need to find and list the last 5 days' files whose names contain "MIM" and copy them to another directory. Please help me with this; there are around 30000 files. Thanks, Murali (7 Replies)
Discussion started by: 969murali@gmail
7 Replies
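A minimal sketch for the request above; /source and /destination are hypothetical:

# Copy files modified within the last 5 days whose names contain MIM.
find /source -type f -name '*MIM*' -mtime -5 -exec cp {} /destination/ \;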

9. Shell Programming and Scripting

Bash: copying lines with a specific character to files with the same name as the copied line.

I am trying to make my script as simple as possible, but I am not sure the approach I am taking is the most efficient or effective it can be. What I am mainly trying to fix is a for loop that removes a string from the specified files; within this loop I am trying to copy the lines... (2 Replies)
Discussion started by: Allie_gastrator
2 Replies
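The post above is truncated, but a rough sketch of the idea it describes might look like this; PATTERN, input.txt and the use of the matching line itself as the output file name are all assumptions:

# Append every line containing PATTERN to a file named after that line.
# This only works when the matching lines are valid file names
# (no slashes, not overly long).
grep 'PATTERN' input.txt | while IFS= read -r line; do
    printf '%s\n' "$line" >> "$line"
done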

10. Shell Programming and Scripting

Copying specific file types to specific folders

I am trying to write a script that cycles through a folder containing many folders; inside each one it is supposed to copy all the .fna.gz files to a folder elsewhere if the file and the respective folder have the same name. for fldr in /home/playground/genomes/* ; do find .... (8 Replies)
Discussion started by: Mr_Keystrokes
8 Replies
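A sketch of one way the loop in the post above could be completed; the destination directory is hypothetical, and the matching rule (file name starts with the folder's name) is an assumption:

# For each genome folder, copy the .fna.gz files whose names match the
# folder's name into a per-folder destination.
for fldr in /home/playground/genomes/*/; do
    name=$(basename "$fldr")
    dest="/home/playground/sorted/$name"   # hypothetical destination
    mkdir -p "$dest"
    find "$fldr" -type f -name "${name}*.fna.gz" -exec cp {} "$dest/" \;
done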