How to zip CSV files matching a specific pattern in a directory using a UNIX shell script?
I have files in a Linux directory. Some of the files are listed below:
Code:
-rw-rw-r--. 1 roots roots 0 Dec 23 02:17 zzz_123_00000_A_1.csv
-rw-rw-r--. 1 roots roots 0 Dec 23 02:18 zzz_121_00000_A_2.csv
-rw-rw-r--. 1 roots roots 0 Dec 23 02:18 zzz_124_00000_A_3.csv
drwxrwxr-x. 2 roots roots 6 Dec 23 02:18 zzz
-rw-rw-r--. 1 roots roots 0 Dec 23 02:54 yyy_123_343434_A_1.csv
-rw-rw-r--. 1 roots roots 0 Dec 23 02:55 yyy_123_343434_A_1.xml
-rw-rw-r--. 1 roots roots 0 Dec 23 02:55 yyy_1254_343434_A_1.csv
-rw-rw-r--. 1 roots roots 0 Dec 23 02:55 yyy_1254_343434_A_1.txt
drwxrwxr-x. 2 roots roots 6 Dec 23 02:56 yyy
The directory also contains other file formats with the same base names, and it might contain subdirectories as well. Those other files and subdirectories should not be included in the zip process.
Once the zip is done, I have to move these CSV files into an archive directory. I need to write a UNIX shell script for this.
Expected output:
zzz_timestamp.zip should have zzz_123_00000_A_1.csv, zzz_121_00000_A_2.csv and zzz_124_00000_A_3.csv
yyy_timestamp.zip should have yyy_123_343434_A_1.csv and yyy_1254_343434_A_1.csv
Please let me know how to implement this task.
Last edited by Don Cragun; 12-23-2016 at 05:55 AM..
Reason: Add CODE and ICODE tags.
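One way to approach this is a POSIX-sh sketch along the lines below. It is an illustration, not a tested drop-in solution: it assumes the `zip` utility is installed, that the CSV files sit in the top level of the source directory (subdirectories and other extensions are skipped automatically, since only regular files matching `*.csv` are considered), and that every CSV name contains an underscore after its prefix. The function name `zip_csv_groups` and the paths in the example are made up for the sketch.

```shell
#!/bin/sh
# Group the regular *.csv files in the top level of a directory by
# prefix (the text before the first '_'), zip each group as
# <prefix>_<timestamp>.zip, then move the zipped csv files into an
# archive directory. Pass absolute paths for both arguments.

zip_csv_groups() {
    src=$1                                 # directory holding the csv files
    archive=$2                             # where the originals go afterwards
    ts=$(date +%Y%m%d%H%M%S)               # timestamp used in the zip names

    mkdir -p "$archive"
    ( cd "$src" || exit 1
      # Distinct prefixes of plain files ending in .csv, this level only;
      # directories (like zzz/ and yyy/) fail the -f test and are skipped.
      for prefix in $(for f in *.csv; do
                          [ -f "$f" ] && printf '%s\n' "${f%%_*}"
                      done | sort -u); do
          zip -q "${prefix}_${ts}.zip" "${prefix}"_*.csv &&
              mv "${prefix}"_*.csv "$archive"/
      done )
}

# Example (hypothetical paths):
# zip_csv_groups /data/incoming /data/incoming/archive
```

The `mv` runs only if `zip` succeeds, so the originals are not archived unless they actually made it into the zip. Files like `yyy_123_343434_A_1.xml` are left untouched because the glob only matches `.csv` names.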