Extracting non multiple files via script - Post 302722649 by emily on Sunday 28th of October 2012 09:40:00 AM
Hi,
Yeah, you are right.
Now I have managed to do what I wanted. Following is the code snippet for that:
Code:
 
for FileNameIndx in "${PATH535[@]}"
do
    if [[ ! -e "dest_path/$FileNameIndx" ]]; then
        # list the root-owned entries, prefix each line with its 3rd "_"-field
        # as a tab-separated sort key, sort numerically on it, then drop the key
        ls -ltr "$FileNameIndx" | grep root | awk -F_ '{print $3,$0}' OFS='\t' | sort -n | cut -f2- >> "${File0}_0"
        # sort by size (field 5, largest first) and keep only the first line
        # per 3rd "_"-field, i.e. the largest file of every duplicate group
        sort -nrk5 < "${File0}_0" | awk -F_ '!x[$3]++' >> "${File0}_1"
        # keep the "vg" entries and prepend the source directory to the file name
        grep -in "vg" "${File0}_1" | awk '{print path string $9}' string="/" path="$FileNameIndx" >> "$FileName"
        echo "$FileNameIndx is copied"
    else
        echo "Check the FileName in ${PATHNAME[*]}"
    fi
    echo "---------------------------------------------------------"
    echo ">>> DataFiles are from : ${PATH535[*]}"
    echo "---------------------------------------------------------"
done

Well, it is a bit lengthy as I am a beginner with scripting, but it works fine for me.
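
To spell out the idea behind the duplicate handling: the intermediate file is sorted largest-first on the size column, and awk then keeps only the first line it sees for each duplicate key, so the biggest file of every group survives. A stripped-down sketch of just that step, on made-up "<size> <filename>" lines (placeholder data and a placeholder key, not my real listing):
Code:
 
# toy input: "<size> <filename>" pairs (made-up data)
printf '%s\n' '100 vgtee_1_ujf.root' '80 vgtee_1_abc.root' '55 vgtree_2_xyz.root' > sizes.txt
# sort largest first, then keep only the first (largest) line per key;
# here the key is the name without its last "_" part, as a stand-in for
# whatever prefix really defines a duplicate in the real file names
sort -nrk1 sizes.txt | awk '{key = $2; sub(/_[^_]*$/, "", key)} !seen[key]++' > biggest.txt
cat biggest.txt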

And about "vgtree_1", I would prefer this entry:
Code:
 100 Oct 27 10:28 vgtee_1_ujf.root

as it has the larger file size.

thanks
emily,

---------- Post updated at 08:40 AM ---------- Previous update was at 06:04 AM ----------

Quote:
Originally Posted by RudiC
So you want to consider chars 1 - 7 of the filename only in order to find "duplicates" (No two or more digit integers possible?), and, if found, use the larger size file name?
I don't see any attempt to use either criterion in your code snippet? BTW, vgtee_1 would be a duplicate as well, wouldn't it?
Hi RudiC,
Yes, you are right: an occurrence of *1* is also a duplicate.
The problem I am facing now is the following: the code snippet that I posted above works fine, but it leaves a blank row at the top, and because of that row I would get wrong results. I wonder if that can be removed somehow.
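Just to make clear what kind of cleanup I mean, something along these lines (placeholder file names, and I am not sure where in the script it would belong) should drop an empty first line:
Code:
 
# delete the first line only if it is blank (placeholder file names)
sed '1{/^[[:space:]]*$/d;}' listing.txt > listing.clean.txt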

can you help me?

Thanks,
Emily
 
