I have a data file:
abc Text Text Text Unique Text
123 Text word Line Unique Text
fgh Text data Line Unique Text
789 Text Text Line Unique Text
543 Text Text Data Unique Text
and a filter file:
123
789
I want to extract from the data file the two records that contain the keys... (1 Reply)
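A minimal awk sketch for this kind of key lookup. The file names `data.txt` and `keys.txt` are assumptions for illustration; the idea is the standard two-file idiom: load the keys on the first pass, then print matching records on the second.

```shell
# Recreate the sample data from the post.
cat > keys.txt <<'EOF'
123
789
EOF
cat > data.txt <<'EOF'
abc Text Text Text Unique Text
123 Text word Line Unique Text
fgh Text data Line Unique Text
789 Text Text Line Unique Text
543 Text Text Data Unique Text
EOF

# NR==FNR is true only while reading the first file (keys.txt):
# store each key, then print any data line whose first field is a known key.
awk 'NR==FNR {keys[$1]; next} $1 in keys' keys.txt data.txt
```

This prints the two records starting with `123` and `789`. Using `$1 in keys` (rather than a regex match) means only exact first-field matches are kept.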
hi all,
I have this file with some user data.
example:
$cat myfile.txt
FName|LName|Gender|Company|Branch|Bday|Salary|Age
aaaa|bbbb|male|cccc|dddd|19900814|15000|20|
eeee|asdg|male|gggg|ksgu|19911216|||
aara|bdbm|male|kkkk|acke|19931018||23|
asad|kfjg|male|kkkc|gkgg|19921213|14000|24|... (4 Replies)
I have file server 1 (filesvr01acess.log) and disc server 1 (discsvr01acess.log) in a Unix box (say the IP address of the box is 10.39.66.81).
Similarly, I have file server 2 (filesvr01acess.log) and disc server 2 (discsvr01acess.log) in another Unix box (say the IP address of the box is 10.39.66.82).
Now my... (1 Reply)
Hi All ,
I have one file like below ,
Owner name = abu2-kxy-m29.hegen.app
Item_id = AX1981, special_id = *NULL*, version = 1
Created = 09/01/2010 12:56:56 (1283389016)
Enddate = 03/31/2011 00:00:00 (1301554800)
From the above file I need to get the output in the below format; I need... (3 Replies)
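The desired output format is cut off in the post, so this is only a hedged sketch of how awk can pull the individual values out of that report; the pipe-delimited output line is purely an assumption for illustration.

```shell
# Recreate the sample report from the post.
cat > item.txt <<'EOF'
Owner name = abu2-kxy-m29.hegen.app
Item_id = AX1981, special_id = *NULL*, version = 1
Created = 09/01/2010 12:56:56 (1283389016)
Enddate = 03/31/2011 00:00:00 (1301554800)
EOF

# Split on " = ", then trim each value at the first "," or " (".
awk -F' = ' '
  /^Owner name/ {owner=$2}
  /^Item_id/    {split($2, a, ",");     id=a[1]}
  /^Created/    {split($2, b, " \\("); created=b[1]}
  /^Enddate/    {split($2, c, " \\("); end=c[1]
                 printf "%s|%s|%s|%s\n", owner, id, created, end}
' item.txt
```

This prints `abu2-kxy-m29.hegen.app|AX1981|09/01/2010 12:56:56|03/31/2011 00:00:00`; the `printf` would be adjusted once the real target format is known.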
Hi,
I have a file with hundreds of records.
There are four fields on each line, separated by semicolons.
Name
Height (meters)
Country
Continent (Africa, Asia, Europe, North America, Oceania, South America, The Poles)
I need to write the command to find and display how many mountains appear... (1 Reply)
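The question is truncated, but assuming it asks how many mountains appear per continent, a one-liner over the semicolon-separated fourth field does it. The sample rows below are invented for illustration; heights are in meters as the post states.

```shell
# Invented sample in the post's format: Name;Height;Country;Continent
cat > mountains.txt <<'EOF'
Everest;8848;Nepal;Asia
K2;8611;Pakistan;Asia
Mont Blanc;4808;France;Europe
Denali;6190;USA;North America
EOF

# -F';' sets the field separator; count records per continent (4th field).
awk -F';' '{count[$4]++} END {for (c in count) print c, count[c]}' mountains.txt
```

The output order of `for (c in count)` is unspecified in awk; pipe through `sort` if a stable order matters.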
I have a .kml file. I want to filter the .kml to get only the tags that contain the numeric codes listed in a text file:
11951
11952
74014
11964
11965
11969
11970
11971
11972
60149
74018
74023
86378
11976
11980
11983
11984
11987 (5 Replies)
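A simple first cut for this is `grep -F -f`, which treats each line of the code list as a fixed string to match. The tiny one-tag-per-line .kml below is an invented stand-in; real .kml tags usually span multiple lines, in which case an XML-aware tool (or awk with a multi-line record separator) is safer.

```shell
# A few of the codes from the post.
cat > codes.txt <<'EOF'
11951
74014
EOF
# Invented, artificially flattened .kml sample.
cat > input.kml <<'EOF'
<Placemark><name>11951</name></Placemark>
<Placemark><name>22222</name></Placemark>
<Placemark><name>74014</name></Placemark>
EOF

# -F: fixed strings, -f: read patterns from a file.
grep -F -f codes.txt input.kml
```

Adding `-w` would guard against a code like `11951` also matching a longer number such as `119510`.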
Hi...
I would like to filter my data file in two different ways.
The 1st way is like this; I will take one example here.
The script should ask like this:
Enter min value in first column
Enter max value in first column
Enter min value in second column
Enter max value in... (5 Replies)
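For the first way, the four prompts can be answered with `read` and the values passed into awk with `-v`. A sketch, with invented sample data; the answers are piped in here so the example runs non-interactively (interactively you would just type them at the prompts).

```shell
# Invented two-column sample.
cat > values.txt <<'EOF'
1 10
5 20
9 30
EOF

# Read min/max for each column, then keep rows inside both ranges.
printf '2\n8\n15\n25\n' | {
  read min1   # Enter min value in first column
  read max1   # Enter max value in first column
  read min2   # Enter min value in second column
  read max2   # Enter max value in second column
  awk -v a="$min1" -v b="$max1" -v c="$min2" -v d="$max2" \
      '$1 >= a && $1 <= b && $2 >= c && $2 <= d' values.txt
}
```

With the answers 2, 8, 15, 25, only the row `5 20` survives. Because the `-v` values look numeric, awk compares them numerically rather than as strings.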
Hello,
I am very new to this; hope you can help.
I am looking into editing a file in Solaris with dynamic columns (length varies), and I need 2 things to be done: the first is to filter the first column and third column from the below file file.txt, and create a new file with the 2 filtered... (8 Replies)
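Variable column widths are not a problem for awk, since it splits on any run of whitespace by default. A minimal sketch, with invented sample data and an assumed output file name:

```shell
# Invented sample with ragged spacing between columns.
cat > file.txt <<'EOF'
alpha   x  10
beta  y    20
EOF

# Print fields 1 and 3 of every line into a new file.
awk '{print $1, $3}' file.txt > filtered.txt
cat filtered.txt
```

This prints `alpha 10` and `beta 20`; on Solaris, `nawk` or `/usr/xpg4/bin/awk` may be needed instead of the old `/usr/bin/awk`.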
Hi Folks,
I have a text file with lots of rows with duplicates in the first column; I want to filter out records based on filter columns in a different filter text file.
Bash scripting is what I need.
Data.txt
Name OrderID Quantity
Sam 123 300
Jay 342 498
Kev 78 2500
Sam 420 50
Vic 10... (3 Replies)
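This is the same two-file awk idiom as the key-lookup threads above, with one extra condition to keep the header row. The filter file name `Filter.txt` and its contents are assumptions for illustration:

```shell
# Assumed filter file: one Name per line.
cat > Filter.txt <<'EOF'
Sam
Kev
EOF
# Data from the post (truncated sample).
cat > Data.txt <<'EOF'
Name OrderID Quantity
Sam 123 300
Jay 342 498
Kev 78 2500
Sam 420 50
EOF

# First pass loads the wanted names; second pass keeps the header line
# (FNR==1 in Data.txt) plus every row whose Name is in the filter file.
awk 'NR==FNR {want[$1]; next} FNR==1 || $1 in want' Filter.txt Data.txt
```

Duplicate names in the first column are fine: both `Sam` rows come through, since every matching row is printed.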
Discussion started by: tech_frk
LEARN ABOUT DEBIAN
grdcut
GRDCUT(l) GRDCUT(l)
NAME
grdcut - Extract a subregion out of a .grd file
SYNOPSIS
grdcut input_file.grd -Goutput_file.grd -Rwest/east/south/north[r] [ -V ]
DESCRIPTION
grdcut will produce a new output_file.grd file which is a subregion of input_file.grd. The subregion is specified with
-Rwest/east/south/north as in other programs; the specified range must not exceed the range of input_file.grd. If in doubt, run grdinfo to
check range. Complementary to grdcut there is grdpaste, which will join together two grdfiles along a common edge.
input_file.grd
this is the input .grd format file.
-Goutput_file.grd
this is the output .grd format file.
-R west, east, south, and north specify the Region of interest. To specify boundaries in degrees and minutes [and seconds], use the
dd:mm[:ss] format. Append r if lower left and upper right map coordinates are given instead of wesn. This defines the subregion to
be cut out.
OPTIONS
-V Selects verbose mode, which will send progress reports to stderr [Default runs "silently"].
EXAMPLES
Suppose you have used surface to grid ship gravity in the region between 148E - 162E and 8N - 32N, and you do not trust the gridding near
the edges, so you want to keep only the area between 150E - 160E and 10N - 30N, then:
grdcut grav_148_162_8_32.grd -Ggrav_150_160_10_30.grd -R150/160/10/30 -V
SEE ALSO
grdpaste(1gmt), grdinfo(1gmt), gmt(1gmt)
1 Jan 2004 GRDCUT(l)