Shell Programming and Scripting: grep and delete 2nd duplicated of txt... - part2
Post 302103498 by Krrishv, Friday 19th of January 2007, 01:44:06 AM
Check out this script:

#!/usr/bin/bash
# "sort -c" only checks whether the input is already sorted; it exits with
# status 1 when it finds lines out of order.
sort -c "$1" 2>/dev/null
if [ $? -eq 1 ]; then
    # Not sorted: sort, drop duplicate lines, and write back to the same file.
    # "sort -o" is safe to use with the input file as the output file, whereas
    # "sort -u $1 | tee $1" risks truncating the file before sort finishes reading it.
    sort -u -o "$1" "$1"
else
    echo "Already sorted"
fi
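
For example, if you save this as dedupe.sh (the script and file names here are just for illustration) and make it executable, you can run it against a file like so:

# make the script executable, then pass the file to clean up as the first argument
chmod +x dedupe.sh
./dedupe.sh myfile.txt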

Let me know if you have any questions. $1 is your input filename.
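
One caveat with this approach: sort -u reorders the lines, and a file that is already sorted is left alone even if it still contains duplicates. If you want to keep the first occurrence of every line in its original position and just delete the later copies, an awk one-liner is a common alternative. A minimal sketch (file.txt is only a placeholder name):

# print each line the first time it is seen; later duplicates are skipped
awk '!seen[$0]++' file.txt > file.txt.tmp && mv file.txt.tmp file.txt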
 
