Search Results

Search: Posts Made By: dhruuv369
20,024
Posted By jim mcnamara
FTP almost always returns 0. The return code is not normally useful.
You have to parse the responses from an ftp server to figure out what is going on. A 3 digit number is first followed by text. ...
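A minimal sketch of that parsing, with a made-up log standing in for real ftp output (the file name and messages are invented; reply codes starting 4 or 5 indicate failure):

```shell
# FTP reply codes: 1xx-3xx are progress/success, 4xx are transient
# failures, 5xx are permanent failures. Scan a captured session log:
log=/tmp/ftp_log.$$
printf '%s\n' \
    '220 Service ready' \
    '230 User logged in' \
    '550 report.txt: No such file or directory' > "$log"

if grep -Eq '^[45][0-9]{2} ' "$log"; then
    echo "FTP error detected"
fi
rm -f "$log"
# prints: FTP error detected
```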
20,024
Posted By Corona688
OK, something like:

ftp <<EOF > /tmp/$$ 2>&1
...
EOF

if [ "$?" -ne 0 ] || grep "No files" /tmp/$$ >/dev/null
then
echo "Error in file transfer"
fi

rm -f /tmp/$$

I suspect...
10,348
Posted By RudiC
I'm afraid I cannot help you, as the required logic escapes me. I don't see the directories' structures, nor the request files / flag names, nor what they trigger, and when. The flags_info.txt...
10,348
Posted By Don Cragun
I wanted specifics. You provided generalities. With what you have given us, the script needs to be something like:
cd somewhere
while [ 1 ]
do for file in somepattern
do if [...
17,212
Posted By Don Cragun
By setting FS and OFS to a double quote character (-F'"' -v OFS='"'), in2nix4life told awk to use the double quote character as the field separator when lines are being read from standard input file...
17,212
Posted By in2nix4life
awk -F'"' -v OFS='"' '{for(i=2;i<NF;i+=2) gsub(",", "", $i)}1'
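A quick demo of how that one-liner behaves (the CSV line is invented):

```shell
# With FS and OFS set to the double quote, the even-numbered fields
# are the quoted sections, so gsub only strips commas inside quotes.
echo '12310,42324564756,"a simple, string",USD' |
    awk -F'"' -v OFS='"' '{for(i=2;i<NF;i+=2) gsub(",", "", $i)}1'
# prints: 12310,42324564756,"a simple string",USD
```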
6,894
Posted By neutronscott
try


var=$(awk 'END{print NR}' file1 file2 file3)
4,421
Posted By ctsgnb
... much more simply:
# sed '1!{s;\([^,]*\)\(\(,[^,]*\)\{5\}\)$;\2;}' fich.csv
column 1,column 2,column 3,column 4,column 5,column 6,column 7,column 8,column 9,column 10
12310,42324564756,"a...
4,421
Posted By disedorgue
Hi, in sed:
$ cat fich.csv
column 1,column 2,column 3,column 4,column 5,column 6,column 7,column 8,column 9,column 10
12310,42324564756,"a simple string with a , comma","string with or, without...
4,421
Posted By ctsgnb
I meanwhile also updated my post; please read #8 and #10.

By the way in fact this should also work (add NR>1 if you need to skip the first line containing the header):
awk -F, '{$(NF-5)=z}1'...
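The command above is cut off; a runnable sketch of the same idea (note I add OFS=, as an assumption, and the data is invented):

```shell
# z is never assigned, so it is the empty string; assigning it to
# $(NF-5) blanks the sixth field from the end and rebuilds $0.
echo 'a,b,c,d,e,f,g,h,i,j' | awk -F, -v OFS=, '{$(NF-5)=z}1'
# prints: a,b,c,d,,f,g,h,i,j
```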
4,421
Posted By Akshay Hegde
This might be lengthy, but you may try it; the second line has unpaired double quotes

$ cat file
column 1,column 2,column 3,column 4,column 5,column 6,column 7,column 8,column 9,column 10...
4,421
Posted By ctsgnb
How was your file generated?

It looks like it is corrupted, since the line
23455,12312255564,"string,, multiple, commas,string with or, without commas,string 2,USD,433,70%,07/15/2013,......
4,421
Posted By RudiC
This might come close to what you want to do (tested only on mawk 1.3.3):
awk 'NR>1 {for (i=2; i<=NF; i+=2) gsub (/,/,"\001", $i)
FS=OFS=","; $0=$0; $5=" "; FS=OFS="\"";
...
4,421
Posted By ctsgnb
As a first step you can, for example, distinguish between commas that are field separators and commas that are inside field values:

If there are multiple commas everywhere in your file, like:...
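One sketch of that first step, marking in-quote commas with a sentinel byte (the data and the choice of \001 as sentinel are my assumptions):

```shell
# Replace commas inside quoted sections with \001 so a later stage
# can tell them apart from separator commas; the sentinel is mapped
# to ';' here only to make it visible.
echo '1,"a,b",2' |
    awk -F'"' -v OFS='"' '{for(i=2;i<NF;i+=2) gsub(",", "\001", $i)}1' |
    tr '\001' ';'
# prints: 1,"a;b",2
```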
20,319
Posted By Corona688
ASCII mode is what your FTP client calls text mode. This is not a huge leap -- especially when I've been harping about how ASCII only works for text, for pages now.

This is the fourth and last...
20,319
Posted By Corona688
Again, I suspect the file was transferred in ASCII mode which corrupted it. This is a common pitfall with FTP. You would never notice this problem transferring raw text, but it will scramble...
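The corruption is easy to demonstrate locally without an FTP server; this simulates the CRLF rewriting an ASCII-mode Unix-to-Windows transfer performs (file names are made up):

```shell
# ASCII mode rewrites line endings in transit; for anything that is
# not plain text, that is corruption. Compare a file against what an
# ASCII-mode transfer would deliver:
orig=/tmp/orig.$$; copy=/tmp/copy.$$
printf 'some payload\n'   > "$orig"
printf 'some payload\r\n' > "$copy"    # LF -> CRLF, as ASCII mode would
cmp -s "$orig" "$copy" || echo "transfer changed the bytes"
rm -f "$orig" "$copy"
# prints: transfer changed the bytes
```

The fix is to issue `binary` in the ftp session before `get` or `put`.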
8,500
Posted By bartus11
nawk '|call nawk
NR==FNR|this condition is true only for the first file - File3.CSV in this case
{x=$0;|assign whole line from File3.CSV to variable "x"
next}|stop processing File3.CSV...
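The annotated command above is truncated; a minimal runnable sketch of the NR==FNR idiom it describes (the file names and the print action are my assumptions, not the original post's):

```shell
# NR==FNR is true only while the first file is being read:
# NR counts records across all files, FNR resets per file.
printf 'HEADER\n' > /tmp/first.$$
printf 'a\nb\n'   > /tmp/second.$$
awk 'NR==FNR {x=$0; next}    # remember the line from the first file
     {print x "," $0}        # prefix it to every line of the second
    ' /tmp/first.$$ /tmp/second.$$
rm -f /tmp/first.$$ /tmp/second.$$
# prints:
# HEADER,a
# HEADER,b
```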
8,500
Posted By Soham
Try the following

var='"something"'
sed "s/^/$var/" File1.CSV

Every month you will have to change the value of var
8,500
Posted By Scrutinizer
Try:
VAR1='"string1 to string2"'
sed "s/^/$VAR1,/" file

or
awk -v v1="$VAR1" '{print v1,$0}' FS=, OFS=, file
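Both variants produce the same output; a quick check on an invented one-line file:

```shell
VAR1='"string1 to string2"'
printf 'x,y\n' > /tmp/file.$$
sed "s/^/$VAR1,/" /tmp/file.$$                              # prepend with sed
awk -v v1="$VAR1" '{print v1,$0}' FS=, OFS=, /tmp/file.$$   # or with awk
rm -f /tmp/file.$$
# both print: "string1 to string2",x,y
```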
Forum: Linux 10-22-2013
18,915
Posted By RudiC
If your awk version allows for a multi-char FS, try
awk -F'","' 'toupper($5)=="STRING 1"' OFS='","' file
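For example, on an invented line in that format, the filter passes records whose fifth quoted field matches regardless of case:

```shell
# Splitting on the three-character sequence "," makes each quoted
# value a clean field (the outermost quotes stay on $1 and $NF).
echo '"v1","v2","v3","v4","String 1","v6"' |
    awk -F'","' 'toupper($5)=="STRING 1"' OFS='","'
# prints the whole line, since toupper("String 1") matches
```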
Forum: Linux 10-22-2013
18,915
Posted By Don Cragun
You have at least two problems here:

First, you are telling awk that your field separator is a comma, but some commas in your input file are not field separators.

Second, you are telling awk to...
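The first problem is easy to see on an invented line:

```shell
# A comma inside quotes is counted as a separator too:
echo '1,"a, b",2' | awk -F, '{print NF}'
# prints: 4 (not the 3 fields a CSV reader would see)
```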
Forum: Linux 10-16-2013
4,550
Posted By blackrageous
If input is in file x.x, then....

cat x.x | sed -e 's/,"/\|/g' | cut -f9 -d\| | tr -d \" | awk -F\- 'BEGIN{MON["JAN"] = "01"; MON["FEB"] = "02"; MON["MAR"] = "03"; MON["APR"] = "04"; MON["MAY"] =...
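The MON array is cut off above; a self-contained sketch of the same month-name mapping, completed for all twelve months (the completion is an assumption, since the original is truncated):

```shell
# Build MON["JAN"]="01" ... MON["DEC"]="12" from a single split,
# then translate a month abbreviation read from input.
echo 'JAN' | awk 'BEGIN {
    split("JAN FEB MAR APR MAY JUN JUL AUG SEP OCT NOV DEC", m)
    for (i in m) MON[m[i]] = sprintf("%02d", i)
}
{ print MON[$1] }'
# prints: 01
```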
Forum: Linux 10-16-2013
4,550
Posted By CarloM
From here (https://www.unix.com/shell-programming-scripting/169150-multiple-delimeters-awk.html#post302564584) and your date code.

$ cat datefile.csv...
17,839
Posted By RudiC
Yoda's proposal works fine if the quoted column is the last one you want to keep. Try this for an arbitrary last column:
awk ' {for (i=2; i<=NF; i+=2) { #...
17,839
Posted By Yoda
awk -F, ' # Set comma as input field separator
NR == 1 { # If NR (total records read) == 1
NF -= 3 #...
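Yoda's snippet is truncated; its core trick, shrinking NF to drop trailing fields, can be sketched on invented data (this relies on gawk/nawk/mawk behavior, since POSIX leaves decreasing NF undefined):

```shell
# Assigning a smaller value to NF rebuilds the record without the
# last three fields, joined by OFS.
echo 'a,b,c,d,e,f' | awk -F, -v OFS=, '{NF -= 3; print}'
# prints: a,b,c
```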
Showing results 1 to 25 of 26

 
Unix & Linux Forums Content Copyright 1993-2022. All Rights Reserved.