01-01-2011
Cut an output
Hi everyone, happy new year
I have the following output:
2503 nteath 20 0 4892 312
2503 nteath 20 0 5872 312
2503 nteath 20 0 6852 312
2503 nteath 20 0 7832 312
2503 nteath 20 0 8812 312
2503 nteath 20 0 9792 312
2503 nteath 20 0 10772 312
and I want to keep only the highlighted (bold) column.
I used
cut -d' ' -f12
but that way I lose the last number (10772), I think because it is 5 digits wide.
How can I keep all the numbers in the column?
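For anyone reading along: `cut -d' '` treats every single space as a field separator, so a number with fewer digits lands at a different field number than a wider one. A sketch of two ways around that, using sample lines with assumed spacing like the output above:

```shell
# awk splits on runs of whitespace, so the wanted column is always $5:
printf '2503 nteath 20 0  9792 312\n2503 nteath 20 0 10772 312\n' | awk '{print $5}'

# Alternatively, squeeze repeated spaces first so cut sees one delimiter per gap:
printf '2503 nteath 20 0  9792 312\n2503 nteath 20 0 10772 312\n' |
    tr -s ' ' | cut -d' ' -f5
```

With `-f12`, the narrower 10772 line simply has fewer padding spaces, so field 12 no longer falls on the number.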
10 More Discussions You Might Find Interesting
1. UNIX for Dummies Questions & Answers
I am on a Linux system using bash shell.
I only want to see the number in the Use% field as the output.
#df -h /
Filesystem Size Used Avail Use% Mounted on
/dev/dasda1 2.3G 2.1G 51M 98% /
#!/bin/bash
df -h / | awk '{print $5}' | cut -c1-2
Us
98
How do... (2 Replies)
Discussion started by: darthur
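A possible fix for the snippet above (a sketch, not tested against every df layout): skip the header row and strip the `%` sign, instead of cutting a fixed character range, which breaks for values under 10% or at 100%:

```shell
# NR==2 selects the data line under the header; gsub drops the % sign:
df -h / | awk 'NR==2 { gsub(/%/, "", $5); print $5 }'
```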
2. Shell Programming and Scripting
Hi folks
I have a file with thousands of lines with fixed length fields:
sample (assume x is a blank space)
111333xx444TTTLKOPxxxxxxxxx
I need to make a copy of this file but with only some of the field positions, for example I'd like to copy the sample to the following: so I'd like to... (13 Replies)
Discussion started by: HealthyGuy
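For fixed-length fields like those, `cut -c` accepts comma-separated character ranges, so one pass can keep several positions at once (the ranges below are made up for illustration):

```shell
# Keep characters 1-6 and 12-18 of each fixed-width line:
printf '111333xx444TTTLKOPxxxxxxxxx\n' | cut -c1-6,12-18
```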
3. Shell Programming and Scripting
Hi
I have the input file as below
***TEST10067
00567GROSZ 099
00567CTCTSDS90
***TEST20081
08233GROZWEWE
00782GWERW899
***TEST30088
08233GROZWEWE
00782GWERW899
I am finding the lines starting with *** and outputting them as below
TEST10067
TEST20081
TEST30088
I need a space between TEST1... (9 Replies)
Discussion started by: dhanamurthy
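One way to do both steps at once (a sketch that assumes every marker line is `***TEST` plus a single digit followed by the rest): sed can select the `***` lines, strip the stars, and insert the space in the same substitution:

```shell
# -n with the p flag prints only the lines where the substitution matched:
printf '***TEST10067\n00567GROSZ 099\n***TEST20081\n***TEST30088\n' |
    sed -n 's/^\*\*\*\(TEST[0-9]\)/\1 /p'
```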
4. Shell Programming and Scripting
I have a problem with my script. I am using following code
awk -F"," '{print $0,",",substr($2,3,3)}' $REG_InputFileName > $TargetSeqPath/Master.tmp
while read i
do
echo $i > $TargetSeqPath/Ref.tmp
OutFileName=`awk -F"," '{print $3}' $TargetSeqPath/Ref.tmp`
rm -f... (9 Replies)
Discussion started by: manmeet
5. Shell Programming and Scripting
Hi
How do i cut from this output so i only get db name
ps -ef |grep smon
oracle 29375 1 0 Nov 9 ? 8:08 ora_smon_HWEX62
i.e only want HWEX62
also bearing in mind that some databases have been running for hours and some servers have different time and date formats... (7 Replies)
Discussion started by: eb222
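Since the date/time columns vary between servers, one robust sketch is to take the last field of the matching ps line and strip the `ora_smon_` prefix; the `[s]mon` pattern keeps grep from matching its own process:

```shell
# $NF is the last field regardless of how many columns precede it:
ps -ef | grep '[s]mon' | awk '{ sub(/^ora_smon_/, "", $NF); print $NF }'
```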
6. UNIX for Dummies Questions & Answers
Hello,
A file called test2 contains passwd entries from various servers.
- On solaris9 host :
> cat test1 | cut -d ":" -f3 | wc -l
18229
- The same file copied with scp to a redhat host (same cksum) :
$ cat test1 | cut -d ":" -f3 | wc -l
39411
- A perl script running on the... (1 Reply)
Discussion started by: Krafton
7. UNIX for Dummies Questions & Answers
Hi, I want to cut the first field of this output obtained from an SQL script
more dxlocks_test.log.1
SID SERIAL# SPID
---------- ---------- ---------
25 18356 1029
78 39370 1025
136 14361 1027
================================
cut -f1... (2 Replies)
Discussion started by: aishwaryakala
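The right-aligned columns start with spaces, and `cut` counts each of those spaces as delimiting an empty field; awk's default splitting ignores leading whitespace, and a numeric test on $1 also drops the header and ruler lines (a sketch against the sample above):

```shell
# Print the first column only for lines whose first field is all digits:
printf 'SID SERIAL# SPID\n---------- ---------- ---------\n 25 18356 1029\n 78 39370 1025\n136 14361 1027\n' |
    awk '$1 ~ /^[0-9]+$/ { print $1 }'
```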
8. Shell Programming and Scripting
Using a ksh script, I'm dumping the data from our Sybase database into an output file. This output file is for whatever reason cut off at 2GB.
There is enough space on the Unix machine, and as no error message is received I have no clue where to start looking for a solution.
#!... (1 Reply)
Discussion started by: bereman
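One thing worth checking for a hard 2GB cut-off (a guess, not a diagnosis): the shell's per-process file-size limit, which on many systems is reported in 512-byte blocks:

```shell
# "unlimited", or a number of blocks; 4194304 blocks x 512 bytes = 2GB
ulimit -f
```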
9. Solaris
Hi,
I'm running a command :
pargs 20392 | egrep -e "-f "|cut -d " " -f3 | basename
BUT the output of cut is not being sent to basename.
The output of: pargs 20392 | egrep -e "-f "|cut -d " " -f3 is
/home/staff/Properties.cfg
Appreciate your help.. (2 Replies)
Discussion started by: axes
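basename reads its path from the command line, not from standard input, so whatever is piped into it is simply ignored; xargs (or command substitution) bridges the gap. A sketch using the path from the thread:

```shell
# xargs turns the piped line into a command-line argument for basename:
printf '/home/staff/Properties.cfg\n' | xargs basename
# or: basename "$(pargs 20392 | egrep -e '-f ' | cut -d ' ' -f3)"
```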
10. Shell Programming and Scripting
Hi,
I have the following files in a folder
19996587342
19487656550
19534838736
And I need to get the first 6 characters from the above files
so i used the following script
#!/bin/ksh
for i in 19*
do
txt= `$i | cut -c -6`
echo "$txt"
done
The error is at... (4 Replies)
Discussion started by: smile689
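Two likely culprits in the loop above (a sketch of a fix, assuming the goal is the first 6 characters of each file name): the assignment must have no space around `=`, and `$i` has to be printed into the pipe rather than executed as a command:

```shell
#!/bin/ksh
for i in 19*
do
    # print the name into cut instead of running it as a command
    txt=$(printf '%s\n' "$i" | cut -c1-6)
    echo "$txt"
done
```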