Top Forums > Shell Programming and Scripting > Data Splitting into two files from one file
Post 302554970 by Corona688 on Tuesday 13th of September 2011 01:49:38 PM
Code:
# Print the last field into file2:   print $NF >"file2"
# Remove the last field:             $NF=""
# Print everything else into file1:  print >"file1"
awk '{ print $NF >"file2" ; $NF="" ; print >"file1" }' < input
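
One caveat with the one-liner: assigning $NF="" blanks the last field but leaves its separator behind, so every line written to file1 ends with a trailing space. If that matters, a slightly longer variant rebuilds the line from fields 1 through NF-1 explicitly. A sketch, assuming whitespace-separated input and the same file names as above:

Code:
# Write the last field to file2, then rebuild the rest of the record
# from fields 1..NF-1 so file1 gets no trailing separator.
awk '{
    print $NF > "file2"
    line = (NF > 1) ? $1 : ""
    for (i = 2; i < NF; i++) line = line OFS $i
    print line > "file1"
}' input

For a line like "a b c", file1 gets "a b" (no trailing space) and file2 gets "c".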

 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

Splitting data file

Hello, I'm trying to split a file by lines. I know I can use the split command to do this, but the one problem I'm having is that each file created needs a header as its first line. I can use the split command to create another file with the header, then append the new split file to... (4 Replies) A sketch of one approach follows below.
Discussion started by: ctcuser
4 Replies
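
A minimal sketch of that header-repeating split, assuming the header is the first line of a hypothetical data.txt and 1000-line chunks (the names and chunk size are placeholders):

Code:
# Save the header, split the body, then prepend the header to each piece.
header=$(head -n 1 data.txt)
tail -n +2 data.txt | split -l 1000 - part_
for f in part_*; do
    { printf '%s\n' "$header"; cat "$f"; } > "$f.tmp" && mv "$f.tmp" "$f"
done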

2. Shell Programming and Scripting

Splitting files from one file

Hi, I have an input file like: 111 abcdefgh asdfghjk dfghjkl 222 aaaaaaa bbbbbb 333 djfhfgjktitjhgfkg 444 djdhfjkhfjkghjkfg hsbfjksdbhjkgherjklg fjkhfjklsahjgh fkrjkgnj. I want to read this input file and make separate output files keyed on the numeric header value, like "111"... (9 Replies) A sketch follows below.
Discussion started by: saltysumi
9 Replies
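
A sketch of one way to do that, assuming each numeric header (111, 222, ...) sits on its own line of a hypothetical input.txt and each section, header included, should land in <header>.txt:

Code:
# A line that is all digits starts a new output file; every line goes to the current file.
awk '/^[0-9]+$/ { if (out != "") close(out); out = $0 ".txt" }
     out != "" { print > out }' input.txt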

3. Shell Programming and Scripting

Splitting file into 2 files ?

Hi, extending one of my previously posted queries.... I am using nawk -v invar1="$aa" '{print > ("ABS_"((/\|/)?"A_":"B_")invar1"_NETWORKID.txt")}' spfile.txt to get 2 different files based on the split condition, i.e. "|". Similar to the invar1 variable in nawk, I also need one more variable... (18 Replies)
Discussion started by: shekharjchandra
18 Replies

4. UNIX for Dummies Questions & Answers

Splitting Data in File

I have a file with the below data: 1,nj@ny@pa@caa 2,ct 3,ca@vaa@tx and I want the output to be 1,nj 1,ny 1,pa 1,caa 2,ct 3,ca 3,vaa 3,tx. I need to split the second column using @ as the delimiter; the number of delimiters is unknown. (4 Replies) A sketch follows below.
Discussion started by: traininfa
4 Replies
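
A sketch for that one, assuming comma-separated input in a hypothetical data.csv with the @-joined values in the second field:

Code:
# Split field 2 on "@" and emit one "key,value" line per piece.
awk -F, '{ n = split($2, parts, "@"); for (i = 1; i <= n; i++) print $1 "," parts[i] }' data.csv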

5. Shell Programming and Scripting

Please help: splitting a single file in UNIX into different files based on data

I have a file in UNIX with sample data as follows: -------------------------------------------------------------- -------------------------------------------------------------- {30001002|XXparameter|Layout|$ I want this file to be split into different files corresponding to the sample... (54 Replies)
Discussion started by: Ravindra Swan
54 Replies

6. UNIX for Dummies Questions & Answers

Extracting data from one file, based on another file (splitting)

Dear All, I have two files and want to extract data from one based on the other... can you please help me? File 1: David Tom Ellen and file 2: David|0010|testnamez|resultsz David|0004|testnamex|resultsx Tom|0010|testnamez|resultsz Tom|0004|testnamex|resultsx Ellen|0010|testnamez|resultsz... (12 Replies) A sketch follows below.
Discussion started by: A-V
12 Replies
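
A sketch of that name-based extraction, assuming a hypothetical names.txt with one name per line and a pipe-delimited data.txt with the name in field 1; matching records go to <name>.txt:

Code:
# Pass 1 loads the wanted names; pass 2 routes matching records to per-name files.
awk -F'|' 'NR == FNR { want[$1]; next } $1 in want { print > ($1 ".txt") }' names.txt data.txt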

7. Shell Programming and Scripting

Splitting a file into 4 files containing the same name pattern

Hello, I have one file around 20 MB in size and want to split it up into four files of 5 MB each: ABCD_XYZ_20130302223203.xml. The requirement is to write a script such that the first three files are 5 MB each and the fourth one's content should be the last... (8 Replies) A sketch follows below.
Discussion started by: ajju
8 Replies
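
For a plain size-based cut like that, split on its own is usually enough. A sketch assuming GNU split (BSD split takes a lowercase 5m); note that splitting XML by bytes will cut records mid-line:

Code:
# Four roughly 5 MB pieces named <file>.part_aa, .part_ab, and so on.
split -b 5M ABCD_XYZ_20130302223203.xml ABCD_XYZ_20130302223203.xml.part_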

8. Open Source

Splitting files using awk and reading filename value from input data

I have a process that requires me to read data from huge log files and find the most recent entry on a per-user basis. The number of users may fluctuate wildly month to month, so I can't code for it with names or a set number of variables to capture the data, and the files are large so I don't... (7 Replies)
Discussion started by: rbatte1
7 Replies
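
A sketch of the per-user splitting part, assuming whitespace-separated log records with the user name in field 1 of a hypothetical huge.log; closing each file after writing keeps awk under the open-file limit when there are many users:

Code:
# Append each record to <user>.log, closing the handle after every write.
awk '{ f = $1 ".log"; print >> f; close(f) }' huge.log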

9. Shell Programming and Scripting

awk issue splitting a fixed-width file containing line feed in data

Hi Forum. I have the following script that splits a large fixed-width file into multiple smaller fixed-width files based on input segment type. The main command in the script is: awk -v search_col_pos=$search_col_pos -v search_str_len=$search_str_len -v segment_type="$segment_type"... (8 Replies)
Discussion started by: pchang
8 Replies

10. Shell Programming and Scripting

Splitting the XML file into three different files

Hello Shell Gurus, I have a requirement to split the source XML file into three different text files, and I need your valuable suggestions to finish this. Here is my source XML snippet; here I am using only one entry of <jms-system-resource>. There may be multiple entries in the source file. ... (5 Replies)
Discussion started by: Siv51427882
5 Replies