Shell Programming and Scripting: Combine Multiple text or csv files column-wise
Post 302488478 by venky_ibm, 01-17-2011 at 10:36 AM
Hi Anurag,

The tabulation got removed when I put it in the post, so I am attaching it as a text file.

I am using:

    paste File1.txt File2.txt File3.txt File4.txt > "Final File.txt"

Please see the attachment.
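A couple of hedged variations on that command, assuming each of the four input files holds one column of text, one value per line (the file names mirror the ones in the post; Final.csv is a placeholder). Tab is paste's default delimiter, and -d switches it:

    # Comma-separated output, if the combined file should itself be a CSV:
    paste -d ',' File1.txt File2.txt File3.txt File4.txt > Final.csv

    # Quick visual check that the tab-separated columns lined up:
    column -t "Final File.txt" | head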

Moderator's Comments:
The tabulation got removed because you didn't use code tags. See what happens...

Last edited by Scott; 01-17-2011 at 02:02 PM..
 

10 More Discussions You Might Find Interesting

1. UNIX for Advanced & Expert Users

How to compare two text files column-wise?

Hi All, I have two txt files like this:

File1:
no  name
----------
12  aaaa
23  bbbb
55  cccc

File2:
dname  dno
------------
civil  33
mech   55
arch   66

Now I want to compare col1 from File1 with col2 from File2; if they match, I want to fetch all columns from... (3 Replies). A hedged awk sketch follows below.
Discussion started by: psiva_arul
3 Replies
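Not part of that thread, just a hedged sketch of one common approach: load File1 into an awk array keyed on its first column, then print the File2 rows whose second column matches. It assumes whitespace-separated fields and that the header and dashed lines are absent or filtered out first.

    # Key File1 rows on column 1; for File2, match on column 2 and print both rows.
    awk 'NR == FNR { rows[$1] = $0; next }
         ($2 in rows) { print rows[$2], $0 }' File1 File2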

2. Shell Programming and Scripting

Combine multiple files by column into one file, already sorted

I have multiple files; each file contains its data as a single column. I simply want to combine all those files into one file, column by column. Example: file1 holds a, b, c, d; file 2 holds 1, 2, 3, 4; file 3 holds G... (4 Replies). A paste-based sketch follows below.
Discussion started by: ahmedamro
4 Replies
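A hedged sketch using paste with the file names from the excerpt (spelled without spaces here; combined.txt is a placeholder). paste reads all inputs in parallel until the longest one is exhausted, so a short file such as file3 simply contributes empty fields on the remaining lines.

    # One output column per input file, separated by single spaces.
    paste -d ' ' file1 file2 file3 > combined.txt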

3. Red Hat

How to find a garbage entry in a column-wise text file in Linux?

Suppose I have a file containing:

1 Apple $50
2 Orange $30
3 Banana $10
4 Guava $25
5 Pine@apple $12
6 Strawberry $21
7 Grapes $12

In the 5th row, an @ character was inserted. I want, through the sort command or any other way, for this row to end up either at the top or the bottom. By sort command garbage... (1 Reply). A hedged grep/awk sketch follows below.
Discussion started by: Dipankar Mitra
1 Reply
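Not from that thread, just a hedged alternative to sorting: flag any row whose name field contains something other than letters, so the '@' in Pine@apple stands out (fruits.txt is a placeholder name).

    # The specific case from the excerpt; -n adds the line number.
    grep -n '@' fruits.txt

    # More generally: report rows whose second field contains a non-letter.
    awk '$2 ~ /[^[:alpha:]]/ { print NR ": " $0 }' fruits.txt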

4. Shell Programming and Scripting

Column-wise text adding

Hi, I have pasted sample data below (in data.txt). Please suggest any way out, as the 3rd field is

cat data.txt
22:37:34 STARTING abc
22:37:40 FAILURE sadn
00:06:42 STARTING asd
00:06:51 FAILURE ad
02:06:38 STARTING acs
02:06:46 FAILURE cz
04:06:35 STARTING xzc... (1 Reply)
Discussion started by: Gaurav198
1 Reply

5. Shell Programming and Scripting

Inserting column in CSV from multiple files

Hello, I have a spec file that contains a lot of strings that look like this:

PC DELL OptiPlex 3010MT i3 3220/2GB/500GB/DVD-RW/FREE DOS / 5Y NBD
Intel i3 3220 (Dual Core, 3.30GHz, 3MB, w/ HD2500 Graphics), 2GB (1x2GB) DDR3 PC3-1600MHz, 500GB HDD SATA III 7200rpm, DVD+/-RW (16x),... (9 Replies)
Discussion started by: g9100
9 Replies

6. Shell Programming and Scripting

Shell script for field-wise record count from different .csv files

Hi, Very good wishes to all! Please help by providing a shell script for generating the record counts, field-wise, from the .csv file. My question: Source file:

Field1 Field2 Field3
abc    12f    sLm
1234   hjd    12d
Hyd    34     Chn

My target file should generate the .csv file with the... (14 Replies). A hedged awk sketch follows below.
Discussion started by: Kirands
14 Replies
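A hedged awk sketch, assuming the counts wanted are the number of non-empty values under each header. The excerpt's sample looks space-separated, so -F ',' (and the placeholder names source.csv / counts.csv) are assumptions to adjust as needed.

    awk -F ',' '
        NR == 1 { ncol = NF; for (i = 1; i <= ncol; i++) name[i] = $i; next }   # remember the headers
        { for (i = 1; i <= ncol; i++) if ($i != "") cnt[i]++ }                  # count non-empty cells per column
        END { for (i = 1; i <= ncol; i++) print name[i] "," cnt[i] + 0 }        # one "Field,count" line each
    ' source.csv > counts.csv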

7. Shell Programming and Scripting

Row-bind multiple CSV files having different column headers

All, I guess by this time someone has asked this kind of question, but sorry, I am unable to find it after a deep search. Here is my request. I have many files, of which 2 sample files are provided below.

File-1 (with A,B as column headers):
A,B
1,2

File-2 (with C,D as column headers):
C,D
4,5

I... (7 Replies). A hedged sketch follows below.
Discussion started by: ks_reddy
7 Replies
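A hedged sketch for just the two sample files in the excerpt, hard-coding their headers A,B and C,D: emit the union header once and pad each file's rows with empty fields for the columns it lacks (combined.csv is a placeholder; a general solution would read the headers dynamically).

    {
        echo "A,B,C,D"                                              # union of the two headers
        tail -n +2 File-1 | awk -F ',' '{ print $1 "," $2 ",," }'   # File-1 rows: blanks for C,D
        tail -n +2 File-2 | awk -F ',' '{ print ",," $1 "," $2 }'   # File-2 rows: blanks for A,B
    } > combined.csv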

8. Shell Programming and Scripting

Search string in multiple files and display column-wise

I have 3 files. Each of those files has the same number of records; however, certain records have different values. I would like to grep the field in ALL 3 files and display the output with only the differences, column-wise, and if possible with the line number.

File1:
Name = Joe
Age = 33... (3 Replies). A hedged sketch follows below.
Discussion started by: sidnow
3 Replies
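A hedged bash sketch: pull the field of interest out of each file and let paste lay the three results side by side; grep -n keeps the line number. The field name and file names are taken from the excerpt.

    # Requires bash for the <(...) process substitutions; columns are tab-separated.
    paste <(grep -n '^Age' File1) <(grep -n '^Age' File2) <(grep -n '^Age' File3)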

9. Shell Programming and Scripting

Combine and complete multiple CSV files based on 1 parameter

I have to create a new CSV file based on the value listed in the 3rd column of different CSV files. This is what I need: 1. I should substitute the first column of each file, excluding the headers, with the file name InputXX. 2. Then, I need to look for rows with 0 in the third column in... (7 Replies). A hedged sketch of step 1 follows below.
Discussion started by: Xterra
7 Replies
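A hedged sketch covering only step 1 from the excerpt: replace column 1 of every data row with the file's own name while leaving the header untouched. The Input*.csv glob and the _renamed.csv output names are assumptions.

    for f in Input*.csv; do
        awk -F ',' -v OFS=',' -v name="${f%.csv}" '
            NR == 1 { print; next }    # keep the header row as-is
            { $1 = name; print }       # data rows: column 1 becomes the file name
        ' "$f" > "${f%.csv}_renamed.csv"
    done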

10. UNIX for Beginners Questions & Answers

How do I extract a specific column from multiple CSV files?

file1:
Name,Threshold,Curr Samples,Curr Error%,Curr ART
GETHome,100,21601,0.00%,47
GETregistry,100,21592,0.00%,13
GEThomeLayout,100,30466,0.00%,17

file2:
Name,Threshold,Curr Samples,Curr Error%,Curr ART
GETHome,100,21601,0.00%,33
GETregistry,100,21592,0.00%,22... (6 Replies). A hedged awk sketch follows below.
Discussion started by: Raghuram717
6 Replies
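A hedged sketch that pulls one column (here Curr ART, the last field) out of each file and tags every value with the file it came from; the file names come from the excerpt and curr_art.csv is a placeholder.

    for f in file1 file2; do
        # Skip the header, then print: source file, test name, Curr ART value.
        awk -F ',' -v src="$f" 'NR > 1 { print src "," $1 "," $NF }' "$f"
    done > curr_art.csv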