I need to write a script that reads through an input .txt file and, for lines whose distance to the next line is <=4000, replaces the end value with the end value of the next line. The first label line is not actually in the input. In the example below, 3217 is the distance from the end of the first line to the... (12 Replies)
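A minimal awk sketch of this kind of merge, assuming hypothetical whitespace-separated records of the form `name start end`, where the distance is the next line's start minus the current line's end (the sample data and file name are invented for illustration):

```shell
# Hypothetical 3-column records: name start end
printf '%s\n' 'a 100 1000' 'b 4217 9000' 'c 20000 25000' > in.txt

awk '
  NR == 1 { p1 = $1; p2 = $2; p3 = $3; next }
  {
    # distance from the previous end to this start; here 4217 - 1000 = 3217
    if ($2 - p3 <= 4000) p3 = $3            # absorb: take the end of this line
    else { print p1, p2, p3; p1 = $1; p2 = $2; p3 = $3 }
  }
  END { print p1, p2, p3 }' in.txt
```

With the sample above, lines 1 and 2 merge (gap 3217 <= 4000) while line 3 stays separate.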
Hi,
Is there any way to merge two lines based on a specific occurrence of a character in a file?
I have a flat file which contains multiple records.
Each row in the file should contain a specified number of delimiters.
For a particular row, if the delimiter count is not matched with... (2 Replies)
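One way to sketch this in awk, assuming a comma delimiter and an expected field count of 4 (both hypothetical), is to accumulate short lines into a buffer until the record reaches the expected count:

```shell
# Hypothetical data: each record should have 4 comma-separated fields;
# short lines are continuations of the previous record
printf '%s\n' 'a,b,c,d' 'e,f' ',g,h' 'i,j,k,l' > flat.txt

awk -v want=4 '
  {
    buf = (buf == "") ? $0 : buf $0       # join continuation directly (assumed rule)
    if (split(buf, t, ",") >= want) { print buf; buf = "" }
  }
  END { if (buf != "") print buf }' flat.txt
```

Here `e,f` and `,g,h` combine into one complete record, `e,f,g,h`.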
Hi,
I want to merge lines starting with a comma with the previous line of the file.
Input :
cat file.txt
name1,name2
,name3,name4
emp1,emp2,emp3
,emp4
,emp5
user1,user2
,user3
Output
name1,name2,name3,name4
emp1,emp2,emp3,emp4,emp5 (9 Replies)
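For this one the rule is fully specified by the sample, so an awk one-liner can be checked against it directly: print a newline only before lines that do not start with a comma.

```shell
printf '%s\n' 'name1,name2' ',name3,name4' 'emp1,emp2,emp3' ',emp4' ',emp5' \
              'user1,user2' ',user3' > file.txt

# A line starting with "," is appended to the line before it
awk '/^,/ { printf "%s", $0; next }
     NR > 1 { print "" }
     { printf "%s", $0 }
     END { print "" }' file.txt
```

This reproduces the expected output above, plus the trailing `user1,user2,user3` record.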
hi,
I have a file as below:
Name: some_name
Date: some_date
Function Name: <some_function_name(jjjjjjjjj,
fjddddd, gggg, ggg)>
Changes:<Change A
more of change A>
Name: some_name
Date: some_date
Function Name: some_function_nameB(jjjjjjjjj,
fjddddd, gggg, ggg)
Changes:Change B... (15 Replies)
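A hedged sketch of joining the wrapped function signatures: assume (this is a guess from the sample) that a `Function Name:` line without a closing `)` continues onto following lines until one appears. The sample data below is invented.

```shell
printf '%s\n' 'Name: some_name' 'Function Name: f(aaa,' 'bbb, ccc)' \
              'Changes: Change B' > log.txt

awk '
  /^Function Name:/ && $0 !~ /\)/ { buf = $0; open = 1; next }
  open { buf = buf " " $0                 # continuation of the signature
         if ($0 ~ /\)/) { print buf; open = 0 }
         next }
  { print }' log.txt
```

Signatures that already fit on one line fall through to the plain `print` and are untouched.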
Hi ,
I'm looking for a way to merge two lines only for a given pattern / condition.
Input :
abcd/dad + -49.201 2.09 -49.5 34 ewrew rewtre *
fdsgfds/dsgf/sdfdsfasdd +
-4.30 0.62 -49.5 45 sdfdsf cvbbv *
sdfds/retret/asdsaddsa +
... (1 Reply)
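Judging from the sample, a record whose last field is a bare `+` is continued on the next line. If that reading is right (it is an assumption; the data below is shortened from the sample), awk can join such lines with the one that follows:

```shell
printf '%s\n' 'abcd/dad + -49.2 2.09 *' 'fdsg/sdfd +' '-4.30 0.62 *' > data.txt

# If the last field is "+", suppress the newline so the next line joins on
awk '$NF == "+" { printf "%s ", $0; next } { print }' data.txt
```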
Hi everyone,
I have two files (A and B) and want to combine them into one by repeatedly taking 10 lines from file A and then 6 lines from file B. This process shall be repeated 40 times (file A = 400 lines; file B = 240 lines).
Does anybody have an idea how to do that using perl, awk or sed?... (6 Replies)
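One awk approach is to read both files alternately with `getline` in a `BEGIN` block; the loop ends on its own when both files are exhausted, so it works for any multiple of the 10/6 cycle. The file names `fileA`/`fileB` and the scaled-down demo sizes are illustrative:

```shell
# Demo data: 20 lines in fileA, 12 in fileB (2 cycles of 10 + 6)
seq 1 20 | sed 's/^/A/' > fileA
seq 1 12 | sed 's/^/B/' > fileB

awk -v a=10 -v b=6 'BEGIN {
  while (1) {
    got = 0
    # a lines from fileA, then b lines from fileB, until both run dry
    for (i = 0; i < a; i++) if ((getline line < "fileA") > 0) { print line; got = 1 }
    for (i = 0; i < b; i++) if ((getline line < "fileB") > 0) { print line; got = 1 }
    if (!got) break
  }
}'
```

With the real files (400 and 240 lines) the same command performs the 40 repetitions.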
Thanks, it worked for me. I have one more question on top of that. We had a few records which were split across 2 lines instead of one. I have identified those lines. The file is too big to open and edit in vi. How can I do it without opening the file?
Suppose I want line numbers 1001 & 1002 to... (2 Replies)
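GNU sed can do this edit in place with `-i`, so the file is never opened in an editor: at the chosen line, `N` pulls in the next line and `s/\n//` joins the two (use a space in the replacement if the halves should be separated). Shown here on a tiny demo file, joining lines 3 and 4; substitute `1001` for the real file:

```shell
printf '%s\n' 'one' 'two' 'three' 'four' 'five' > bigfile

# Join line 3 with line 4 in place (GNU sed; use 1001 for the real file)
sed -i '3{N;s/\n//}' bigfile
```

Note that `-i` without a backup suffix is GNU-specific; BSD sed needs `sed -i ''`.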
I have been working on this script for a very long time and I have searched the internet for direction, but I am stuck here.
I have about 3000 files with two columns each. The length of each file is 50000. Each of these files is named this way b.4, b.5, b.6, b.7, b.8, b.9, b.10, b.11, b.12... (10 Replies)
Hello,
I have a file with a few lines starting with a digit (1-5 only) followed by a dot (.). All remaining lines are to be merged with their previous numbered lines. Merging must be done with a space.
E.g.,
Source file:
3. abc def
xyz
5. pqr mno
def
4. jkl uvw
7. ghi
1. abc xyz
6. mno... (4 Replies)
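This can be done by buffering each numbered line and appending unnumbered lines to it with a space. The sketch below matches any leading digit followed by a dot (the sample output includes 6. and 7., so the pattern is kept general rather than limited to 1-5):

```shell
printf '%s\n' '3. abc def' 'xyz' '5. pqr mno' 'def' '4. jkl uvw' \
              '7. ghi' '1. abc xyz' '6. mno' > src.txt

awk '
  /^[0-9]+\./ { if (buf != "") print buf   # flush the previous numbered record
                buf = $0; next }
  { buf = buf " " $0 }                     # continuation: join with a space
  END { if (buf != "") print buf }' src.txt
```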
Hello all,
I have a large csv file with four types of rows that I need to merge into one row per person, with a column for each possible code / type of row, even if that code/row isn't present for that person.
In the csv, a person may be listed from one to four times... (9 Replies)
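The exact layout isn't shown, but the pivot itself can be sketched in awk. The example below assumes a hypothetical `id,code,value` layout and an assumed code set A-D; every person gets one output row with a column for each code, empty when that code is absent:

```shell
# Hypothetical layout: person-id, row code, value
printf '%s\n' 'p1,A,10' 'p1,B,20' 'p2,A,5' 'p2,C,7' > people.csv

awk -F, -v OFS=, '
  { v[$1, $2] = $3; ids[$1] }              # remember each id and (id, code) value
  END {
    n = split("A,B,C,D", c, ",")           # the four possible codes (assumed)
    for (id in ids) {
      line = id
      for (i = 1; i <= n; i++)
        line = line OFS ((id, c[i]) in v ? v[id, c[i]] : "")
      print line
    }
  }' people.csv
```

Output order of `for (id in ids)` is unspecified in awk, so pipe through `sort` if order matters.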
LEARN ABOUT PHP
geoip_country_code_by_name
GEOIP_COUNTRY_CODE_BY_NAME(3)
geoip_country_code_by_name - Get the two letter country code
SYNOPSIS
string geoip_country_code_by_name (string $hostname)
DESCRIPTION
The geoip_country_code_by_name(3) function will return the two letter country code corresponding to a hostname or an IP address.
PARAMETERS
o $hostname
- The hostname or IP address whose location is to be looked up.
RETURN VALUES
Returns the two letter ISO country code on success, or FALSE if the address cannot be found in the database.
EXAMPLES
Example #1
A geoip_country_code_by_name(3) example
This will print where the host example.com is located.
<?php
$country = geoip_country_code_by_name('www.example.com');
if ($country) {
echo 'This host is located in: ' . $country;
}
?>
The above example will output:
This host is located in: US
NOTES
Caution
Please see http://www.maxmind.com/en/iso3166 for a complete list of possible return values, including special codes.
SEE ALSO geoip_country_code3_by_name(3), geoip_country_name_by_name(3).
PHP Documentation Group GEOIP_COUNTRY_CODE_BY_NAME(3)