Shell Programming and Scripting · Linear Interpolation of CSV Columnar Data
Post 303038319 by hrrruser, Friday 30th of August 2019, 09:53:44 AM
Thank you! This is exactly what I was looking for.

Quote:
Originally Posted by rdrtx1
try:
Code:
awk -F, '
   # emit p-1 evenly spaced points between two consecutive samples;
   # i is passed as an extra parameter so it stays local to the function
   function interpolate(lat1, lon1, t1, lat2, lon2, t2, p,   i) {
       for (i = 1; i < p; i++) {
           printf "%.2f,%.2f,%.1f\n",
              lat1 + i * (lat2 - lat1) / p,
              lon1 + i * (lon2 - lon1) / p,
              t1 + i * (t2 - t1) / p
       }
   }
   # p = $3 - t is the time gap, so one point is produced per unit of time
   NR >= 2 {interpolate(lat, lon, t, $1, $2, $3, $3 - t)}
   {lat = $1; lon = $2; t = $3; print}
' file.in.csv > file.out.csv
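As a quick sanity check, the interpolation can be exercised on made-up sample data (the coordinates and file name below are illustrative, not from the original thread): each one-unit time step between two consecutive rows gets one linearly interpolated point.

```shell
# Hypothetical two-row track: the fixes are four time units apart.
printf '10.00,20.00,1\n10.40,20.40,5\n' > file.in.csv

awk -F, '
   function interpolate(lat1, lon1, t1, lat2, lon2, t2, p,   i) {
       for (i = 1; i < p; i++)              # p - 1 intermediate points
           printf "%.2f,%.2f,%.1f\n",
              lat1 + i * (lat2 - lat1) / p,
              lon1 + i * (lon2 - lon1) / p,
              t1 + i * (t2 - t1) / p
   }
   NR >= 2 {interpolate(lat, lon, t, $1, $2, $3, $3 - t)}
   {lat = $1; lon = $2; t = $3; print}
' file.in.csv
# 10.00,20.00,1
# 10.10,20.10,2.0
# 10.20,20.20,3.0
# 10.30,20.30,4.0
# 10.40,20.40,5
```

The original rows pass through unchanged; only the gaps between them are filled.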

 
