Full Discussion: Dataset Library for C?
Post 302519981 by DGPickett on Thursday 5th of May 2011 01:56:53 PM
I'd start with unixODBC. You might begin with their text-file drivers, and later write a driver for zip files of CSV or XML, where you can scan the archive's internal file and directory names with wildcards and extract matching files to stdout via popen().
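For what it's worth, here is a minimal sketch of the first half of that suggestion: querying a CSV file through the standard ODBC call interface against a unixODBC text/CSV driver. Everything specific is an assumption for illustration: the DSN name "csvfiles", the table "people" (which such drivers typically map to a file like people.csv), and the column "name" all depend on how your odbc.ini is set up. Build with something like: cc odbc_demo.c -lodbc

/*
 * Hypothetical example: read one column of a CSV file via unixODBC.
 * Assumes a DSN "csvfiles" in odbc.ini pointing at a text/CSV driver;
 * the DSN, table, and column names are placeholders.
 */
#include <stdio.h>
#include <sql.h>
#include <sqlext.h>

int main(void)
{
    SQLHENV env;
    SQLHDBC dbc;
    SQLHSTMT stmt;
    SQLCHAR name[256];
    SQLLEN len;

    SQLAllocHandle(SQL_HANDLE_ENV, SQL_NULL_HANDLE, &env);
    SQLSetEnvAttr(env, SQL_ATTR_ODBC_VERSION, (SQLPOINTER)SQL_OV_ODBC3, 0);
    SQLAllocHandle(SQL_HANDLE_DBC, env, &dbc);

    /* A DSN-less connection string would also work; a DSN keeps it short. */
    if (!SQL_SUCCEEDED(SQLDriverConnect(dbc, NULL,
            (SQLCHAR *)"DSN=csvfiles;", SQL_NTS,
            NULL, 0, NULL, SQL_DRIVER_COMPLETE))) {
        fprintf(stderr, "connect failed\n");
        return 1;
    }

    SQLAllocHandle(SQL_HANDLE_STMT, dbc, &stmt);
    SQLExecDirect(stmt, (SQLCHAR *)"SELECT name FROM people", SQL_NTS);

    while (SQL_SUCCEEDED(SQLFetch(stmt))) {
        SQLGetData(stmt, 1, SQL_C_CHAR, name, sizeof name, &len);
        printf("%s\n", name);
    }

    SQLFreeHandle(SQL_HANDLE_STMT, stmt);
    SQLDisconnect(dbc);
    SQLFreeHandle(SQL_HANDLE_DBC, dbc);
    SQLFreeHandle(SQL_HANDLE_ENV, env);
    return 0;
}

And a sketch of the zip-file half: unzip -p already streams matching archive members to stdout and accepts wildcards against internal member names, so a popen() on it hands your C code a readable stream of the concatenated CSV data. The archive name and member pattern below are placeholders.

/*
 * Sketch: match a zip archive's member names with a wildcard and
 * stream the matching members to this process through popen().
 * The archive name and the member glob are placeholders.
 */
#include <stdio.h>

int main(void)
{
    /* The single quotes make unzip, not the shell, expand the glob. */
    FILE *fp = popen("unzip -p datasets.zip 'tables/*.csv'", "r");
    if (fp == NULL) {
        perror("popen");
        return 1;
    }

    char line[4096];
    while (fgets(line, sizeof line, fp) != NULL)
        fputs(line, stdout);   /* one CSV record per line: parse here */

    if (pclose(fp) != 0)
        fprintf(stderr, "unzip reported a problem\n");
    return 0;
}

A real driver would parse those records and present them through the driver interface; the point is just that the extraction side is a one-liner.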
 

10 More Discussions You Might Find Interesting

1. UNIX for Dummies Questions & Answers

Accessing Mainframe Dataset

Hi, may I know if there is a way to read/copy a mainframe (IBM OS/390) dataset (a sequential file) into a UNIX directory? Thank you for your time. IcyGuava (4 Replies)
Discussion started by: IcyGuava
4 Replies

2. Shell Programming and Scripting

Numbers of records in SAS dataset

I'm declaring a variable within a Korn shell to represent the total number of records in a SAS dataset and could use a little help with the syntax. This is what I have thus far: #!/usr/bin/ksh RecCount = `sas -x "select count(*) from /users/abc/123/sas_dataset.sas7bdat"` (2 Replies)
Discussion started by: sasaliasim
2 Replies

3. Programming

Finding number of records in SAS dataset

I am running the following Korn shell script: #!/usr/bin/ksh num_records=`sas "select count(*) from /users/abc/123/sasdata.sas7bdat"` echo "$num_records" The script keeps returning an invalid file error even though I am certain that the file really exists. Does anyone see anything wrong... (1 Reply)
Discussion started by: sasaliasim
1 Replies

4. Shell Programming and Scripting

Normalize a dataset with AWK

Hello everyone, I have to normalize this dataset (20,000 rows): 2,4,4,3,2,7,8,2,9,11,7,7,1,8,5,6 4,7,5,5,5,5,9,6,4,8,7,9,2,9,7,10 7,10,8,7,4,8,8,5,10,11,2,8,2,5,5,10 4,9,5,7,4,7,7,13,1,7,6,8,3,8,0,8,8 6,7,8,5,4,7,6,3,7,10,7,9,3,8,3,7,8 in this form:... (1 Reply)
Discussion started by: [raven]
1 Replies

5. Shell Programming and Scripting

Computing dataset for a specific record

Hello everybody, I want to process a data file in awk. I am new to awk and need your help. The data file has the following fields; it has thousands of records. Col1 Col2 Col3 Col4 Col5 0.85 0.07 Fre 42:86 25 0.73 0.03 frp 21:10 28 0.64... (12 Replies)
Discussion started by: ubeejani
12 Replies

6. Shell Programming and Scripting

How to extract a subset from a huge dataset

Hi, all. I have a huge file of 450 GB. Its tab-delimited format is as below: x1 A 50020 1 x1 B 50021 8 x1 C 50022 9 x1 A 50023 10 x2 D 50024 5 x2 C 50025 7 x2 F 50026 8 x2 N 50027 1 : : Now, I want to extract a subset from this file. In this subset, column 1 is x10, column 2 is... (3 Replies)
Discussion started by: cliffyiu
3 Replies

7. Solaris

flarecreate for zfs root dataset and ignore multiple dataset

Hi all, I want to write a script to create flar images on multiple servers. On non-ZFS filesystems I use the -X option to point at a file that excludes mounts on different servers, but on ZFS the -X option is not working. I want multiple mounts to be ignored on a ZFS-based system during flarecreate. I... (0 Replies)
Discussion started by: uxravi
0 Replies

8. Solaris

ZFS - Dataset / pool name are the same...cannot destroy

I messed up my pool by doing zfs send ... receive, so I got the following: zpool list NAME SIZE ALLOC FREE CAP DEDUP HEALTH ALTROOT rpool 928G 17.3G 911G 1% 1.00x ONLINE - tank1 928G 35.8G 892G 3% 1.00x ONLINE - So I have the "tank1" pool. zfs get all... (8 Replies)
Discussion started by: eladgrs
8 Replies

9. UNIX for Advanced & Expert Users

How to extract subset file from dataset?

Hello, I have a data set which looks like this: progeny sire dam gender 12 1 3 M 13 2 4 F 14 2 5 F 15 6 5 ... (13 Replies)
Discussion started by: sajmar
13 Replies

10. UNIX for Advanced & Expert Users

SAS dataset to CSV

Hi guys, is there a way to export a SAS file (i.e., a .sas7bdat file) to a .csv file, with header and data, using UNIX? I don't want to use a SAS program; is it possible with a UNIX tool or UNIX scripting? (25 Replies)
Discussion started by: Master_Mind
25 Replies
RECEIVE(1)                  General Commands Manual                  RECEIVE(1)

NAME
       receive - receive files from the sendfile spool

SYNOPSIS
       receive [ -d ] [ -r ] [ -k ] [ -P ] [ -S ] [ -Z spool ] [ -q ] [ -ffrom ] file [...]
       receive -n [ -d ] [ -r ] [ -k ] [ -P ] [ -S ] [ -Z spool ] [ -q ] file-number [...]
       receive [ -s ] [ -l ] [ -L ] [ -R ] [ -ffrom ]
       receive -b user[@host] [ -k ] [ -f"from" ] file [...]
       receive -b user[@host] [ -k ] [ -f"from" ] -n file-number [...]
       receive -b user[@host] [ -k ] -a

DESCRIPTION
       receive files from the sendfile spool which have been sent to you. If a
       file with the same name already exists, you will be prompted to
       overwrite or rename it.

       Allowed wildcards in file names are: * ? [abc] [^abc]

       CAUTION: you have to put wildcards and other special characters in ''
       quotes to hide them from interpretation by your shell.

OPTIONS
       -n         receive file number(s)
       -d         delete instead of receive
       -a         receive (or delete or bounce) all files
       -r         rename before receiving
       -k         keep files in spool after receiving
       -P         pipe files to stdout
       -S         receive only pgp-signed files
       -s         list files in short format
       -l         list files
       -L         list files and look inside archives, too
       -R         renumber files in spool
       -b         bounce (forward) files to another recipient
       -q         quiet mode: no questions asked
       -fuser     all actions refer only to files from this user
       -Z spool   specify an alternate spool directory

EXAMPLES
       receive -L
              list all files in long format.

       receive 'blubb*'
              receive all files starting with the string "blubb".

       receive -daf microsoft.com
              delete all files from microsoft.com sites.

       receive -b framstag@bofh '*.jpg'
              bounce all *.jpg files to framstag@bofh.

FILES
       /var/spool/sendfile            The sendfile spool directory.
       /var/spool/sendfile/$USER/log  A log of the last transfers.
       /etc/sendfile.deny             Users who are not allowed to receive
                                      files or messages (set by root).

SEE ALSO
       sendfile(1).

AUTHOR
       Ulli Horlacher - framstag@rus.uni-stuttgart.de

3rd Berkeley Distribution                                            RECEIVE(1)