Operating Systems > SCO

Need advice: Copying large CSV report files off SCO system
Post 302538146 by magnetman, Tuesday 12 July 2011, 12:38 AM

I have an SCO Unix server from 1999 running SCO 5.0.5 and some ancient accounting software called Real World.

A report-writer program on the system generates CSV files from the accounting data, which we then write to 3.5" floppies using DOSCOPY commands.

Within the next 60 days we will be decommissioning this system, and I need to generate many CSV files that are too large to fit on the floppies.

The system has USB ports, but I've been told that SCO 5.0.5 does not support USB (I believe support was added in the next version of SCO?).

There is no external SCSI port on the box.

How can I hook up a large DOS-formatted drive of some sort, write these files to it, and then plug it into a Windows machine to access the CSV files?

Any help or advice would be greatly appreciated.

Thanks
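
One possible workaround, sketched below, is to keep the existing floppy workflow but split each oversized CSV into floppy-sized pieces on the SCO box and rejoin them on the Windows side. This is an untested sketch only: the report path, the piece prefix, the doscp(C) invocation, and whether split on this SCO release accepts the -b option are all assumptions to adapt to the actual system.

Code:

#!/bin/sh
# Untested sketch: split an oversized CSV report into floppy-sized pieces
# so the existing floppy workflow can still be used, then rejoin the
# pieces on the Windows side.  Path, prefix and drive alias are assumptions.

REPORT=/usr/tmp/bigreport.csv     # hypothetical path to the oversized CSV
PREFIX=rpt.                       # pieces become rpt.aa, rpt.ab, ...

# Split into 1400 KB pieces (leaves headroom on a 1.44 MB floppy).
# If split on this release lacks -b, line-count splitting (e.g.
# "split -20000") also works and keeps whole CSV records intact.
split -b 1400k "$REPORT" "$PREFIX"

# Copy each piece to a DOS-formatted floppy with the SCO DOS utilities
# (doscp(C) shown here; substitute whatever your DOSCOPY step invokes,
# and the correct drive alias from /etc/default/msdos).
for piece in ${PREFIX}*
do
    echo "Insert a DOS-formatted floppy for $piece and press Enter"
    read dummy
    doscp "$piece" a:"$piece"
done

# On the Windows side, rejoin the pieces in order, for example:
#   copy /b rpt.aa + rpt.ab + rpt.ac bigreport.csv

If the box has a working network card, ftp'ing the files straight to a Windows machine would avoid the floppies entirely, but that depends on hardware and configuration not mentioned above.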
