I have a '~'-delimited file of 6-7 million rows. Each row should contain 13 columns delimited by 12 tildes. Where there are 13 tildes, the row needs to be removed. Each row contains alphanumeric data, and occasionally a ~ ends up in a descriptive field and therefore acts as an extra delimiter, resulting in... (1 Reply)
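One minimal sketch of an answer, using awk to count fields per row (the file names data.txt, clean.txt and reject.txt are placeholders): a good row splits into exactly 13 fields on '~'; anything else is set aside for inspection.

```shell
# Rows with exactly 13 '~'-separated fields (i.e. 12 delimiters) are kept;
# rows with extra tildes go to reject.txt for review.
# data.txt, clean.txt and reject.txt are placeholder names.
awk -F'~' 'NF == 13 { print > "clean.txt"; next } { print > "reject.txt" }' data.txt
```

awk streams the file once, so this stays practical even at 6-7 million rows.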
I have a directory (and many subdirectories beneath it) on an AIX system, containing thousands of files. I'm looking to get a list of all directories containing a "*.pdf" file.
I know the basic syntax of the find command, but it gives me a list of all pdf files, which number in the thousands. All I need to know is which... (4 Replies)
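A sketch of one approach ('/some/dir' is a placeholder for the top-level directory): let find list the pdf files, then strip the file name from each path and de-duplicate, so each containing directory prints once.

```shell
# Print each directory that holds at least one .pdf, once.
# '/some/dir' is a placeholder for the starting directory.
find /some/dir -type f -name '*.pdf' |
    sed 's|/[^/]*$||' |
    sort -u
```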
I am trying to come up with a script that will search for selected files and then email them to me.
For example, say I have a directory that has the following files:
AA_doug.txt
AA_andy.txt
BB_john.txt
APPLE_mike.txt
GLOBE_ed.txt
GLOBE_tony.txt
TOTAL_carl.txt
What is the best way to... (2 Replies)
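A sketch of one way to do the selection, with the mail step shown for a mailx that supports -a for attachments (not all do); the patterns come from the listing above and 'you@example.com' is a placeholder address.

```shell
# Select files by prefix (patterns taken from the example listing)
# into a list file; -maxdepth keeps find in the current directory
# (requires a find that supports it, e.g. GNU or BSD find).
find . -maxdepth 1 -type f \( -name 'AA_*.txt' -o -name 'GLOBE_*.txt' \) > filelist.txt

# Sketch of the mail step, one message per file; uncomment to send.
while read -r f; do
    : # mailx -s "File: $f" -a "$f" you@example.com < /dev/null
done < filelist.txt
```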
Assistance on work. Use and complete the template provided. The entire template must be completed. If you don't, your post may be deleted!
1. The problem statement, all variables and given/known data:
Files stored in ... (1 Reply)
Using these strings as an example:
<a onclick="doShowCHys=1;ShowWindowN(0,'/daman/man.php?asv4=145148&playTogether=True',960,540,943437);return false;" title="">
<a onclick="doShowCHys=1;ShowWindowN(0,'/daman/man.php?asv4=1451486&playTogether=True',960,540,94343);return false;" title="">
<a... (12 Replies)
Find all files in the current directory only excluding hidden directories and files.
The command below does not delete hidden files, but it still traverses the hidden directories and lists their normal files, which should be avoided.
`find . \( ! -name ".*" -prune \) -mtime +${n_days}... (7 Replies)
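A sketch of the usual fix: prune on the hidden names themselves (excluding '.', the starting point), so find never descends into hidden directories, and print only the non-hidden regular files on the other branch. n_days is kept from the question.

```shell
# Prune anything whose name starts with '.' (except '.', the start
# directory) so find never enters hidden directories; print only
# non-hidden regular files older than n_days days.
n_days=7
find . \( -name '.*' ! -name '.' \) -prune -o -type f -mtime +"${n_days}" -print
```

Because hidden files also match '.*', they are pruned (a no-op for files) rather than printed, so both hidden files and hidden directory trees are excluded.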
I was thinking something like this but it always gets rid of the file location.
grep -roh base. | wc -l
find . -type f -exec grep -o base {} \; | wc -l
Would this be a job for awk? Would I need to store the file locations in an array? (3 Replies)
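No array needed: grep can keep the file location itself. A sketch, assuming a grep with -r and -o support (GNU or BSD grep); 'base' is the pattern from the question.

```shell
# Per-file match counts, keeping the file name (the -h in the
# original grep -roh is what discarded it):
grep -rc 'base' . | awk -F: '$NF > 0'   # prints file:count

# Grand total of all occurrences (-o prints one line per match):
grep -ro 'base' . | wc -l
```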
Hello ALL,
I need a bash script that finds files and sends an email with attachments.
I have 50 folders, each without subdirectories, containing generated files of different sizes but with similar names: Rp01.txt, Rp02.txt, Rp03.txt, etc. Each directory is bound to a mail group, and I need a script that goes as... (1 Reply)
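A sketch of the outer loop; the folder glob, the group address, and the mailx -a attachment flag are all placeholders/assumptions to adapt to the real setup.

```shell
# Walk each report folder and hand its Rp*.txt files to the mail step.
# '/data/reports' and 'team@example.com' are placeholders.
for dir in /data/reports/*/; do
    group="team@example.com"    # placeholder: derive the group from "$dir"
    for f in "$dir"Rp*.txt; do
        [ -e "$f" ] || continue     # skip if the glob matched nothing
        echo "sending $f to $group"
        # mailx -s "Report $f" -a "$f" "$group" < /dev/null
    done
done
```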
I'm working on a bash script to move files from one location to two. The first part of my challenge is intended to check a particular directory for contents (e.g. files or other items in it); if files exist, then send the list of names to a txt file and email me the text file. If files do not... (4 Replies)
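A sketch of the "check, list, mail" part; the paths and address are placeholders, and the mail line is left commented as it depends on the local mailer.

```shell
# If the watched directory has any entries, write the listing to a
# file and (sketch) mail it; otherwise do nothing.
# 'src', 'list', and the address are placeholders.
src=/path/to/watch
list=/tmp/contents.txt
if [ -n "$(ls -A "$src" 2>/dev/null)" ]; then
    ls -A "$src" > "$list"
    : # mailx -s "Files waiting in $src" me@example.com < "$list"
fi
```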
Discussion started by: Nvizn
LEARN ABOUT DEBIAN
OPENSCAD-TESTRUN(1)            General Commands Manual            OPENSCAD-TESTRUN(1)

NAME
openscad-testrun - set up and run the OpenSCAD test suite
SYNOPSIS
openscad-testrun [options]
DESCRIPTION
This manual page documents briefly the openscad-testrun command.
openscad-testrun is a script that sets up a directory in which the OpenSCAD test suite (implemented in ctest) can be run, by creating symlinks to the system locations of the input data, and runs it. The created directory is not removed, but will contain the test results in addition to the symlinks.
It is required as the test suite in its original form expects all the test input and output data in relative locations, and the typical
user has no write access to where the data resides. Future changes to the test suite might make it more flexible, removing the need for
this script.
OPTIONS
       -d directory, --directory directory
              Set up the tests in a directory called directory. By default, this is generated from the current date and time, like openscad-test-2012-02-15_13:37.
-n, --dry-run
Do not run the test suite, just set it up.
All additional arguments are passed on to ctest.
AUTHOR
The OpenSCAD test suite was written by Clifford Wolf, Marius Kintel, and others. The openscad-testrun script was written by chrysn.
This manual page was written by chrysn <chrysn@fsfe.org>, for the Debian project (and may be used by others).
2012-02-16 OPENSCAD-TESTRUN(1)