07-30-2009
it depends on your file size...
10 More Discussions You Might Find Interesting
1. Shell Programming and Scripting
Hi all you enlightened unix people,
I've been trying to execute a perl script that contains the following line within backticks:
`grep -f patternfile.txt otherfile.txt`; It normally takes about 2 minutes to execute this command by hand from the bash shell.
I noticed that when i run this command... (2 Replies)
Discussion started by: silverlocket
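When the pattern file holds plain strings rather than regular expressions, `grep -F` (fixed strings) often runs far faster than plain `-f`, since no regex is compiled per pattern. A minimal sketch with made-up file contents:

```shell
# Hypothetical pattern and data files, just for illustration
printf 'alpha\nbeta\n' > patternfile.txt
printf 'alpha line\ngamma line\nbeta line\n' > otherfile.txt

# -F treats every pattern as a literal string; with many patterns this
# usually beats regex matching by a wide margin
grep -F -f patternfile.txt otherfile.txt
```

If the patterns really are regexes, `-F` cannot be used, but anchoring them (e.g. `^...`) can still help.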
2. Shell Programming and Scripting
Hi,
A datafile containing lines such as below needs to be split:
500000000000932491683600000000000000000000000000016800000GS0000000000932491683600*HOME
I need to get characters 2-5, 11-20, and 35-40, and I can do it via the cut command.
cut -c 2-5 file > temp1.txt
cut -c 11-20 file >... (9 Replies)
Discussion started by: daytripper1021
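Each `cut` invocation re-reads the whole file; a single `awk` pass with `substr` can emit all three character ranges at once (the space-separated output layout here is an assumption):

```shell
# substr($0, start, length): chars 2-5, 11-20 and 35-40 in one read
awk '{ print substr($0, 2, 4), substr($0, 11, 10), substr($0, 35, 6) }' file
```

Note that `substr` takes a length, not an end position, so the range 11-20 becomes `substr($0, 11, 10)`.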
3. Shell Programming and Scripting
I'm sorting files from a source directory by size into 4 categories then copying them into 4 corresponding folders, just wondering if there's a faster/better/more_elegant way to do this:
find /home/user/sourcefiles -type f -size -400000k -exec /bin/cp -uv {} /home/user/medfiles/ \;
find... (0 Replies)
Discussion started by: unclecameron
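Four separate `find` runs scan the source tree four times. One pass that prints each file's size and routes it to the right folder scans only once; the thresholds and target folders below are illustrative assumptions, and `-printf '%s'` needs GNU find:

```shell
# One directory scan instead of four: emit "size path", then classify
find /home/user/sourcefiles -type f -printf '%s %p\n' |
while IFS=' ' read -r size path; do
    if   [ "$size" -lt 100000000 ]; then dest=/home/user/smallfiles
    elif [ "$size" -lt 400000000 ]; then dest=/home/user/medfiles
    elif [ "$size" -lt 800000000 ]; then dest=/home/user/bigfiles
    else                                 dest=/home/user/hugefiles
    fi
    cp -uv "$path" "$dest"/
done
```

This also replaces one `cp` process per file (`-exec ... \;`) with a single loop, though filenames containing newlines would need `-print0` handling.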
4. HP-UX
We have 30 GB files on our filesystem which we need to copy daily to 25 locations on the same machine (but on different filesystems).
cp is taking 20 minutes per copy and we have 5 different threads doing the copies.
So in all it's taking around 2 hours, and we need to reduce that.
Is there any... (9 Replies)
Discussion started by: shipra_31
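With 25 copies of the same 30 GB file, most of the time may go into re-reading the source over and over. `tee` can read the source once and write every destination in the same pass; the paths below are illustrative assumptions:

```shell
# Read the big file from disk a single time and fan it out to all
# destinations at once; tee writes each output as the data streams through
tee /fs01/big.dat /fs02/big.dat /fs03/big.dat \
    < /source/big.dat > /fs04/big.dat
```

Whether this wins depends on whether the bottleneck is the read side or the write side; if the 25 target filesystems sit on separate spindles, a few such fan-outs run in parallel may combine both benefits.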
5. Shell Programming and Scripting
Hi,
I have a script below for extracting xml from a file.
for i in *.txt
do
echo $i
awk '/<.*/ , /.*<\/.*>/' "$i" | tr -d '\n'
echo -ne '\n'
done
I read about using multi threading to speed up the script.
I do not know much about it but read it on this forum.
Is it a... (21 Replies)
Discussion started by: chetan.c
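A simple shell-level form of "multithreading" is to run each file's extraction as a background job and cap how many run at once. A sketch assuming bash (4.3+ for `wait -n`); the job limit of 4 and the `.xml` output suffix are arbitrary choices:

```shell
jobs_max=4
for i in *.txt; do
    # Same extraction as before, but as a background job per file
    ( awk '/<.*/ , /.*<\/.*>/' "$i" | tr -d '\n'; printf '\n' ) > "$i.xml" &
    # Throttle: when jobs_max extractions are running, wait for one to end
    while [ "$(jobs -r | wc -l)" -ge "$jobs_max" ]; do wait -n; done
done
wait    # let the final batch finish
```

This helps when the work is CPU-bound across multiple cores; if a single disk is the bottleneck, parallel jobs may not gain much.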
6. Shell Programming and Scripting
awk "/May 23, 2012 /,0" /var/tmp/datafile
The above command pulls information out of the datafile: everything from the specified date through to the end of the file.
Now, how can I make this faster if the datafile is huge? Even if it weren't huge, I feel there's a better/faster way to... (8 Replies)
Discussion started by: SkySmart
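The awk range pattern tests every line until the date matches. An alternative is to locate the first matching line number once and then stream the remainder with `tail`; `-m1` (stop after the first hit) is a GNU grep option:

```shell
# Find where the range starts, then dump from that line to end of file
start=$(grep -n -m1 'May 23, 2012 ' /var/tmp/datafile | cut -d: -f1)
[ -n "$start" ] && tail -n +"$start" /var/tmp/datafile
```

`grep` is typically faster at plain-string scanning than awk's pattern machinery, and `tail` copies the rest of the file without inspecting it.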
7. UNIX for Dummies Questions & Answers
Hello guys,
I'm cleaning out big XML files (we're talking about 1GB at least), most of them contain words written in a non-latin alphabet.
The command I'm using is so slow it's not even funny:
cat $1 | sed -e :a -e 's/<*>//g;/</N;//ba;s/</ /g;s/>/... (4 Replies)
Discussion started by: bobylapointe
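The `:a`/`N` loop keeps appending lines into sed's pattern space, which gets very slow on gigabyte files. If no tag spans a line break (an assumption about the input), a single non-looping substitution per line is enough, and the `cat` can go too:

```shell
# Replace each complete <...> tag with a space, one pass, no line joining
sed 's/<[^>]*>/ /g' "$1"
```

`[^>]*` never crosses a `>`, so each substitution stays local and sed never has to buffer more than one line.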
8. Shell Programming and Scripting
Hi,
I have a large number of input files with two columns of numbers.
For example:
83 1453
99 3255
99 8482
99 7372
83 175
I only wish to retain lines where the numbers fulfil two requirements. E.g:
column 1 = 83
1000 <= column 2 <= 2000
To do this I use the following... (10 Replies)
Discussion started by: s052866
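Both requirements map directly onto awk's numeric comparisons; a line is printed whenever the combined condition is true, with no action block needed. The file name is an assumption:

```shell
# Keep lines whose first field is 83 and whose second lies in [1000, 2000]
awk '$1 == 83 && $2 >= 1000 && $2 <= 2000' input.txt
```

On the sample data above, only `83 1453` survives: `83 175` fails the range test and the `99` lines fail the first-field test.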
9. Shell Programming and Scripting
Good evening
I'm new at unix shell scripting and I'm planning to write a shell script that removes headers from about 120 files in a directory, each file containing about 200000
lines on average.
I know I will loop over the files to process each one, and I've found in this great forum different solutions... (5 Replies)
Discussion started by: alexcol
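Assuming each header is exactly one line (an assumption; adjust the `+2` if headers are longer), `tail -n +2` starts output at line 2 and is a cheap way to strip it. The directory path is a placeholder:

```shell
# Drop the first line of every file; write to a temp file, then replace
for f in /path/to/dir/*; do
    tail -n +2 "$f" > "$f.tmp" && mv "$f.tmp" "$f"
done
```

With GNU sed, `sed -i '1d' "$f"` does the same edit in place without the explicit temp file.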
10. Shell Programming and Scripting
I have the below command, which reads a large file and takes 3 hours to run. Can something be done to make this command faster?
awk -F ',' '{OFS=","}{ if ($13 == "9999") print $1,$2,$3,$4,$5,$6,$7,$8,$9,$10,$11,$12 }' ${NLAP_TEMP}/hist1.out|sort -T ${NLAP_TEMP} |uniq>... (13 Replies)
Discussion started by: Peu Mukherjee
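One easy win is `sort -u`, which deduplicates during the sort itself and removes the separate `uniq` stage; filtering and trimming to 12 columns in awk first (as the command already does) keeps the data volume that `sort` must spill to `$NLAP_TEMP` as small as possible. A sketch of the reworked pipeline:

```shell
# Filter on column 13, emit columns 1-12, sort and deduplicate in one stage
awk -F',' -v OFS=',' '$13 == "9999" { print $1,$2,$3,$4,$5,$6,$7,$8,$9,$10,$11,$12 }' \
    "${NLAP_TEMP}/hist1.out" | sort -u -T "${NLAP_TEMP}"
```

If GNU sort is available, adding `--parallel=N` and a larger `-S` buffer can cut the sort time further; and if the output need not be ordered, deduplicating inside awk with an array (`!seen[$0]++`) avoids the sort entirely.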
LEARN ABOUT MOJAVE
mpscnnlossdatadescriptor
MPSCNNLossDataDescriptor(3) MetalPerformanceShaders.framework MPSCNNLossDataDescriptor(3)
NAME
MPSCNNLossDataDescriptor
SYNOPSIS
#import <MPSCNNLoss.h>
Inherits NSObject, and <NSCopying>.
Instance Methods
(nonnull instancetype) - init
Class Methods
(nullable MPSCNNLossDataDescriptor *) + cnnLossDataDescriptorWithData:layout:size:
Properties
MPSDataLayout layout
MTLSize size
NSUInteger bytesPerRow
NSUInteger bytesPerImage
Detailed Description
This depends on Metal.framework. The MPSCNNLossDataDescriptor specifies a loss data descriptor. The same descriptor can be used to
initialize both the labels and the optional weights data.
Method Documentation
+ (nullable MPSCNNLossDataDescriptor *) cnnLossDataDescriptorWithData:(NSData * __nonnull)data layout:(MPSDataLayout)layout size:(MTLSize)size
Makes a loss data descriptor. The bytesPerRow and bytesPerImage values are calculated automatically assuming a dense array. If the array is not
dense, set bytesPerRow and bytesPerImage to the correct values by changing those properties.
Parameters:
data The per-element loss data. The data must be in floating point format.
layout The data layout of loss data.
size The size of loss data.
Returns:
A valid MPSCNNLossDataDescriptor object, or nil on failure.
- (nonnull instancetype) init
Property Documentation
- bytesPerImage [read], [write], [nonatomic], [assign]
Slice bytes of loss data. This parameter specifies the slice bytes of loss data.
- bytesPerRow [read], [write], [nonatomic], [assign]
Row bytes of loss data. This parameter specifies the row bytes of loss data.
- layout [read], [nonatomic], [assign]
Data layout of loss data. See MPSImage.h for more information. This parameter specifies the layout of loss data.
- size [read], [nonatomic], [assign]
Size of loss data: {width, height, feature channels}. This parameter specifies the size of loss data.
Author
Generated automatically by Doxygen for MetalPerformanceShaders.framework from the source code.
Version MetalPerformanceShaders-100 Thu Feb 8 2018 MPSCNNLossDataDescriptor(3)