04-06-2011
Quote:
Originally Posted by
Mr.AIX
I have tried to use your suggested command, but it hung after some time.
You haven't said so until now. If so, what exactly was the error? The method I described (tar) has worked for me countless times, including on this much data and more.
Using "backup" and "restore", "cpio", or "savevg" will work equally well, and probably at roughly the same speed as "tar".
One thing, though: you can't cancel a job after some time and expect it to have finished the task. If you had to cancel it, it was still running, and if it was still running, it had not finished what it was doing.
I hope this helps.
bakunin
10 More Discussions You Might Find Interesting
1. UNIX for Dummies Questions & Answers
Hi,
As per my requirement, I need to take the difference between two big files (around 6.5 GB) and write the difference to an output file without any line numbers or '<' or '>' in front of each new line.
As the DIFF command won't work for big files, I tried to use BDIFF instead.
I am getting incorrect... (13 Replies)
Discussion started by: pyaranoid
13 Replies
2. UNIX for Advanced & Expert Users
Hi, I need a fast way to delete duplicate entries from very huge files (>2 GB); these files are plain text.
I tried all the usual methods (awk / sort / uniq / sed / grep ...), but it always ended with the same result (a memory core dump).
I'm using large HP-UX servers.
Any advice will... (8 Replies)
Discussion started by: Klashxx
8 Replies
3. UNIX for Dummies Questions & Answers
Dear All,
I am working on a Windows OS but connect remotely to a Linux machine. I wonder how to copy and paste part of a huge file on the Linux machine.
The content of the file looks as follows:
...
dump annealling all custom 10 anneal_*.dat id type x y z q
timestep 0.02
run 200000
Memory... (2 Replies)
Discussion started by: ariesto
2 Replies
4. Shell Programming and Scripting
Hi, all:
I've got two folders, say, "folder1" and "folder2".
Under each, there are thousands of files.
It's quite obvious that there are some files missing in each. I just would like to find them. I believe this can be done with the "diff" command.
However, if I change the above question a... (1 Reply)
Discussion started by: jiapei100
1 Replies
5. UNIX for Dummies Questions & Answers
Hi All,
HP-UX dev4 B.11.11 U 9000/800 3251073457
I need to copy huge data from a Windows text file into the vi editor. When I tried to copy it, the format of the data was not preserved and it appeared scattered throughout vi, something like what is shown below. Please let me know how I can correct this.
... (18 Replies)
Discussion started by: alok.behria
18 Replies
6. Shell Programming and Scripting
Hi
I have a shell script to copy a pattern of files from Linux to a Windows filesystem.
When I execute the command below
cp -av TOUT_05-02-13* Windows/Folder
`TOUT_05-02-13-19:02:37.tar.gz' -> `Windows/Folder/SYSOUT_05-02-13-19:02:37.tar.gz'
cp: cannot create regular file... (5 Replies)
Discussion started by: rakeshkumar
5 Replies
7. Shell Programming and Scripting
Hi, friends!
I am facing a hash total issue while processing a set of very large files:
Command used:
tail -n +2 <File_Name> |nawk -F"|" -v '%.2f' qq='"' '{gsub(qq,"");sa+=($156<0)?-$156:$156}END{print sa}' OFMT='%.5f'
Pipe-delimited file, and column 156 is used for hash totalling.... (14 Replies)
Discussion started by: Ravichander
14 Replies
8. Solaris
Dear Experts,
I would like to know the best method for copying around 3 million files (spread across a hundred folders, each file around 1 KB) between 2 servers.
I have already tried using rsync and tar, but these commands take too long.
Please advise.
Thanks
Edy (11 Replies)
Discussion started by: edydsuranta
11 Replies
9. Solaris
Gents,
I have a NAS file system mounted in Solaris as /Sysapp with a size of 8 TB.
The problem is that once the backup starts, it impacts the performance of the OS.
Do you have any idea how we can back up this FS quickly without impacting the OS?
Backup type: NetBackup (3 Replies)
Discussion started by: AbuAliiiiiiiiii
3 Replies
10. Solaris
Gents
I have a huge NAS file system mounted as /sys with a size of 10 TB, and I want to split it into separate 1 TB file systems to be mounted on the server.
How can I do that without changing anything on the source?
Please provide your support. (1 Reply)
Discussion started by: AbuAliiiiiiiiii
1 Replies
LEARN ABOUT MOJAVE
cancel
cancel(1) Apple Inc. cancel(1)
NAME
cancel - cancel jobs
SYNOPSIS
cancel [ -E ] [ -U username ] [ -a ] [ -h hostname[:port] ] [ -u username ] [ -x ] [ id ] [ destination ] [ destination-id ]
DESCRIPTION
The cancel command cancels print jobs. If no destination or id is specified, the currently printing job on the default destination is canceled.
OPTIONS
The following options are recognized by cancel:
-a Cancel all jobs on the named destination, or all jobs on all destinations if none is provided.
-E Forces encryption when connecting to the server.
-h hostname[:port]
Specifies an alternate server.
-U username
Specifies the username to use when connecting to the server.
-u username
Cancels jobs owned by username.
-x Deletes job data files in addition to canceling.
CONFORMING TO
Unlike the System V printing system, CUPS allows printer names to contain any printable character except SPACE, TAB, "/", or "#". Also,
printer and class names are not case-sensitive.
EXAMPLES
Cancel the current print job:
cancel
Cancel job "myprinter-42":
cancel myprinter-42
Cancel all jobs:
cancel -a
NOTES
Administrators wishing to prevent unauthorized cancellation of jobs via the -u option should require authentication for Cancel-Jobs operations in cupsd.conf(5).
SEE ALSO
cupsd.conf(5), lp(1), lpmove(8), lpstat(1), CUPS Online Help (http://localhost:631/help)
COPYRIGHT
Copyright (C) 2007-2017 by Apple Inc.
15 April 2014 CUPS cancel(1)