Shell Programming and Scripting: how to remove hacking code from multiple files
Post 302709565 by Corona688 on Wednesday 3rd of October 2012, 11:07:14 AM
Yeah, restore from backup or re-install. Would you ever really trust those files again? I wouldn't.
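For anyone weighing that advice, here is a minimal sketch (the paths are only examples) of gauging how far the tampering spread by comparing the live tree against a known-good backup before restoring:

# report every file that differs between the live docroot and the clean backup copy
diff -qr /var/www/html /backup/www/html

Anything flagged there is a file you can no longer trust, which is exactly why restoring the whole tree is the simpler fix.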
 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

read list of filenames from text file and remove these files in multiple directories

I have a large list of filenames from an Excel sheet, which I then translate into a simple text file. I'd like to use this list, which contains various file extensions , to archive these files and then remove them recursively through multiple directories and subdirectories. So far, it looks like... (5 Replies)
Discussion started by: fxvisions
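A minimal sketch of the removal step, assuming the list is one bare filename per line in a file called files_to_remove.txt (that name is an assumption) and the walk starts from the current directory; an archive step such as tar could be added before the rm:

#!/bin/sh
# delete every file below . whose name appears in the list
while IFS= read -r name; do
    [ -n "$name" ] || continue                     # skip blank lines
    find . -type f -name "$name" -exec rm -v {} +
done < files_to_remove.txt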

2. Shell Programming and Scripting

How to remove certain lines in multiple txt files?

Hi, I have these files: BGH.28OCT2008.00000001.433155.001 BGH.28OCT2008.00000002.1552361.001 BGH.28OCT2008.00000003.1438355.001 BGH.28OCT2008.00000004.1562602.001 They contain the following: 5Discounts 6P150 - Max Total Usage RM150|-221.00 P150 EPP - Talktime RM150... (5 Replies)
Discussion started by: olloong
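If the goal is to delete every line matching a pattern across all of those BGH.* files, a hedged sketch (the pattern P150 is only an example):

# drop matching lines from each file, editing via a temp copy for portability
for f in BGH.*; do
    sed '/P150/d' "$f" > "$f.tmp" && mv "$f.tmp" "$f"
done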

3. UNIX for Dummies Questions & Answers

Using AWK: Extract data from multiple files and output to multiple new files

Hi, I'd like to process multiple files. For example: file1.txt file2.txt file3.txt Each file contains several lines of data. I want to extract a piece of data and output it to a new file. file1.txt ----> newfile1.txt file2.txt ----> newfile2.txt file3.txt ----> newfile3.txt Here is... (3 Replies)
Discussion started by: Liverpaul09
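A minimal per-file loop along those lines, assuming (purely for illustration) that the piece of data to extract is the second field of each line:

# extract field 2 from file1.txt, file2.txt, ... into newfile1.txt, newfile2.txt, ...
for f in file*.txt; do
    awk '{ print $2 }' "$f" > "new$f"
done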

4. Shell Programming and Scripting

To remove multiple files in FTP

We have files on an FTP server. After getting the files from FTP with mget *.*, I have to remove all the files (multiple files) at once. Is there any command to delete multiple files at once? (2 Replies)
Discussion started by: nani1984
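Command-line ftp has mdelete for exactly this; with -i (no per-file prompting) a single here-document can clear the directory. The host and credentials below are placeholders:

# -i: no prompting for mdelete, -n: no auto-login, -v: verbose
ftp -inv ftp.example.com <<'EOF'
user myuser mypassword
mdelete *
bye
EOF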

5. UNIX for Dummies Questions & Answers

How to remove characters from multiple .txt files

Friends, I want to remove characters from multiple .txt files. For example: in these .txt files there are many "ctrl m" characters at the end of each line. I want to remove "ctrl m" from every line in all the .txt files. Need your help regarding this. (4 Replies)
Discussion started by: meetsubhas
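Those "ctrl m" characters are DOS carriage returns; dos2unix removes them if it is installed, and otherwise tr will. A sketch over all .txt files in one directory:

# strip carriage returns (^M) from every .txt file, via a temp copy
for f in *.txt; do
    tr -d '\r' < "$f" > "$f.tmp" && mv "$f.tmp" "$f"
done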

6. Shell Programming and Scripting

Remove java code from multiple files

Hello, We have a client who has had an FTP injection attack on their account. Over 600 files have had this code added to them: <script>var t="";var... (10 Replies)
Discussion started by: dhasbro
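When the injected block is a single known literal on one line, a hedged sketch of stripping it in bulk (the pattern below is only a placeholder for the real injected code, and every file is backed up first); as the answer at the top of this page says, restoring from a clean backup is still the safer fix:

# list the affected files, then cut the injected <script>...</script> block out of each
grep -rl '<script>var t=""' . | while IFS= read -r f; do
    cp "$f" "$f.bak"                                   # keep an untouched copy
    sed 's/<script>var t="".*<\/script>//' "$f.bak" > "$f"
done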

7. Shell Programming and Scripting

Code to remove files when the corresponding file doesn't exist isn't working

I am trying to add some code to the beginning of a script so that it will remove all the .transcript files when there is no corresponding .wav file, but it doesn't work. This is the code I have added: for transcriptfile in `$voicemaildir/*.transcript`; do wavfile=`echo $transcriptfile | cut -d'.'... (2 Replies)
Discussion started by: ghurty
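The backticks around the glob are what break that loop (they try to execute the file list as a command); a sketch of the intended logic, assuming each .wav shares the .transcript file's basename:

# remove each .transcript file that has no matching .wav next to it
for transcriptfile in "$voicemaildir"/*.transcript; do
    [ -e "$transcriptfile" ] || continue               # glob matched nothing
    wavfile="${transcriptfile%.transcript}.wav"
    [ -e "$wavfile" ] || rm -v "$transcriptfile"
done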

8. Shell Programming and Scripting

How to remove hidden backslash in multiple files?

Hi, I have around 300 files in a folder. When I type ls -l I see the following: Mouse.chr10_+_:101862321-101863928.maf Mouse.chr10_+_:101862322-101863928.maf Mouse.chr10_+_:101862323-101863928.maf But when I run my scripts, they cannot recognise the filenames because of a hidden backslash, like... (5 Replies)
Discussion started by: quincyjones
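If the stray character really is a literal backslash embedded in the name, a sketch that renames the files by deleting it:

# strip any backslash characters out of each filename
for f in Mouse.chr10_*; do
    new=$(printf '%s' "$f" | tr -d '\\')
    [ "$f" = "$new" ] || mv -v -- "$f" "$new"
done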

9. Shell Programming and Scripting

[Solved] How to remove multiple files?

Hi Gurus, I have the below files in one directory. The file name has a date and time portion, which is exactly when the file was created. I need to keep only the latest created file, which is abc_20140101_1550, and remove the rest of the files. abc_20140101_1300 abc_20140101_1200 abc_20140101_1400 abc_20140101_1500... (2 Replies)
Discussion started by: ken6503
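Because the date and time are embedded in the name, a plain reverse sort puts the newest abc_* file first; a sketch that keeps it and removes the rest (assuming the names contain no whitespace, as shown):

# newest name sorts first with sort -r; delete everything after it
ls abc_* | sort -r | tail -n +2 | while IFS= read -r f; do
    rm -v "$f"
done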

10. UNIX for Beginners Questions & Answers

How to implement a simple command/code for multiple files?

I have been extracting a row, based on multiple keywords, from an xls/csv file by using the following command. I have to implement the same for multiple xls/csv files, therefore please help me do the same. awk ' { tbp=0 if ($0 ~ keyword1 && k1 == 0) { tbp=1; k1++ } if ($0 ~ keyword2... (2 Replies)
Discussion started by: dineshkumarsrk
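One way is to leave the awk as it is and wrap it in a shell loop so it runs once per file, writing one output per input; the awk body below is a simplified stand-in for the original, and the keyword values and output naming are assumptions:

# run the keyword extraction once per csv, producing <name>_extracted.csv
for f in *.csv; do
    awk -v k1="keyword1" -v k2="keyword2" '
        $0 ~ k1 || $0 ~ k2 { print }               # emit rows matching either keyword
    ' "$f" > "${f%.csv}_extracted.csv"
done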