Recover data from 2 files then combine
Posted by saint65 on Saturday 12th of April 2008, 06:25:15 PM

Using dd or similar tools I am recovering data from 2 damaged CD-ROMs, and I need a way to then combine the 2 files, one from each CD, into a single good file. This all results from finding that certain CDs' tops scratch easily even when using the "proper" CD markers, which makes the file useless; the backup copy has the same problem. I am going to assume that the damage to the two files is in different places and attempt to recover what I can. Any ideas? The files are 26 MB each and were originally produced in a Corel drawing program. I can see the holes in the CD tops.
thanks in advance.
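
One possible approach, not mentioned in the original post, is GNU ddrescue rather than plain dd: ddrescue keeps a map file recording which sectors were read successfully, so a second pass over the backup disc, written into the same output image with the same map file, re-reads only the sectors that are still missing. The device and file names below are only examples.

    # first pass: pull whatever is readable from the first disc (2048-byte data sectors)
    ddrescue -b 2048 /dev/cdrom corel_cd.img rescue.map
    # swap in the backup disc and run the identical command again; only the sectors
    # still marked bad in rescue.map are retried, so good reads from the second copy
    # fill the holes left by the first
    ddrescue -b 2048 /dev/cdrom corel_cd.img rescue.map
    # loop-mount the repaired image and copy the 26 MB file out
    mount -o loop,ro corel_cd.img /mnt/cd

If only the two already-extracted files are available (for example from dd conv=noerror,sync, which zero-fills unreadable blocks), then cmp -l file_from_cd1 file_from_cd2 will at least list the byte offsets where the two copies disagree, which shows whether the damage really does fall in different places before attempting any merge.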
 

8 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

How to combine data files using for loop

Hi, I have 5 files, namely file1.txt, located in each of folders A to E respectively. I would like to extract the third column from each of these file1.txt files. I also want to extract the first and second columns, which are common to all of them. In other words, e.g ... (6 Replies)
Discussion started by: ahjiefreak
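
The preview above is cut off, but here is a minimal sketch of the kind of loop being asked about, assuming the directories really are named A to E, each holds a whitespace-separated file1.txt, and columns 1 and 2 are identical in every copy:

    # keep the shared key columns once, taken from folder A
    awk '{print $1, $2}' A/file1.txt > keys.txt
    # pull out column 3 from each folder's copy of file1.txt
    for d in A B C D E; do
        awk '{print $3}' "$d/file1.txt" > "col3.$d"
    done
    # glue the key columns and the five extracted columns together side by side
    paste keys.txt col3.A col3.B col3.C col3.D col3.E > combined.txt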

2. Shell Programming and Scripting

combine data of 2 files by variable

My first post ... please be gentle. I have been working on a script to get info out of MySQL. It's a support ticket system database, OTRS. I can write the subject of open tickets to a text file with a unique user id. I also have a text file with the unique user id, username and email address. I... (11 Replies)
Discussion started by: dicenl
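
A sketch of the usual join(1) approach for this kind of merge; the file names tickets.txt (user id plus ticket subject) and users.txt (user id, username, email address) are placeholders, and both are assumed to carry the shared user id in column 1:

    # join(1) needs both inputs sorted on the join field
    sort -k1,1 tickets.txt > tickets.sorted
    sort -k1,1 users.txt   > users.sorted
    # write each ticket line with the matching username and email address appended
    join -1 1 -2 1 tickets.sorted users.sorted > tickets_with_users.txt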

3. Red Hat

Recover RAID data

Hello, given a scenario: I have 2 HDDs which were used in a server with software RAID. The original server crashed and I have attached these 2 HDDs to a new server. Is there any chance of recovering the data from either of these HDDs? I want to mount /dev/sdb3 on some folder. Output of... (3 Replies)
Discussion started by: chinmay
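
A sketch only, assuming a Linux md software RAID whose members were the third partition of each disk; the real device names would have to be confirmed first (fdisk -l, mdadm --examine), and everything is kept read-only so the disks are not modified:

    # inspect the md superblocks to see what array the partitions belonged to
    mdadm --examine /dev/sdb3 /dev/sdc3
    # try to assemble the array read-only
    mdadm --assemble --readonly /dev/md0 /dev/sdb3 /dev/sdc3
    # mount it read-only and copy the data off
    mkdir -p /mnt/recovered
    mount -o ro /dev/md0 /mnt/recovered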

4. Shell Programming and Scripting

get data from files combine them to a file

Hi people, this is my file1.txt:
192.168.1.1
192.168.1.2
192.168.1.3
192.168.1.4
...
this is my file2.txt:
portnames
usernames
maxusercap
...
I want to write to file3.txt:
l ./getports 192.168.1.1 'get all;l+;get . portnames;l-'
l ./getports 192.168.1.1 'get all;l+;get . usernames;l-'
... (4 Replies)
Discussion started by: gc_sw
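
The preview is truncated, so this is only a guess at the intent: one getports command line written to file3.txt for every combination of an address from file1.txt and a field name from file2.txt.

    # outer loop over the addresses, inner loop over the field names
    while read -r ip; do
        while read -r field; do
            printf "l ./getports %s 'get all;l+;get . %s;l-'\n" "$ip" "$field"
        done < file2.txt
    done < file1.txt > file3.txt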

5. Shell Programming and Scripting

How to combine the data of files?

I have a main file as follows:
aaa 3/2 = 1.5
aba 55+6 = 61
aca 67+8 = 75
hjk 3+3 = 67
ghd 66+30 = 96
ghj 99-3 = 96
ffg 67+3 = 70
I have 4 sub files named sub1, sub2, sub3, sub4. Content of sub1:
aaa 23+5 = 28
hjk 45+6 = 51
ghd 40-20 = 20
... (2 Replies)
Discussion started by: jackevan

6. Shell Programming and Scripting

Combine/omit data from 2 files

I made a script on my own. This is for the inventory of all of my AWS servers, and I run it on all of my servers to get the hostname; please look at file2. Then I need some data in file3 as well. I need to combine them.
#cat file1
192.10.1.41 server.age.com
######
192.10.0.40 ssh cant... (10 Replies)
Discussion started by: kenshinhimura

7. Shell Programming and Scripting

Combine data from two files base on uniq data

File 1
ID Name Po1 Po2
DD134 DD134_4A_1 NN-1 L_0_1
DD134 DD134_4B_1 NN-2 L_1_1
DD134 DD134_4C_1 NN-3 L_2_1
DD142 DD142_4A_1 NN-1 L_0_1
DD142 DD142_4B_1 NN-2 L_1_1
DD142 DD142_4C_1 NN-3 L_2_1
DD142 DD142_3A_1 NN-41 L_3_1
DD142 DD142_3A_1 NN-42 L_3_2
File 2 (Combination of... (1 Reply)
Discussion started by: pareshkp
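
File 2's layout is cut off above, so the following is only a sketch of the usual awk lookup-join on the shared ID in column 1 (File1 and File2 are placeholder names). Because an ID such as DD134 appears on several lines of File 1, a real solution would have to append values per ID rather than overwrite them as this one-liner does.

    # first pass (NR==FNR) remembers the last File1 line seen for each ID;
    # second pass prints every File2 line with a matching ID, with that File1 line in front
    awk 'NR==FNR { line[$1] = $0; next } ($1 in line) { print line[$1], $0 }' File1 File2 > combined.txt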

8. Shell Programming and Scripting

Combine data out of 3 files into one new file

Hi, how can I combine the data of three files into one new file? I will try to give as much information as possible. The three existing files are called file1, file2 and file3; the new file should be named output_combined. The size of the files will be around 900 words/lines each .. but always... (5 Replies)
Discussion started by: MyMemberName
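
The preview ends mid-sentence, so it is unclear whether "combine" here means stacking the three files or pairing them up line by line; the two one-liners below are simply the two usual readings of the request:

    cat file1 file2 file3 > output_combined      # one file after another, roughly 2700 lines
    paste file1 file2 file3 > output_combined    # line N of each file side by side, tab-separated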
bup-damage(1)                       General Commands Manual                       bup-damage(1)

NAME
       bup-damage - randomly destroy blocks of a file

SYNOPSIS
       bup damage [-n count] [-s maxsize] [--percent pct] [-S seed] [--equal]

DESCRIPTION
       Use bup damage to deliberately destroy blocks in a .pack or .idx file (from
       .bup/objects/pack) to test the recovery features of bup-fsck(1) or other programs.

       THIS PROGRAM IS EXTREMELY DANGEROUS AND WILL DESTROY YOUR DATA

       bup damage is primarily useful for automated or manual tests of data recovery tools,
       to reassure yourself that the tools actually work.

OPTIONS
       -n, --num=numblocks
              the number of separate blocks to damage in each file (default 10). Note that
              it's possible for more than one damaged segment to fall in the same bup-fsck(1)
              recovery block, so you might not damage as many recovery blocks as you expect.
              If this is a problem, use --equal.

       -s, --size=maxblocksize
              the maximum size, in bytes, of each damaged block (default 1 unless --percent
              is specified). Note that because of the way bup-fsck(1) works, a multi-byte
              block could fall on the boundary between two recovery blocks, and thus damage
              two separate recovery blocks. In small files, it's also possible for a damaged
              block to be larger than a recovery block. If these issues might be a problem,
              you should use the default damage size of one byte.

       --percent=maxblockpercent
              the maximum size, in percent of the original file, of each damaged block. If
              both --size and --percent are given, the maximum block size is the minimum of
              the two restrictions. You can use this to ensure that a given block will never
              damage more than one or two git-fsck(1) recovery blocks.

       -S, --seed=randomseed
              seed the random number generator with the given value. If you use this option,
              your tests will be repeatable, since the damaged block offsets, sizes, and
              contents will be the same every time. By default, the random numbers are
              different every time (so you can run tests in a loop and repeatedly test with
              different damage each time).

       --equal
              instead of choosing random offsets for each damaged block, space the blocks
              equally throughout the file, starting at offset 0. If you also choose a correct
              maximum block size, this can guarantee that any given damage block never
              damages more than one git-fsck(1) recovery block. (This is also guaranteed if
              you use -s 1.)

EXAMPLE
       # make a backup in case things go horribly wrong
       cp -a ~/.bup/objects/pack ~/bup-packs.bak

       # generate recovery blocks for all packs
       bup fsck -g

       # deliberately damage the packs
       bup damage -n 10 -s 1 -S 0 ~/.bup/objects/pack/*.{pack,idx}

       # recover from the damage
       bup fsck -r

SEE ALSO
       bup-fsck(1), par2(1)

BUP
       Part of the bup(1) suite.

AUTHORS
       Avery Pennarun <apenwarr@gmail.com>.

Bup unknown-                                                                      bup-damage(1)