I am writing a script that uses the "Total Bytes" field from hdiutil imageinfo -plist <file>. My intention is to get the total mounted size of a compressed dmg. It works for some images, but sometimes the value doesn't match up, particularly with larger (over 1 GB) images. Can anybody explain why this is happening, and whether there is a more accurate way to get the mounted size through the command line?
This first part of the script is supposed to get the total size of the disk image when mounted:
This part of the script restores images to their corresponding partitions. The size of the partitions is based on the total size of the mounted disk image.
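For context, the extraction step boils down to something like this (a minimal sketch; the plist fragment in the heredoc uses made-up numbers standing in for live hdiutil imageinfo -plist output, which is what the real script pipes in):

```shell
# Sketch: pull "Total Bytes" out of hdiutil-style plist output.
# The heredoc below is a made-up stand-in; in the real script the
# input would come from: hdiutil imageinfo -plist "$image"
plist=$(cat <<'EOF'
<key>Size Information</key>
<dict>
        <key>Total Bytes</key>
        <integer>5242880000</integer>
</dict>
EOF
)
# Print the line after the "Total Bytes" key, stripped to the bare number.
total_bytes=$(printf '%s\n' "$plist" |
        sed -n '/<key>Total Bytes<\/key>/{n;s/.*<integer>\([0-9]*\)<\/integer>.*/\1/p;}')
echo "$total_bytes"
```

On macOS you could also hand the plist to /usr/libexec/PlistBuddy instead of sed, but the sed one-liner keeps it portable.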
OS X ships a fairly old version of bash if I recall, one that still uses 32-bit integers for arithmetic. Try keeping a count of blocks or megabytes instead of raw bytes.
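If that's the cause, one workaround is to do the byte arithmetic outside the shell entirely, e.g. in awk, and only hand the shell a value in a smaller unit (a sketch; the byte count here is an arbitrary example):

```shell
# Convert a byte count to whole megabytes in awk, so the shell
# never has to do arithmetic on the raw (possibly >2^31) value.
bytes=5368709120                       # example: 5 GiB in bytes
mb=$(awk -v b="$bytes" 'BEGIN { printf "%d", b / 1048576 }')
echo "$mb MB"
```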
Unfortunately, I don't think hdiutil reports a block count for disk images. Do you know of another way to get this information? I can find it for the compressed image file, but not for the size the image is when mounted.
Do you mean "the total uncompressed size of a compressed dmg" ?
Please post the exact version of your OS (blanking anything confidential, like machine names, with X's).
Please post sample output from your hdiutil command for the largest file involved:
Quote:
The size of the partitions is based on the total size of the mounted disk image.
What are the mathematical calculations for this? Are any of these "partitions" larger than exactly 2 GB?
The important field here is "Total Bytes." There are no mathematical calculations; it is parsed directly from the hdiutil -plist output. Pretty much every image I have is larger than 2 GB. These are service diagnostics, and each one is basically a fully functioning bootable OS. I need to get multiple drives set up with 30+ of these images, which is why I wrote the script. Everything works perfectly except this one operation. Hopefully I (or we!) can find a solution!
---------- Post updated at 06:38 PM ---------- Previous update was at 06:24 PM ----------
I'm thinking that we aren't going to find a way to fix this issue directly. Unless anyone has any other suggestions, I think I might have to look at rounding the size up to the nearest GB and using that instead. I don't think I can do that in bash, but I'm pretty sure I can figure out how to do it in awk. I know this isn't the scripting forum, but if one of you has any insight on how to do that I'd appreciate it.
Of course, I'm still open to any other solutions anyone comes up with.