Live Upgrade Patching Error: Unable to write vtoc
# 1  
Old 10-31-2012

I'm attempting to patch several servers using Live Upgrade.

Release: Oracle Solaris 10 8/11 s10x_u10wos_17b X86

The error I'm receiving is shown in the log excerpt below:

Code:
tail -15 /var/svc/log/rc6.log
Legacy init script "/etc/rc0.d/K50pppd" exited with return code 0.
Executing legacy init script "/etc/rc0.d/K52llc2".
Legacy init script "/etc/rc0.d/K52llc2" exited with return code 0.
Executing legacy init script "/etc/rc0.d/K62lu".
Live Upgrade: Deactivating current boot environment <sol10_U10x86>.
Live Upgrade: Executing Stop procedures for boot environment <sol10_U10x86>.
Live Upgrade: Current boot environment is <sol10_U10x86>.
Live Upgrade: New boot environment will be <patched_05102012>.
Live Upgrade: Activating boot environment <patched_05102012>.
ERROR: Live Upgrade: Unable to write the vtoc for boot environment
<patched_05102012> to device </dev/rdsk/c0t5000C50039FEF187d0s2>.
Partition 0 not aligned on cylinder boundary: "0 2 00 20352 286657920"
ERROR: Live Upgrade: Activation of boot environment <patched_05102012> FAILED.
Legacy init script "/etc/rc0.d/K62lu" exited with return code 0.
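For reference, the label the error refers to can be dumped with prtvtoc; reading a label is harmless, and the device path below is just the one from the error message:

Code:
# print the VTOC of the disk named in the activation error
prtvtoc /dev/rdsk/c0t5000C50039FEF187d0s2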

---------- Post updated at 04:10 PM ---------- Previous update was at 12:00 PM ----------

There is a very big difference in root dataset size between the current BE and the patched BE:

Code:
# lufslist sol10_U10_101512
               boot environment name: sol10_U10_101512

Filesystem              fstype    device size Mounted on          Mount Options
----------------------- -------- ------------ ------------------- --------------
/dev/zvol/dsk/rpool/swap swap      21474836480 -                   -
rpool/ROOT/sol10_U10_101512 zfs        8365630464 /                   -
rpool/export            zfs          21292032 /export             -
rpool/export/home       zfs          21259264 /export/home        -
rpool                   zfs       32889929728 /rpool              -
# lufslist sol10_U10x86
               boot environment name: sol10_U10x86
               This boot environment is currently active.
               This boot environment will be active on next system boot.

Filesystem              fstype    device size Mounted on          Mount Options
----------------------- -------- ------------ ------------------- --------------
/dev/zvol/dsk/rpool/swap swap      21474836480 -                   -
rpool/ROOT/sol10_U10x86 zfs          59540480 /                   -
rpool/export            zfs          21292032 /export             -
rpool/export/home       zfs          21259264 /export/home        -
rpool                   zfs       32889929728 /rpool              -
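The device size column lufslist prints for ZFS datasets appears to track the dataset's space usage, so a cross-check straight from ZFS should show the same gap (a sketch, assuming the default rpool/ROOT layout):

Code:
# list space used/referenced by each boot environment's root dataset
zfs list -o name,used,refer -r rpool/ROOT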

# 2  
Old 11-01-2012
Partition 0 not aligned on cylinder boundary - comp.unix.solaris

Maybe the cylinder-boundary misalignment is what inflated the size?
# 3  
Old 11-02-2012
awwgghh I may have to re-install these servers... how did this happen?
# 4  
Old 11-02-2012
Why? There are tools...
# 5  
Old 11-02-2012
It seems the disk layout is incorrect.

Code:
* /dev/rdsk/c0t5000C50039FEF187d0s0 partition map
*
* Dimensions:
*     512 bytes/sector
*      63 sectors/track
*     255 tracks/cylinder
*   16065 sectors/cylinder
*   17847 cylinders
*   17845 accessible cylinders
*
* Flags:
*   1: unmountable
*  10: read-only
*
* Unallocated space:
*         First     Sector       Last
*        Sector      Count     Sector
*     286678272       1653  286679924
*
*                          First     Sector       Last
* Partition  Tag  Flags    Sector     Count     Sector  Mount Directory
       0      2    00      20352  286657920  286678271
       2      5    00          0  286678272  286678271
       8      1    01          0      20352      20351

The prtvtoc output shows 16065 sectors per cylinder. If I multiply that by the number of accessible cylinders I get 16065 * 17845 = 286679925 total sectors, while the last sector reported is 286679924. That off-by-one is expected, though: sectors are numbered from 0, so the last sector is always the total count minus 1.
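The same arithmetic in shell, plus the alignment check the error message is actually complaining about; all numbers come from the prtvtoc output above:

Code:
# total sectors implied by the geometry
echo $(( 16065 * 17845 ))   # 286679925 (sectors 0..286679924)
# does slice 0 start on a cylinder boundary?
echo $(( 20352 % 16065 ))   # 4287 -> non-zero, so it does not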

The disk is currently only partitioned out to sector 286678271, and that unallocated space now appears to be what's causing my issue, or rather LU's issue.
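Before attempting any repair, it seems prudent to save the current label; prtvtoc output is the same format fmthard(1M) accepts, so the saved file can restore (or, once edited, correct) the table. This is only a sketch, not a recommendation for a disk holding a live rpool:

Code:
# back up the current label; read-only and safe
prtvtoc /dev/rdsk/c0t5000C50039FEF187d0s2 > /var/tmp/c0t5000C50039FEF187d0.vtoc
# only after preparing a corrected table: write it back with fmthard
# fmthard -s /var/tmp/c0t5000C50039FEF187d0.vtoc /dev/rdsk/c0t5000C50039FEF187d0s2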

I guess I could make a flash archive of the system and use JET with the FLAR.
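A minimal flarcreate invocation for that route might look like this (the archive name and destination path are placeholders; -c compresses the archive):

Code:
# create a compressed flash archive of the running system
flarcreate -n sol10_backup -c /var/tmp/sol10_backup.flar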
# 6  
Old 11-02-2012
Can you fix it by trimming the partition size after a defrag?