Full Discussion: Hadoop file system
Post 302924222 by gull04 on 7 November 2014
Hi Stew,

The script does not create Hadoop Distributed File System (HDFS) volumes; it creates standard Linux ext4 file systems, which Hadoop then uses for its local storage. Below you can see the output from the command:

Code:
fdisk -l /dev/sda

Under the "Id" column you will see the value "83", which identifies the partition type as native Linux; the ext4 file system itself is then created on top of the partition (see the sketch after the fdisk output below).

Code:
[root@ekbit13 ~]# fdisk -l /dev/sda

Disk /dev/sda: 300.0 GB, 300000000000 bytes
255 heads, 63 sectors/track, 36472 cylinders
Units = cylinders of 16065 * 512 = 8225280 bytes
Sector size (logical/physical): 512 bytes / 512 bytes
I/O size (minimum/optimal): 512 bytes / 512 bytes
Disk identifier: 0x12909dab

   Device Boot      Start         End      Blocks   Id  System
/dev/sda1   *           1          33      262144   83  Linux
Partition 1 does not end on cylinder boundary.
/dev/sda2              33        8388    67108864   83  Linux
/dev/sda3            8388       16743    67108864   83  Linux
/dev/sda4           16743       36473   158487854    5  Extended
/dev/sda5           16743       25097    67108864   83  Linux
/dev/sda6           25098       33452    67108864   83  Linux
/dev/sda7           33452       34497     8388608   82  Linux swap / Solaris
/dev/sda8           34497       36473    15877120   83  Linux
[root@ekbit13 ~]#
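
The partitions above only carry a type flag; a script like this would then make ext4 file systems on them and mount them. As a rough sketch only (the device name /dev/sdX1 and mount point /data/1 are placeholders, not values from your system), the steps look like this:

Code:
# Make an ext4 file system on a spare partition (placeholder device name).
mkfs.ext4 /dev/sdX1

# Create a mount point and mount the new file system.
mkdir -p /data/1
mount -t ext4 /dev/sdX1 /data/1

# Optionally make the mount persistent across reboots.
echo '/dev/sdX1  /data/1  ext4  defaults,noatime  0 0' >> /etc/fstab

Hadoop's DataNode can then be pointed at a directory under such a mount (see the note after the mount output below).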

To see how the file systems are used, you can run the "mount" command like this:

Code:
[root@ekbit13 ~]# mount
/dev/sda2 on / type ext4 (rw)
proc on /proc type proc (rw)
sysfs on /sys type sysfs (rw)
devpts on /dev/pts type devpts (rw,gid=5,mode=620)
tmpfs on /dev/shm type tmpfs (rw)
/dev/sda1 on /boot type ext4 (rw)
/dev/sdb1 on /home type ext4 (rw)
/dev/sda3 on /opt type ext4 (rw)
/dev/sda8 on /tmp type ext4 (rw)
/dev/sda5 on /usr/local type ext4 (rw)
/dev/sda6 on /var type ext4 (rw)
none on /proc/sys/fs/binfmt_misc type binfmt_misc (rw)
sunrpc on /var/lib/nfs/rpc_pipefs type rpc_pipefs (rw)
gvfs-fuse-daemon on /root/.gvfs type fuse.gvfs-fuse-daemon (rw,nosuid,nodev)
[root@ekbit13 ~]#
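
Hadoop does not replace these file systems; HDFS stores its blocks as ordinary files under local directories, and on Hadoop 2.x that location is set by the dfs.datanode.data.dir property in hdfs-site.xml. As a rough check, assuming the hdfs command is on your PATH (the mount points shown are just the ones from the output above), you could run:

Code:
# Show which local directories the DataNode uses for block storage.
hdfs getconf -confKey dfs.datanode.data.dir

# Check free space on the underlying ext4 file systems.
df -h /home /opt /var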

Hope that helps.

Regards

Dave
 
