Help with HDFS Linux permission
Post 302996068 by desind on Tuesday 18th of April 2017 11:30:18 AM
I agree that I cannot change another user's home directory. But shouldn't I be able to change the permissions of my own home directory?

This desind directory in HDFS is my own directory:

Code:
/adhoc/desind
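For reference: HDFS lets the owner (or the HDFS superuser) change a directory's mode with hdfs dfs -chmod; what you cannot touch are directories you don't own, such as the parent. A minimal sketch against the /adhoc/desind path above, assuming the hdfs client is on the PATH and you own the directory:

Code:
# Check the current owner and mode of the directory itself (-d)
hdfs dfs -ls -d /adhoc/desind

# Change the mode; succeeds only if you are the owner or superuser
hdfs dfs -chmod 750 /adhoc/desind

# Verify the change
hdfs dfs -ls -d /adhoc/desind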

 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

Need help how to copy few records from hdfs to UNIX

Hi All, I am facing one issue here... I have a huge file in the Hadoop file system. Some disk space issues are there, that's why I want to copy the first 100 records from HDFS to local UNIX. I tried the below command but it is not working. It's giving an error like cat: Unable to write to output stream. If anyone can... (2 Replies)
Discussion started by: STCET22
2 Replies
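For what it's worth, the usual way to pull just the first 100 records is to stream the file and truncate it client-side; the "Unable to write to output stream" message is typically just cat complaining after head closes the pipe. A sketch, with a hypothetical HDFS path:

Code:
# Stream from HDFS, keep the first 100 lines locally; cat's broken-pipe
# complaint after head exits is harmless and silenced here
hdfs dfs -cat /data/huge_file 2>/dev/null | head -n 100 > /tmp/first_100.txt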

2. Red Hat

SE Linux write permission denied

Hi, on my server I am getting the below errors in "/var/log/messages": Oct 8 14:45:44 LKOGOMEEMM01 kernel: type=1400 audit(1444295744.792:15818): avc: denied { write } for pid=53421 comm="ip" path="/var/VRTSvcs/log/tmp/IPMultiNIC-8" dev=dm-0 ino=2754879 scontext=system_u:system_r:ifconfig_t:s0... (4 Replies)
Discussion started by: rochitsharma
4 Replies
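Denials like the quoted AVC record can be inspected and, if legitimate, converted into a local policy module with the standard audit tools; a sketch, assuming audit2allow (policycoreutils) is installed, with a made-up module name:

Code:
# Show recent AVC denials for the "ip" command seen in the log
ausearch -m avc -c ip --start recent

# Generate and load a local policy module covering those denials
ausearch -m avc -c ip | audit2allow -M vrtsvcs_ip_local
semodule -i vrtsvcs_ip_local.pp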

3. Shell Programming and Scripting

Change file permission of mounted drive Linux

I have a problem with the permissions on a mounted 2TB drive in my Linux/Mint system. All the files in every folder are 777, which is not what I want. My fstab line for this disk is: UUID=90803E0C803DF974 /media/grape/Workspace1_ntfs ntfs auto,users,permissions 0 0 and blkid gave me: $> blkid ... (4 Replies)
Discussion started by: yifangt
4 Replies
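Since ntfs-3g does not persist per-file POSIX modes by default, the usual fix is to impose an owner and masks at mount time rather than chmod after the fact; a hedged fstab sketch reusing the UUID from the post (the uid/gid of 1000 is an assumption):

Code:
# /etc/fstab: directories become 0755 and files 0644, owned by uid/gid 1000
UUID=90803E0C803DF974 /media/grape/Workspace1_ntfs ntfs-3g auto,users,uid=1000,gid=1000,dmask=022,fmask=133 0 0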

4. UNIX for Advanced & Expert Users

UNIX and HDFS - file systems on same partition.

I am learning Hadoop. As a part of that, HDFS (the Hadoop Distributed File System) has commands similar to UNIX, with which we can create, copy, and move files from the UNIX/Linux file system to HDFS. My questions are: 1) how can two file systems (UNIX and HDFS) coexist on the same partition? 2) What if a block used... (0 Replies)
Discussion started by: Narendra Eliset
0 Replies
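The short answer to 1) is that HDFS is not a kernel file system occupying its own partition: the DataNode stores every HDFS block as an ordinary file under the local directory configured by dfs.datanode.data.dir in hdfs-site.xml. A sketch of how to see this, assuming a hypothetical data directory of /hadoop/dfs/data:

Code:
# Each HDFS block is just a regular Linux file named blk_<id>
find /hadoop/dfs/data -name 'blk_*' | head -n 5

# The same partition serves both views: the local fs...
df -h /hadoop/dfs/data
# ...and the HDFS namespace layered on top of it
hdfs dfs -ls /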

5. UNIX for Beginners Questions & Answers

UNIX and HDFS - file systems on same partition.

I am learning Hadoop. As a part of that, HDFS (the Hadoop Distributed File System) has commands similar to UNIX, with which we can create, copy, and move files from the UNIX/Linux file system to HDFS. My questions are: 1) how can two file systems (UNIX and HDFS) coexist on the same partition? 2) What if a block... (1 Reply)
Discussion started by: Narendra Eliset
1 Replies

6. Shell Programming and Scripting

Strange permission issue on Linux server.

Starting sendmail throws this error: I decided to check the permissions on the files under /etc/mail and this is the output: # ls -ltr total 284 -rwxr-xr-x. 1 root root 1847 Jan 27 2014 virtusertable -rwxr-xr-x. 1 root root 127 Jan 27 2014 trusted-users -rwxr-xr-x. 1 root root 92... (3 Replies)
Discussion started by: mohtashims
3 Replies
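When the modes themselves look fine (as they do here), the trailing '.' in the ls output is a reminder that SELinux labels also gate access; a hedged check, in case the files under /etc/mail have lost their expected contexts:

Code:
# Show the SELinux context alongside the permissions
ls -lZ /etc/mail

# Reset the contexts under /etc/mail to the policy defaults
restorecon -Rv /etc/mail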

7. Shell Programming and Scripting

Read CSV file and delete hdfs, hive and hbase tables

I have a CSV file with HDFS directories, Hive tables, and HBase tables: 1. first column - HDFS directories 2. second column - Hive tables 3. third column - HBase tables. I have to go through the CSV file, look at the first column, and delete the HDFS directory from the HDFS path; now... (2 Replies)
Discussion started by: shivamayam
2 Replies
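A skeleton for that cleanup loop might look like the following, assuming a comma-separated file named cleanup.csv with the three columns described (the file name and all table names are hypothetical):

Code:
#!/bin/sh
# cleanup.sh - read hdfs_dir,hive_table,hbase_table per CSV line and drop each
while IFS=, read -r hdfs_dir hive_table hbase_table; do
    # </dev/null stops these commands from consuming the loop's stdin
    hdfs dfs -rm -r -skipTrash "$hdfs_dir" < /dev/null
    hive -e "DROP TABLE IF EXISTS $hive_table" < /dev/null
    hbase shell <<EOF
disable '$hbase_table'
drop '$hbase_table'
EOF
done < cleanup.csv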

8. Shell Programming and Scripting

How to check total files size in hdfs directory?

Is there a way to calculate the total file size of an HDFS directory in GB or MB? I don't want to use the du/df command. Without that, is there a way? HDFS Directory - /test/my_dir (1 Reply)
Discussion started by: rohit_shinez
1 Replies
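Besides du, hdfs dfs -count reports a directory's total content size in bytes, which can then be scaled by hand; a sketch using the /test/my_dir path from the post:

Code:
# -count columns: DIR_COUNT FILE_COUNT CONTENT_SIZE PATHNAME
bytes=$(hdfs dfs -count /test/my_dir | awk '{print $3}')
awk -v b="$bytes" 'BEGIN { printf "%.2f GB (%.0f MB)\n", b/1024^3, b/1024^2 }'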

9. Shell Programming and Scripting

Shell Script for HDFS Ingestion Using JDBC

Peers, I was in the process of building a script that connects to Salesforce using JDBC, pulls the data using Spark, and processes it into a Hive table. During this process I encountered a problem where a variable assigned the output of a hadoop command that lists files in Azure Data Lake is not parsing the... (2 Replies)
Discussion started by: jg355187
2 Replies
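The "variable not parsing" symptom is very often a quoting issue around command substitution; a minimal sketch, with a hypothetical Data Lake path:

Code:
# Capture the listing (skip the "Found N items" header, keep the path column)
files=$(hadoop fs -ls "adl://mylake.azuredatalakestore.net/raw" | awk 'NR>1 {print $NF}')

# Quote the expansion so the embedded newlines survive word splitting
printf '%s\n' "$files" | while read -r f; do
    echo "would process: $f"
done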

10. Shell Programming and Scripting

To get latest hdfs file system

Hi All, I have the below HDFS file system: /user/home/dte=2019_01_30/part_1 /user/home/dte=2019_01_30/part_2 /user/home/dte=2019_01_31/part_1 I need to take the latest month's HDFS folder while passing a date as a parameter. For e.g., if I pass the Feb month, i.e. 20190201 (YYYYMMDD), then... (0 Replies)
Discussion started by: Master_Mind
0 Replies
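One hedged way to resolve "latest folder for the month of a YYYYMMDD parameter" is to filter the dte= partition names and sort them; a sketch assuming the /user/home layout from the post (the script name is made up):

Code:
#!/bin/sh
# Usage: ./latest_part.sh 20190201   (any date inside the target month)
month=$(echo "$1" | cut -c1-6)                                   # 20190201 -> 201902
prefix="dte=$(echo "$month" | sed 's/^\(....\)\(..\)$/\1_\2/')"  # -> dte=2019_02

# Zero-padded names sort chronologically, so the last match is the latest
latest=$(hdfs dfs -ls /user/home 2>/dev/null \
         | awk '{print $NF}' | grep "/$prefix" | sort | tail -n 1)
echo "latest partition for $month: $latest"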
PAM_CHROOT(8)                BSD System Manager's Manual                PAM_CHROOT(8)

NAME
     pam_chroot -- Chroot PAM module

SYNOPSIS
     [service-name] module-type control-flag pam_chroot [arguments]

DESCRIPTION
     The chroot service module for PAM chroots users into either a
     predetermined directory or one derived from their home directory.  If a
     user's home directory as specified in the passwd structure returned by
     getpwnam(3) contains the string ``/./'', the portion of the directory
     name to the left of that string is used as the chroot directory, and the
     portion to the right will be the current working directory inside the
     chroot tree.  Otherwise, the directories specified by the dir and cwd
     options (see below) are used.

     also_root       Do not hold user ID 0 exempt from the chroot requirement.

     always          Report a failure if a chroot directory could not be
                     derived from the user's home directory, and the dir
                     option was not specified.

     cwd=directory   Specify the directory to chdir(2) into after a successful
                     chroot(2) call.

     dir=directory   Specify the chroot directory to use if one could not be
                     derived from the user's home directory.

SEE ALSO
     pam.conf(5), pam(8)

AUTHORS
     The pam_chroot module and this manual page were developed for the FreeBSD
     Project by ThinkSec AS and NAI Labs, the Security Research Division of
     Network Associates, Inc. under DARPA/SPAWAR contract N66001-01-C-8035
     (``CBOSS''), as part of the DARPA CHATS research program.

BSD                             February 10, 2003                             BSD