Full Discussion: squid cache directories
Operating Systems > Linux: Post 46144 by RajaRC on Friday 9th of January 2004, 01:36:17 AM

squid cache directories

Hi,

I have Red Hat Linux 9 with Squid installed on it, and three cache directories on my filesystem. The size of each varies:

/cache0  600 MB
/cache1 1024 MB
/cache2  900 MB

I have checked all three directories, and their disk space utilization is the same. Does that mean Squid copies the same files to all three directories, or does it keep different files in each one?

Regards,
Raja
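
For reference, a setup like the one described usually comes from cache_dir lines in squid.conf. A minimal sketch, assuming the default ufs store and Squid's stock 16/256 L1/L2 subdirectory counts (neither is given in the post):

    # squid.conf: one cache_dir line per cache directory
    # syntax: cache_dir <type> <path> <size-MB> <L1-dirs> <L2-dirs>
    cache_dir ufs /cache0  600 16 256
    cache_dir ufs /cache1 1024 16 256
    cache_dir ufs /cache2  900 16 256

    # compare on-disk utilization of the three stores
    du -sh /cache0 /cache1 /cache2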
 

9 More Discussions You Might Find Interesting

1. UNIX for Advanced & Expert Users

UBC cache vs. Metadata cache

Hi, what is the difference between the UBC cache and the metadata cache? Where can I find UBC cache hits and metadata cache hits in HP-UX? Thanks in advance for the help. (2 Replies)
Discussion started by: sushaga
2 Replies

2. Linux

Squid Cache Directory

Hi, I have Red Hat Linux 9 with Squid installed on it. I would like to know which storage format (ufs, diskd, async) I should use for the squid cache directory. Also, if possible, can anyone tell me which is the best? Regards, Raja (2 Replies)
Discussion started by: RajaRC
2 Replies
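
The format is selected by the first field of each cache_dir line. A sketch showing the same store declared with each of the three types the post mentions (path and size are illustrative):

    cache_dir ufs   /cache0 600 16 256   # classic synchronous store
    cache_dir aufs  /cache0 600 16 256   # threaded async I/O ("async")
    cache_dir diskd /cache0 600 16 256   # I/O via a separate diskd process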

3. Linux

squid dynamic cache

By default, Squid doesn't cache dynamic content, e.g. .asp files. I need to cache ASP files through my proxy; does anyone know how I can do this? (2 Replies)
Discussion started by: miguens
2 Replies
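
One common Squid 2.x approach, assuming the stock QUERY lines are present in squid.conf: commenting them out lets Squid apply its normal cacheability rules to dynamic URLs.

    # defaults that keep dynamic URLs out of the cache; comment them out
    #acl QUERY urlpath_regex cgi-bin \?
    #cache deny QUERY            # spelled "no_cache deny QUERY" in Squid 2.5
    #hierarchy_stoplist cgi-bin ?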

4. Web Development

squid -> deny cache of all dynamic asp websites

Hi, how do I configure Squid 2.6.STABLE5 to deny caching of .asp web pages? (3 Replies)
Discussion started by: ccc
3 Replies
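
A minimal sketch for Squid 2.6: an acl matching .asp paths plus a cache deny rule (the acl name asp_pages is chosen here for illustration):

    # match any URL whose path ends in .asp, case-insensitively
    acl asp_pages urlpath_regex -i \.asp$
    # fetch and serve such pages, but never store them
    cache deny asp_pages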

5. Linux

getting info on Cache Size, Data Cache etc..

Hi all, I saw on the Microsoft web site www.SysInternals.com a tool called CoreInfo that is able to print on screen the size of the data and instruction caches of your processor, the logical-to-physical processor mapping, the number of CPU sockets, etc. Do you know if in Linux is available a... (2 Replies)
Discussion started by: manustone
2 Replies
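
On Linux the same details are exposed by standard interfaces, so no extra tool is needed. A few ways to read them (sysfs layout can vary by kernel):

    # per-CPU cache level, type and size from sysfs
    grep . /sys/devices/system/cpu/cpu0/cache/index*/{level,type,size}
    # L1d/L1i/L2 sizes plus socket, core and thread topology
    lscpu
    # cache constants known to glibc
    getconf -a | grep -i CACHE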

6. Linux

File cache /Page cache Linux

Hi all, could anyone point out any open-source test suites for file cache testing, as well as performance test suites for the same? Currently my system is running Linux/ext4. Regards, Manish (0 Replies)
Discussion started by: hmanish
0 Replies
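
No suite is named in the thread, but the core of most page-cache benchmarks can be sketched with standard tools: time a cold read against a warm one (file path is illustrative; dropping caches requires root):

    # create a test file and flush it out of the page cache
    dd if=/dev/zero of=/tmp/pc.test bs=1M count=512
    sync
    echo 3 > /proc/sys/vm/drop_caches    # root only
    # first read comes from disk, second from the page cache
    time cat /tmp/pc.test > /dev/null
    time cat /tmp/pc.test > /dev/null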

7. Linux

how to configure Squid with ONE Network Card for cache and proxy

Hello all, how can I configure Squid with ONE network card for cache and proxy, as shown in the attached image? (1 Reply)
Discussion started by: jazaib92
1 Replies
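
A common single-NIC layout makes the Squid host the LAN's gateway and redirects port 80 locally. A sketch for Squid 2.6, assuming the card is eth0:

    # squid.conf: accept intercepted traffic on the proxy port
    http_port 3128 transparent
    # iptables: divert web traffic arriving on eth0 into Squid
    iptables -t nat -A PREROUTING -i eth0 -p tcp --dport 80 \
        -j REDIRECT --to-port 3128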

8. IP Networking

Squid vs iptables = no Squid access.log?

Hello, I have a pretty useless satellite link at home (far from any civilization), so I wanted to set up caching in order to speed things up. My Squid 2.6 runs "3128 transparent" and is set up quite well on a separate machine. I also have my dd-wrt router to move all port 80 traffic through... (0 Replies)
Discussion started by: theWojtek
0 Replies
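
If redirected requests do reach Squid but access.log stays empty, one thing worth confirming is that logging is enabled at all. The Squid 2.6 directive (the path shown is the usual default, not taken from the post):

    # squid.conf: log client requests in the native "squid" format
    access_log /var/log/squid/access.log squid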

9. Solaris

Giving read write permission to user for specific directories and sub directories.

I have searched this for quite a long time but couldn't find the right method to use. I need to assign read-write permission to a user for specific directories and their subdirectories and files. I do not want to use ACLs. This is for Solaris. Please help. (1 Reply)
Discussion started by: blinkingdan
1 Replies
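
Without ACLs, the usual Solaris route is group ownership plus a setgid bit on the directories. A sketch with illustrative group and path names:

    # give the tree to a group the user belongs to
    chgrp -R projgrp /export/data
    # rw on files, rwx on directories (X sets execute only where sensible)
    chmod -R g+rwX /export/data
    # setgid so files created later inherit the group
    find /export/data -type d -exec chmod g+s {} \;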
hardlink(1)                      General Commands Manual                      hardlink(1)

NAME
       hardlink - Consolidate duplicate files via hardlinks

SYNOPSIS
       hardlink [-c] [-n] [-v] [-vv] [-h] directory1 [ directory2 ... ]

DESCRIPTION
       This manual page documents hardlink, a program which consolidates duplicate
       files in one or more directories using hardlinks.

       hardlink traverses one or more directories searching for duplicate files.
       When it finds duplicate files, it uses one of them as the master. It then
       removes all other duplicates and places a hardlink for each one pointing to
       the master file. This allows for conservation of disk space where multiple
       directories on a single filesystem contain many duplicate files. Since hard
       links can only span a single filesystem, hardlink is only useful when all
       directories specified are on the same filesystem.

OPTIONS
       -c     Compare only the contents of the files being considered for
              consolidation. Disregards permission, ownership and other differences.

       -f     Force hardlinking across file systems.

       -n     Do not perform the consolidation; only print what would be changed.

       -v     Print summary after hardlinking.

       -vv    Print every hardlinked file and bytes saved. Also print summary after
              hardlinking.

       -h     Show help.

AUTHOR
       hardlink was written by Jakub Jelinek <jakub@redhat.com>. Man page written by
       Brian Long. Man page updated by Jindrich Novy <jnovy@redhat.com>.

BUGS
       hardlink assumes that its target directory trees do not change from under it.
       If a directory tree does change, this may result in hardlink accessing files
       and/or directories outside of the intended directory tree. Thus, you must
       avoid running hardlink on potentially changing directory trees, and
       especially on directory trees under control of another user.

                                                                          hardlink(1)
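
A usage sketch built from the options above (the directory is illustrative); the dry run reports what would be linked before anything is changed:

    # dry run: list duplicates without modifying anything
    hardlink -n -vv /srv/mirror
    # link by content only, printing a summary of space saved
    hardlink -c -v /srv/mirror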