SQUID :- Maybe you have a broken user ID in your


 

Hi All,

I have Squid running as the proxy server in our environment, and SARG is configured so that the Squid reports can be viewed from the web.
From time to time it stops generating the reports, and to check what is going on I run the command below:

Code:
# sarg -x

Code:
SARG: searching for 'x20'
SARG: getword backtrace:
SARG: 1:sarg [0x4068e7]
SARG: 2:sarg [0x40711b]
SARG: 3:sarg [0x40d211]
SARG: 4:/lib64/libc.so.6(__libc_start_main+0xf4) [0x30d321d9c4]
SARG: 5:sarg [0x4028a9]
SARG: Maybe you have a broken user ID in your /var/log/squid/access.log file

I am using sarg version 2.3.4 on CentOS 5.10 (x86_64).

Every time this happens, I have to delete entries from the access.log file to get reports working again.
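Is there a way to locate the offending entry instead of wiping the whole file? I was thinking of a quick check like the one below (just a sketch on my side, assuming the default native access.log format with 10 space-separated fields), but I am not sure it catches every kind of broken user ID:

Code:
awk 'NF != 10 {print "line " NR ": " $0}' /var/log/squid/access.log | head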

Can you please help me with this?
SARG(1)                                                     SARG                                                     SARG(1)

NAME
       sarg - Squid Analysis Report Generator

SYNOPSIS
       sarg [options] [logfile...]

DESCRIPTION
       sarg is a log file parser and analyzer for the Squid Web Proxy Cache[1]. It allows you to view "where" your users
       are going to on the Internet. sarg generates reports in HTML with fields such as: users, IP addresses, bytes,
       sites, and times. These HTML files can appear in your web server's directory for browsing by users or
       administrators. You may also have sarg email the reports to the Squid cache administrator.

       sarg can read squid or Microsoft ISA access logs. Optionally, it can complement the reports with the log of a
       Squid filter/redirector such as squidGuard[2].
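       As a rough sketch of how such a setup is usually wired together, a minimal sarg.conf might contain entries like
       the following; the paths are placeholders for this example, and only a few common directives are shown:

           access_log /var/log/squid/access.log
           output_dir /var/www/html/squid-reports
           date_format e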
OPTIONS
       A summary of options is included below.

       -h
           Show summary of options.

       -a hostname|ip address
           Limits report to records containing the specified hostname/ip address.

       -b filename
           Enables UserAgent log and writes it to filename.

           Warning: This option is currently unused.

       -c filename
           Read filename for a list of the web hosts to exclude from the report. See the section called "HOST EXCLUSION
           FILE".

       --convert
           Convert a squid log file date/time field to a human-readable format. All the log files are read and output as
           one text on the standard output.

       --css
           Output, on the standard output, the internal css sarg inlines in the reports. You can redirect the output to a
           file of your choice and edit it. Then you can override the internal css with external_css_file in sarg.conf.
           Using an external css can reduce the size of the report file. If you are short on disk space, you may consider
           exporting the css as explained above.

       -d date
           Use date to restrict the report to some date range during log file processing. Format for date is
           dd/mm/yyyy-dd/mm/yyyy or a single date dd/mm/yyyy. Date ranges can also be specified as day-n, week-n, or
           month-n where n is the number of days, weeks or months to jump backward. Note that there are no spaces around
           the hyphen.

       -e email
           Sends report to email (stdout for console).

       -f filename
           Reads configuration from filename.

       -g e|u
           Sets date format in generated reports: e = Europe -> dd/mm/yy, u = USA -> mm/dd/yy.

       -i
           Generates reports by user and ip address.

           Note: This requires the report_type option in the config file to contain "users_sites".

       --keeplogs
           Don't delete any old report. It is equivalent to setting --lastlog 0 but is provided for convenience.

       -l filename
           Uses filename as the input log. This option can be repeated up to 255 times to read multiple files. If the
           files end with the extension .gz, .bz2 or .Z they are decompressed. If the file name is just -, the log file
           is read from standard input. In that case, it cannot be compressed. This option is kept for compatibility with
           older versions of sarg but, starting with sarg 2.3, the log files may be named on the command line without the
           -l option. It allows the use of wildcards on the command line. Make sure you don't exceed the limit of 255
           files.

       --lastlog n
           Limit the number of logs kept in the output directory to n. Any supernumerary report is deleted starting with
           the oldest report. The value of n must be positive or zero. A value of zero means no report should be deleted.

       -L filename
           Reads a proxy redirector log file such as one created by squidGuard or Rejik. If you use this option, you may
           want to configure redirector_log_format in sarg.conf to match the output format of your web content filtering
           program. This option can be repeated up to 64 times to read multiple files.

       -n
           Enables ip address resolution.

       -o dir
           Writes report in dir.

       -p
           Generates reports using ip address instead of userid.

       -P prefix, --splitprefix prefix
           This option must be used with --split. If it is provided, the input log is split among several files each
           containing one day. The name of the output files is made of the prefix and the date formatted as -YYYY-MM-DD.
           The output files are written in the output directory specified with -o or in the current directory.

       -s string
           Limits report to the site specified by string [eg. www.debian.org].

       --split
           Split the squid log file and output it as text on the standard output omitting the dates outside of the range
           specified by the -d parameter. If it is combined with --convert, the dates are also converted to a
           human-readable format. Combined with -P, the log is written in several files each containing one day of the
           original log.

       -t string
           Limits the records included in the report based on time-of-day. Format for string is HH:MM or HH:MM-HH:MM. The
           former reports only the requested time. The latter reports any entry falling within the requested range. This
           limit complements the limit imposed by option -d.

       -u user
           Limits reports to user activities.

       -w dir
           Store temporary files in dir. In fact, sarg stores its temporary files in the sarg subdirectory of dir. Be
           sure to set the HTML output directory to a place outside of the temporary directory or sarg may fail or delete
           the report when it completes its task.

       -x
           Writes debug messages to stdout.

       -z
           Writes process messages to stdout.
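       For illustration, a typical run combining several of the options above could look like the command below; the log
       and report paths are placeholders, not values required by sarg:

           sarg -l /var/log/squid/access.log -o /var/www/html/squid-reports -d day-1 -x

       Here -d day-1 restricts the report to roughly the last day (see -d above) and -x prints debug messages while the
       report is built.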
HOST EXCLUSION FILE
       Sarg can be told to exclude visited hosts from the report by providing it with a file containing one host to
       exclude per line. The "host" may be one of the following:

       o   a full host name,

       o   a host name starting with a wildcard (*) to match any prefix,

       o   a single ip address,

       o   a subnet noted a.b.c.d/e.

       Example 1. Example of a hosts exclusion file

           *.google.com
           10.0.0.0/8

       Sarg cannot exclude IPv6 addresses at the moment.
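       Assuming the exclusion list above has been saved to a file, it would be passed to sarg with -c; the file path
       below is only an example:

           sarg -c /usr/local/sarg/exclude_hosts -l /var/log/squid/access.log -o /var/www/html/squid-reports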
SEE ALSO
       squid(8)

AUTHORS
       This manual page was written by Luigi Gangitano gangitano@lugroma3.org, for the Debian GNU/Linux system (but may
       be used by others). Revised by Billy Newsom. Currently maintained by Frederic Marchal
       fmarchal@users.sourceforge.net.

AUTHORS
       Frederic Marchal <fmarchal@users.sourceforge.net>
           Docbook version of the manual page

       Billy Newsom
           Revision of the manual page

       Luigi Gangitano <gangitano@lugroma3.org>
           Author of the first manual page

COPYRIGHT
       Copyright (C) 2011 Frederic Marchal

NOTES
        1. Squid Web Proxy Cache
           http://www.squid-cache.org/

        2. squidGuard
           http://www.squidguard.org/

sarg                                                   25 Jan 2011                                                  SARG(1)