Hi all, I'm running BIND 9.1.3 and the accompanying `dig` and `nslookup` on a vanilla Red Hat 7.2 Linux box.
I've produced all of my zone data and config files (I used h2n with some manual tweaks to do this, as some >= v8.2 BIND features aren't properly supported as far as I can see).
In my... (6 Replies)
First I would like to thank you for your time in running a great Forum!
Background - Windows/ASP/VB COM/SQL Server programmer/Webmaster.
Desire - To build a similar skill set on UNIX. I am looking at learning Perl or Python (maybe Jython, due to its connection to Java). I have a brief background... (3 Replies)
When I use the Linux dig command, such as `dig yahoo.com`, it resolves,
but when I use the same command as root it gives me the error "Segmentation fault".
Please advise; I am completely baffled. (1 Reply)
all,
I am a newbie to DNS/BIND. Any help is much appreciated.
I am using the dig command to view the records in the config. I am expecting the following commands to display all the A (address) records in the zone data file.
my zone data file looks like this
-------------------
$ORIGIN .
$TTL... (2 Replies)
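One way to list every A record in a zone is a zone transfer (AXFR) filtered down with awk; a hedged sketch, assuming the server allows transfers to your host (the live command is shown as a comment since it needs network access, and the canned lines below stand in for its answer section):

```shell
# Live form (server address and zone name are placeholders):
#   dig @127.0.0.1 example.com AXFR +noall +answer | awk '$4 == "A"'
# Same filter demonstrated on canned answer-section lines:
printf 'www.example.com. 3600 IN A 192.0.2.10\nmail.example.com. 3600 IN MX 10 mail.example.com.\n' |
awk '$4 == "A" { print $1, $5 }'
```

Field 4 of each answer line is the record type, so only A records survive the filter.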
Hi Guys,
I just need confirmation of whether what I think I know is right.
dig yahoo.com
; <<>> DiG 9.7.0-P1 <<>> yahoo.com
;; global options: +cmd
;; Got answer:
;; ->>HEADER<<- opcode: QUERY, status: NOERROR, id: 27410
;; flags: qr rd ra; QUERY: 1, ANSWER: 5, AUTHORITY: 0, ADDITIONAL: 0
... (1 Reply)
Can I use two different DNS servers in the one command, in the form of primary and secondary?
Take this for example:
dig @<primaryAddress> @<secondaryAddress> MX domain.tld
So if primary address is down, it will use the secondary address as a backup. It seems to work when testing, but thought... (1 Reply)
Hi,
I have these entries in /etc/resolv.conf:
------------
domain xxxxxx
search yyyyyy
nameserver 127.0.0.1
nameserver aaaaaaaaaaaaaaaa
nameserver bbbbbbbbbbbbbbbb
-------------
When I use 'dig' or 'nslookup' command, like 'dig yahoo.com' it uses the localhost as the server.
I... (2 Replies)
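dig with no explicit server tries the nameserver lines in /etc/resolv.conf in order, so the local server on 127.0.0.1 answers first here; to query a different server, name it with @, e.g. `dig @192.0.2.2 yahoo.com` (address invented). A small sketch picking out the default server from a canned copy of the file above:

```shell
# The first nameserver line is the one dig tries first:
printf 'domain example.test\nnameserver 127.0.0.1\nnameserver 192.0.2.2\n' |
awk '$1 == "nameserver" { print $2; exit }'
```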
I'm using a .txt file filled with domain names for dig to use. The problem is that the results show the query time for each individual query, but I want to know how long all the queries took in total. How can I achieve this? Any help would be greatly appreciated, thank you.... (3 Replies)
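dig prints a `;; Query time: N msec` line for each query, so one hedged approach is to sum those with awk, e.g. `dig -f domains.txt +stats | awk '/Query time:/ { t += $4 } END { print t " msec total" }'` (the filename is an assumption, not from the post). Demonstrated here on canned stats lines, since the live pipeline needs network access:

```shell
# $4 is the millisecond count on each ';; Query time: N msec' line:
printf ';; Query time: 23 msec\n;; Query time: 41 msec\n' |
awk '/Query time:/ { total += $4 } END { print total " msec total" }'
```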
Experts - I was hoping someone could help me out with the logic on this perl script.
I'm trying to run some dig commands and parse in such a way as to group them together.
Here's what I have so far.
#!/usr/bin/perl
system("clear");
my @host = qw/yahoo.com
google.com
/;
foreach... (2 Replies)
Discussion started by: timj123
2 Replies
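A shell sketch of the grouping the script seems to be after: a header line per host, with its answers indented beneath it. The hostnames come from the post; `dig_short` is a made-up stand-in for `dig +short "$h" A`, with invented sample addresses, so the example runs without network access:

```shell
dig_short() {
    # Stand-in for: dig +short "$1" A  (sample answers are invented)
    case "$1" in
        yahoo.com)  printf '192.0.2.10\n' ;;
        google.com) printf '192.0.2.20\n' ;;
    esac
}

for h in yahoo.com google.com; do
    echo "== $h =="                      # group header per host
    dig_short "$h" | sed 's/^/    /'     # indent the answers under it
done
```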
Net::Google::Code::Issue(3pm) User Contributed Perl Documentation Net::Google::Code::Issue(3pm)NAME
Net::Google::Code::Issue - Google Code Issue
SYNOPSIS
use Net::Google::Code::Issue;
my $issue = Net::Google::Code::Issue->new( project => 'net-google-code' );
$issue->load(42);
DESCRIPTION
ATTRIBUTES
project
project name
email, password
user's email and password
id
status
owner
reporter
reported
merged
stars
closed
cc
summary
description
labels
comments
attachments
INTERFACE
load
parse
updated
the last comment's date.
create comment, summary, status, owner, cc, labels, files.
update comment, summary, status, owner, merge_into, cc, labels, blocked_on, files.
list( q => '', can => '', author => '', id => '', label => '', max_results => '', owner => '', published_min => '', published_max => '',
updated_min => '', updated_max => '', start_index => '' )
google's api way to get/search issues
return a list of loaded issues in list context, a ref to the list otherwise.
load_comments
google's api way to get and load comments (no scraping is done here)
parse_hybrid
when $USE_HYBRID is true, we will try to load the issue with google's official api, but as the api is not complete, we still need to do
scraping to load some things (e.g. attachments); this method is used to do that.
AUTHOR
sunnavy "<sunnavy@bestpractical.com>"
LICENCE AND COPYRIGHT
Copyright 2008-2010 Best Practical Solutions.
This program is free software; you can redistribute it and/or modify it under the same terms as Perl itself.
perl v5.10.1 2010-04-28 Net::Google::Code::Issue(3pm)