Posts Tagged ‘google’

Preparing your Linux to work with the Cloud providers – Installing the aws, gcloud, az, oc, cf CLI Cloud access command interfaces

Wednesday, October 10th, 2018


If you're a sysadmin / developer whose boss requires a migration of stored data, database structures or web objects to Amazon Web Services / Google Cloud, or you happen to be a DevOps engineer, you will certainly need to have at minimum the Amazon AWS and Google Cloud clients installed to do daily routines and script stuff when managing cloud resources, without having to use the Web GUI interface.

Here is how to install the aws, gcloud, oc, az and cf next to your kubernetes client (kubectl) on your Linux Desktop.
 

1. Install Google Cloud gcloud (to manage Google Cloud Platform resources and developer workflow)
 


Here are a few commands to run to install the gcloud, gcloud alpha, gcloud beta, gsutil, and bq commands to manage your Google Cloud from the CLI.

a.) On Debian / Ubuntu / Mint or any other deb based distro

# Create environment variable for correct distribution
export CLOUD_SDK_REPO="cloud-sdk-$(lsb_release -c -s)"

 

# Add the Cloud SDK distribution URI as a package source
# echo "deb http://packages.cloud.google.com/apt $CLOUD_SDK_REPO main" | sudo tee -a /etc/apt/sources.list.d/google-cloud-sdk.list

 

# Import the Google Cloud Platform public key
$ curl https://packages.cloud.google.com/apt/doc/apt-key.gpg | sudo apt-key add -

 

# Update the package list and install the Cloud SDK
$ sudo apt-get update && sudo apt-get install google-cloud-sdk


b) On CentOS, RHEL, Fedora Linux and other rpm based ones
 

$ sudo tee -a /etc/yum.repos.d/google-cloud-sdk.repo << EOM
[google-cloud-sdk]
name=Google Cloud SDK
baseurl=https://packages.cloud.google.com/yum/repos/cloud-sdk-el7-x86_64
enabled=1
gpgcheck=1
repo_gpgcheck=1
gpgkey=https://packages.cloud.google.com/yum/doc/yum-key.gpg
       https://packages.cloud.google.com/yum/doc/rpm-package-key.gpg
EOM

# yum install google-cloud-sdk

 

That's all. The text-mode client that talks to Google Cloud's API, gcloud, is now installed under
/usr/bin/gcloud
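A quick sanity check after installation: print the SDK version and run the interactive setup, which authenticates you and picks a default project (all standard gcloud subcommands; the project listing assumes you already have access to at least one project):

$ gcloud version
$ gcloud init
$ gcloud projects list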

Latest install instructions of Google Cloud SDK are here.


2. Install AWS Cloud command line interface tool for managing AWS (Amazon Web Services)
 


The AWS client depends on Python PIP, so before you proceed you will have to install the python-pip deb package. On Debian / Ubuntu Linux use apt:

 

# apt-get install --yes python-pip

 

It is also possible to install the newest version of PIP with get-pip.py, a tiny Python script referenced in Amazon's install docs:

 

# curl -O https://bootstrap.pypa.io/get-pip.py
# python get-pip.py --user

 

# pip install awscli --upgrade --user
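Since pip's --user scheme drops binaries into ~/.local/bin, you may need to add that directory to your PATH before the aws command is found. A minimal post-install check (aws configure will interactively ask for your access key, secret key, region and output format):

# export PATH=~/.local/bin:$PATH
# aws --version
# aws configure
# aws s3 ls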

 

3. Install Azure Cloud Console access CLI command interface
 


On Debian / Ubuntu or any other deb based distro:

# AZ_REPO=$(lsb_release -cs)
# echo "deb [arch=amd64] https://packages.microsoft.com/repos/azure-cli/ $AZ_REPO main" | \
sudo tee /etc/apt/sources.list.d/azure-cli.list

# curl -L https://packages.microsoft.com/keys/microsoft.asc | sudo apt-key add -
$ sudo apt-get update
$ sudo apt-get install apt-transport-https azure-cli

 

Finally, to check that Azure CLI is properly installed run a simple login with:

 

$ az login

 


On RHEL / CentOS / Fedora, import Microsoft's key, add the repository and install:

$ sudo rpm --import https://packages.microsoft.com/keys/microsoft.asc
$ sudo sh -c 'echo -e "[azure-cli]\nname=Azure CLI\nbaseurl=https://packages.microsoft.com/yumrepos/azure-cli\nenabled=1\ngpgcheck=1\ngpgkey=https://packages.microsoft.com/keys/microsoft.asc" > /etc/yum.repos.d/azure-cli.repo'
$ sudo yum install azure-cli

$ az login
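Whichever distro you are on, a couple of standard az subcommands serve as a quick post-login check (they only require a valid subscription):

$ az account show
$ az group list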


For the latest install instructions check Microsoft's Azure CLI documentation here

4. Install OpenShift OC CLI tool to access OpenShift Open Source Cloud

 


Even though OpenShift has its original Red Hat produced package binaries, if you're not on an RPM-based distro it is probably
best to install the official latest version from the openshift github repo.


As of the time of writing this article, this is done with:

 

# wget https://github.com/openshift/origin/releases/download/v1.5.1/openshift-origin-client-tools-v1.5.1-7b451fc-linux-64bit.tar.gz
# tar -xvf openshift-origin-client-tools-v1.5.1-7b451fc-linux-64bit.tar.gz

 

# mv openshift-origin-client-tools-v1.5.1-7b451fc-linux-64bit oc-tool

 

# cd oc-tool
# echo 'export PATH=$HOME/oc-tool:$PATH' >> ~/.bashrc

 

To test openshift, try to login to OpenShift cloud:

 

$ oc login
Server [https://localhost:8443]: https://128.XX.XX.XX:8443
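Once logged in, a couple of standard oc subcommands make for a quick sanity check of the client:

$ oc version
$ oc projects
$ oc get pods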


Latest install instructions on OC here

5. Install Cloud Foundry cf CLI Cloud access tool


a) On Debian / Ubuntu Linux based distributions, do run:

 

$ wget -q -O - https://packages.cloudfoundry.org/debian/cli.cloudfoundry.org.key | sudo apt-key add -
$ echo "deb https://packages.cloudfoundry.org/debian stable main" | sudo tee /etc/apt/sources.list.d/cloudfoundry-cli.list
$ sudo apt-get update
$ sudo apt-get install cf-cli

 

b) On Red Hat Enterprise Linux / CentOS and Fedora

 

$ sudo wget -O /etc/yum.repos.d/cloudfoundry-cli.repo https://packages.cloudfoundry.org/fedora/cloudfoundry-cli.repo
$ sudo yum install cf-cli
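With the binary installed, a good first test is logging in against your Cloud Foundry API endpoint and listing deployed applications (the API URL below is a placeholder for your own endpoint):

$ cf login -a https://api.example.com
$ cf apps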


For the latest install instructions on the cf cli check Cloud Foundry's install site

There are plenty of other Cloud providers, with their number growing exponentially, and most have their own custom CLI access tools, but as their use is not as common as the 5 mentioned above, I've omitted them. If you're interested to know the complete list of Cloud Providers providing Cloud Services check here.

6. Install Ruby GEMs RHC tools collection

If you have to work with Redhat Cloud Storage / OpenShift you will perhaps also want to install the RHC (Red Hat Cloud) tools collection.

Assuming that the Linux system is running an up-to-date version of the Ruby programming language, do run:
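To quickly confirm that ruby and its gem packaging tool are present beforehand:

# ruby -v
# gem -v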

 

 

root@jeremiah:~# gem install rhc
Fetching: net-ssh-5.0.2.gem (100%)
Successfully installed net-ssh-5.0.2
Fetching: net-ssh-gateway-2.0.0.gem (100%)
Successfully installed net-ssh-gateway-2.0.0
Fetching: net-ssh-multi-1.2.1.gem (100%)
Successfully installed net-ssh-multi-1.2.1
Fetching: minitar-0.7.gem (100%)
The `minitar` executable is no longer bundled with `minitar`. If you are
expecting this executable, make sure you also install `minitar-cli`.
Successfully installed minitar-0.7
Fetching: hashie-3.6.0.gem (100%)
Successfully installed hashie-3.6.0
Fetching: powerbar-1.0.18.gem (100%)
Successfully installed powerbar-1.0.18
Fetching: minitar-cli-0.7.gem (100%)
Successfully installed minitar-cli-0.7
Fetching: archive-tar-minitar-0.6.1.gem (100%)
'archive-tar-minitar' has been deprecated; just install 'minitar'.
Successfully installed archive-tar-minitar-0.6.1
Fetching: highline-1.6.21.gem (100%)
Successfully installed highline-1.6.21
Fetching: commander-4.2.1.gem (100%)
Successfully installed commander-4.2.1
Fetching: httpclient-2.6.0.1.gem (100%)
Successfully installed httpclient-2.6.0.1
Fetching: open4-1.3.4.gem (100%)
Successfully installed open4-1.3.4
Fetching: rhc-1.38.7.gem (100%)
===========================================================================

 

If this is your first time installing the RHC tools, please run 'rhc setup'

===========================================================================
Successfully installed rhc-1.38.7
Parsing documentation for net-ssh-5.0.2
Installing ri documentation for net-ssh-5.0.2
Parsing documentation for net-ssh-gateway-2.0.0
Installing ri documentation for net-ssh-gateway-2.0.0
Parsing documentation for net-ssh-multi-1.2.1
Installing ri documentation for net-ssh-multi-1.2.1
Parsing documentation for minitar-0.7
Installing ri documentation for minitar-0.7
Parsing documentation for hashie-3.6.0
Installing ri documentation for hashie-3.6.0
Parsing documentation for powerbar-1.0.18
Installing ri documentation for powerbar-1.0.18
Parsing documentation for minitar-cli-0.7
Installing ri documentation for minitar-cli-0.7
Parsing documentation for archive-tar-minitar-0.6.1
Installing ri documentation for archive-tar-minitar-0.6.1
Parsing documentation for highline-1.6.21
Installing ri documentation for highline-1.6.21
Parsing documentation for commander-4.2.1
Installing ri documentation for commander-4.2.1
Parsing documentation for httpclient-2.6.0.1
Installing ri documentation for httpclient-2.6.0.1
Parsing documentation for open4-1.3.4
Installing ri documentation for open4-1.3.4
Parsing documentation for rhc-1.38.7
Installing ri documentation for rhc-1.38.7
Done installing documentation for net-ssh, net-ssh-gateway, net-ssh-multi, minitar, hashie, powerbar, minitar-cli, archive-tar-minitar, highline, commander, httpclient, open4, rhc after 10 seconds
13 gems installed
root@jeremiah:~#

To start with rhc next do:
 

rhc setup
rhc app create my-app diy-0.1


and play with it to install software and create services on the Redhat cloud.

 

 

Conclusion

These are just a few of the numerous tools available and I definitely understand there is much more to be said on the topic.
If you can remember other tools or interesting cloud starting-up tips about stuff to do on a freshly installed Linux PC to make life easier as a Cloud / PaaS / SaaS / DevOps engineer, please drop a comment.

Adding RSS Feed to WordPress in conjunction with FeedBurner / WordPress add-to-any-subscribe plugin

Saturday, May 15th, 2010

I received a comment today from one of my blog readers saying that he likes my blog content but is looking for a way to subscribe to my blog.
Though I had a subscription button configured in my wordpress template of choice, the button is located in a place in the template that is absolutely unnoticeable (at the bottom of the page). This is, by the way, I believe the default behaviour when the default wordpress plugin is used.
Thus I decided to set a clear RSS Subscription button on my blog.
Though at first glimpse the task looked quite trivial, it happened to be way more complex!
I tried a number of things before I could succeed in adding an RSS button.
The most simple, though not really flexible, way was through:

WordPress's Widgets (using the RSS Widget).

This approach however has one major inconvenience.
Using the default wordpress RSS Widget you cannot configure the Widget to display 0 items of the feed.
In other words, you cannot set the sliding menu reading "How many items would you like to display?" to 0 in order to completely prevent any of the feed from showing up on the page.
If you leave it at one, the RSS icon of the RSS widget points directly to your blog instead of to the configured RSS feed.
So in practice, configuring it, especially in my case, rendered it completely useless.
The link produced by the default RSS widget for some weird reason doesn't include a link to where my feed is located: https://www.pc-freak.net/blog/feed/
In the meantime I looked online for something to facilitate me in completing the simple task of adding an RSS Subscribe Feed to my blog.
I asked for help in freenode's IRC network #wordpress channel, and a guy from there suggested that I go with the wp-o-matic wordpress plugin.
Anyway, it took me a few minutes to realize this plugin is suitable if you want to show other blogs' feeds on your blog, instead of adding an RSS feed link to your own blog. This wasn't my goal, so I skipped on to looking for something else to help me.
In the meantime I found the interesting feedburner google feed service, which is able to help in creating, publishing and distributing RSS & Atom feeds.
I recommend you check it out if you still haven't. It takes only a few clicks to register in order to use feedburner as a feed service.
Once you are set-up with Feedburner, you can activate all the cool functions, such as allowing your readers to subscribe via email, and also the Feedburner Flares.
Another nice thing about using feedburner is that it formats your feed content in a really pretty layout. By the way, since 2007 feedburner has been owned by Google. So in certain terms, using feedburner instead of the default wordpress blog feed will probably attract more google visitors to your blog and is generally good for your blog's placement in search engines.
For more of the advantages check feedburner's google service website.
However there is one major disadvantage in using feedburner's feed service: it robs you of control over your feed, since all feeds are generated and formatted by feedburner.
The newly created feedburner service for my blog is located at www.feeds.feedburner.com/WalkingInLightWithChrist-FaithComputingDiary
Now, to be able to use the newly installed feedburner service with my blog, I had to test a couple of plugins before I came to one that really worked.
I played with feedburner_feedsmith_plugin, feedburner-widget, feedlist.2.61, wp-keiths-easy-rss and rss-atom-avatar, but I couldn't make any of them work properly with feedburner. Some of the above-mentioned plugins were a real hell to configure, so I completely abandoned them seeing their inflexibility; others had been abandoned by their authors for more than 2 years, etc. Fundamentally none of them worked for me.
I finally was able to bring up the feedburner service on my wordpress using the FD Feedburner plugin.

All that's necessary to enable the plugin, after you download it into wordpress's wp-content/plugins directory, is to configure it from:
Plugins -> FeedBurner Configuration. Here is a screenshot of the FD Feedburner plugin configuration screen:

FD Feedburner plugin

As you can see in the screenshot, the plugin is really simple to configure. All you need to provide it with is the url given to you by feedburner right after you register your blog for the service.

Now, as your http://www.yourwebsite.com/blog/feed/ will be redirected to feedburner's generated feed for your website by the FD FeedBurner WordPress Plugin, all that's left to be done is to provide a link on your blog to your blog feed.

To accomplish this you will have to download the add-to-any-subscribe wordpress plugin.
Again, installation of add-to-any-subscribe is a piece of cake; to install, follow the install instructions here

Now hopefully your feedburner feed will be distributed to your visitors via the AddToAny subscription button on your blog.

Downloading your favourite flash video from Youtube with a simple command (youtube-dl)

Wednesday, April 13th, 2011

Watching videos on youtube has been the de-facto hype for about 2 years now.
Hardly a day passes without almost each one of us watching a dozen videos on Youtube.

Watching videos on youtube has become even more addictive for many than the early days of Internet Relay Chat (IRC) were.

As youtube is very accessible and a comparatively easy way to share, people share more and more by the day.
There is no question that the business idea of youtube is great, and youtube generates millions of dollars for Google day by day. However, I have a serious objection here! All is good; the only pitfall is that you don't own the youtube videos you watch!

Youtube's story is not that different from the story of the cloud computing threat to internet users' freedom.

The good thing here is that we are still not completely dependent on youtube, and there is still a way to retrieve your favourite youtube video and store it for later watching or distribution.

Probably the most famous browser plugin that allows file retrieval from youtube, as most people know, is DownloadHelper.

However, using DownloadHelper is browser dependent (you need to use the browser to save the video) and I don't find it to be the best way to download a youtube video.

Since the old days when I started using Linux, I've been quite addicted to doing as many things on my linux as possible from the command line (terminal / console) (CLI).

In that manner of thoughts, it was a real delight for me to find out that a group of free software developers have come up with a command line tool that allows downloads of youtube videos straight from the terminal. The great piece of software is called youtube-dl and at the moment of writing this post it is to be found at the URL address:

http://rg3.github.com/youtube-dl/

Youtube-dl is written in Python, so it requires the Python interpreter (version 2.5 or later) in order to run properly on Unix, Mac OS X or even on Windows!

The fact that it's written in python has made the little shiny tool quite multi-platform.
To start using the tool immediately on Debian or Ubuntu Linux you will have to install python (even though in most cases you will already have it installed):

1. To make sure you have python interpreter installed issue the cmd:

debian:~# apt-get install python
Building dependency tree
Reading state information... Done
python is already the newest version.
0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.

As you can see from the above apt-get output, I do have it installed, so nothing gets installed.

2. As a next step I used links to download the youtube-dl python script, like so:

debian:~# links https://github.com/rg3/youtube-dl/raw/2011.03.29/youtube-dl
Use the links interface to save youtube-dl; if it comes down gzipped as youtube-dl.gz, ungzip it:
debian:~# gzip -d youtube-dl.gz
debian:~# chmod +x youtube-dl

Now, to make it accessible system-wide, I have copied youtube-dl to /usr/local/bin. I selected /usr/local/bin as the location because it is predetermined to contain files which do not belong to a regular deb package.

3. Move youtube-dl to /usr/local/bin

debian:~# mv youtube-dl /usr/local/bin

4. Test the newly installed youtube-dl command line youtube retrieval tool:

debian:~# ./youtube-dl https://www.youtube.com/watch?v=g7tvI6JCXD0
[youtube] Setting language
[youtube] g7tvI6JCXD0: Downloading video webpage
[youtube] g7tvI6JCXD0: Downloading video info webpage
[youtube] g7tvI6JCXD0: Extracting video information
[download] Destination: g7tvI6JCXD0.flv
[download] 53.3% of 22.62M at 33.23k/s ETA 05:25
[download] 100.0% of 22.62M at 31.91k/s ETA 00:00

As you might have noticed from the above youtube-dl command output, the newly retrieved youtube file will be saved under the name g7tvI6JCXD0.flv

The link I passed to youtube-dl is taken directly from my browser and pasted into the console. Downloading the file from youtube took me about 10 minutes, but this is mostly because of some kind of youtube server speed restrictions …

In general, at least I have this video for later watching, so after a while I can watch it once again without losing a lot of time trying to remember what the video headline name was.

5. To use youtube-dl in a more advanced way you can for instance invoke the command with options like:

debian:~# ./youtube-dl -l -w -c https://www.youtube.com/watch?v=g7tvI6JCXD0
[youtube] Setting language
[youtube] g7tvI6JCXD0: Downloading video webpage
[youtube] g7tvI6JCXD0: Downloading video info webpage
[youtube] g7tvI6JCXD0: Extracting video information
[download] Destination: BSD is Dying, Jason Dixon, NYCBSDCon 2007-g7tvI6JCXD0.flv
[download] 4.4% of 22.62M at 1.43M/s ETA 00:15

As you can see, this time youtube-dl was even able to detect the video title and store the downloaded file on the computer with a correct name 😉

I would also recommend you to check out the youtube-dl help page; to do so use the command: youtube-dl --help
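For instance, assuming your youtube-dl build supports output templates, the -o option lets you choose the saved file name pattern yourself (a small sketch using the same sample video as above):

debian:~# youtube-dl -o '%(title)s.%(ext)s' https://www.youtube.com/watch?v=g7tvI6JCXD0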
 

Change default browser to Internet Explorer

Wednesday, September 18th, 2013

Almost no sane, security aware person still uses Internet Explorer. However, it is still heavily used in huge American companies. If you install Firefox or Google Chrome and by mistake change the default browser to one of them, it is worthwhile to revert the default browser back to Internet Explorer.

Here is how to do it:
Open Internet Explorer and navigate to:

Tools -> Internet Options -> Programs -> click on (Make Default)


Done
 

How to install Google Chrome web browser on Debian 7 Wheezy Linux

Wednesday, September 4th, 2013

I just installed Debian 7 Linux and wondered how to install the Google Chrome browser on Debian Wheezy. It took me a while to figure it out, as the direct download from Google (found after searching for Chrome Linux) had library requirements which are missing from the Debian 7 Wheezy repositories.
Here is how:

1. Add  Wheezy Backports and Google's Chrome Repository to /etc/apt/sources.list

echo 'deb http://ftp.debian.org/debian/ wheezy-backports main contrib non-free' >> /etc/apt/sources.list
echo 'deb http://dl.google.com/linux/chrome/deb/ stable main' >> /etc/apt/sources.list
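Note: apt will most likely refuse packages from Google's repository as untrusted unless you also import Google's Linux package signing key and refresh the package index first (the key URL below is Google's standard signing key):

wget -q -O - https://dl.google.com/linux/linux_signing_key.pub | apt-key add -
apt-get update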

2. Install Google Chrome with apt-get

Here you have a few options: install Google Chrome Beta (if you're an innovator), install unstable (if you prefer the latest functionality and don't count on stability) or install the stable version.

a) Install Google Chrome Beta

apt-get install --yes google-chrome-beta

b) Install Google Chrome Unstable

apt-get install --yes google-chrome-unstable

c) Install Google Chrome Stable

apt-get install --yes google-chrome-stable

I personally prefer to always keep stable, so I installed google-chrome-stable.

The only reason I need Google Chrome is for testing how websites look in it. Otherwise I don't recommend this browser to anyone who cares for his security. Obviously, as Chrome is a product of Google, it almost certainly keeps complete surveillance on what you do on the net.

That's all happy web development with Chrome on Debian 🙂
 

Linux: Generating Web statistics from Old Apache logs with Webalizer

Thursday, July 25th, 2013


It often happens that some old hosted websites were created in a way that no web statistics are available. Almost all websites created nowadays are already set to use Google Analytics. Anyhow, every now and then I stumble on hosting clients whose website creator didn't think about how to track how many hits or unique visitors the site gets in a month / year etc.
Thankfully this is solvable by the good "uncle" admin with the help of Webalizer (with a custom configuration) and a little bit of shell scripting.

The idea is simple: we take the old website logs located in, let's say,
/var/log/apache2/www.website-access.log*,
move the files to some custom created new directory, let's say /root/www.website/, and then configure webalizer to read and generate statistics based on the logs in there.

For the purpose, we have to have webalizer installed on Linux system. In my case this is Debian GNU / Linux.

For those who hear of Webalizer for the first time, here is a short package description:

debian:~# apt-cache show webalizer|grep -i description -A 2

Description-en: web server log analysis program
The Webalizer was designed to scan web server log files in various formats
and produce usage statistics in HTML format for viewing through a browser.

If webalizer is not yet installed, install it with:

debian:~# apt-get install --yes webalizer
...
.....

Then make a backup copy of the original / default webalizer.conf (a very important step, especially if the server is already processing Apache log files with some custom webalizer configuration):

debian:~# cp -rpf /etc/webalizer/webalizer.conf /etc/webalizer/webalizer.conf.orig

The next step is to copy webalizer.conf to a name reminiscent of the website whose logs will be processed, e.g.:

debian:~# cp -rpf /etc/webalizer/webalizer.conf /etc/webalizer/www.website-webalizer.conf

In the www.website-webalizer.conf config file it's necessary to edit at least 4 variables:

LogFile /var/log/apache2/access.log
OutputDir /var/www
#Incremental no
ReportTitle Usage statistics for

Make sure that after modifying them, the 4 vars read something like:
LogFile /root/www.website/access_log_merged_1.log
OutputDir /var/www/webalizer-www.website
Incremental yes
ReportTitle Usage statistics for Your-Website-Host-Name.com

Next create /root/www.website and /var/www/webalizer-www.website, then copy all the files you need to process from /var/log/apache2/www.website* to /root/www.website:

debian:~# mkdir -p /root/www.website /var/www/webalizer-www.website
debian:~# cp -rpf /var/log/apache2/www.website* /root/www.website

On Debian, Apache uses logrotate to archive old log files, so all logs except www.website-access.log and www.website-access.log.1 are gzipped:

debian:~#  cd /root/www.website
debian:~# ls 
www.website-access.log.10.gz
www.website-access.log.11.gz
www.website-access.log.12.gz
www.website-access.log.13.gz
www.website-access.log.14.gz
www.website-access.log.15.gz
www.website-access.log.16.gz
www.website-access.log.17.gz
www.website-access.log.18.gz
www.website-access.log.19.gz
www.website-access.log.20.gz
...
 

Then we have to un-gzip the zipped logs and create one merged file from all of them, ready to be read later by Webalizer. To do so I use a tiny shell script like so:

for n in {52..1}; do gzip -d www.website-access.log.$n.gz; done
for n in {52..1}; do cat www.website-access.log.$n >> access_log_merged_1.log; done

The first loop de-gzips the archives and the second one creates a merged file from all of them with the name access_log_merged_1.log. The range of log files in my case is from www.website-access.log.1 to www.website-access.log.52, thus the loops count backwards from 52 to 1, which keeps the merged file in chronological order (logrotate's highest-numbered file is the oldest).
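An alternative sketch that avoids decompressing the rotated logs in place is to stream them with zcat, appending the two uncompressed ones with plain cat at the end (same hypothetical www.website naming as above):

for n in {52..2}; do zcat www.website-access.log.$n.gz >> access_log_merged_1.log; done
cat www.website-access.log.1 www.website-access.log >> access_log_merged_1.log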

Once access_log_merged_1.log is ready, we can run webalizer to process the file (incrementally) and generate all-time statistics for www.website:

debian:~# webalizer -c /etc/webalizer/www.website-webalizer.conf

Webalizer V2.01-10 (Linux 2.6.32-27-server) locale: en_US.UTF-8
Using logfile /root/www.website/access_log_merged_1.log (clf)
Using default GeoIP database
Creating output in /var/www/webalizer-www.website
Hostname for reports is 'debian'
Reading history file… webalizer.hist
Reading previous run data.. webalizer.current
333474 records (333474 ignored) in 37.50 seconds, 8892/sec

To check out the just-generated statistics, open in a browser:

http://yourserverhost/webalizer-www.website/

or

http://IP_Address/webalizer-www.website

You should see the statistics pop up; below is a screenshot with my currently generated stats:

Webalizer website access statistics screenshot Debian GNU Linux

Checking port security on Linux with Nmap – Just another Nmap examples tutorial

Sunday, June 9th, 2013

Nmap (Network Mapper) is one of the most essential tools for checking server security. As a penetration testing instrument it is used both by SysAdmins / Crackers and by Security Specialists. It is also perfect for making periodic port audits and determining how well a server firewall is configured, or even while building one. Often with time firewall rules grow bigger and bigger, and as a consequence there is a risk of loopholes in the FW rules; routine nmap host checks (i.e. run as a cronjob, logging port status on the server) are IMHO a good preventive measure; a sample cronjob is sketched below.
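As an illustration, a minimal cronjob sketch for such a routine check (the target host and log file are placeholders you would adapt):

# crontab -e
0 2 * * * /usr/bin/nmap -sT your-server.com >> /var/log/nmap-port-audit.log 2>&1

This scans the common ports every night at 02:00 and appends the results to a log you can diff over time.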

I first got introduced to Nmap in the early days of my career as an IT geek and System Administrator, around the year 2000. Back then computer security and hacking culture were a common thing across IT geeks and people hanging out on IRC 😉 This article will not say much that is news for those accustomed to Nmap, but I hope it will be of use to people newly introduced to computer security.


1. Checking host status with Nmap (is the remote scanned host up?)

There are plenty of ways to check whether a remote host is reachable. Ping is the classic, but it is not always relevant as many network admins decide to filter ping for security reasons. Of course one can do manual try-outs with telnet on common service ports (Apache, Mail, Squid, MySQL etc. / 80, 25, 8080, 3306), or even write one's own program to do so, but that's worthless as Nmap is already there with options for this, and its report is relevant in about 90% of cases:

To check whether host is up with Nmap:

pcfreak:~# nmap -sP google.com

Starting Nmap 6.00 ( http://nmap.org ) at 2013-06-08 11:58 EEST
Nmap scan report for google.com (173.194.39.227)
Host is up (0.013s latency).
Other addresses for google.com (not scanned): 173.194.39.238 173.194.39.231 173.194.39.226 173.194.39.232 173.194.39.230 173.194.39.233 173.194.39.228 173.194.39.225 173.194.39.229 173.194.39.224
rDNS record for 173.194.39.227: sof01s02-in-f3.1e100.net
Nmap done: 1 IP address (1 host up) scanned in 0.74 seconds

2. Port map with Quick remote host (connect) scan

The most classical way of scanning, since the early days of computing, is to attempt connecting to remote host ports by opening a connection, creating a new TCP or UDP protocol socket with C's connect(); function. This is nmap's "default" way of scanning. It doesn't scan all possible 65535 ports when run with no extra arguments, but instead scans only the more popular, widespread ones.

noah:~# nmap -sT www.pc-freak.net

 

Starting Nmap 5.00 ( http://nmap.org ) at 2013-06-08 15:05 EEST
Stats: 0:00:01 elapsed; 0 hosts completed (1 up), 1 undergoing Connect Scan
Connect Scan Timing: About 2.00% done; ETC: 15:07 (0:01:38 remaining)
Stats: 0:00:02 elapsed; 0 hosts completed (1 up), 1 undergoing Connect Scan
Connect Scan Timing: About 24.40% done; ETC: 15:05 (0:00:09 remaining)
Stats: 0:00:03 elapsed; 0 hosts completed (1 up), 1 undergoing Connect Scan
Connect Scan Timing: About 77.25% done; ETC: 15:05 (0:00:01 remaining)
Interesting ports on www.pc-freak.net (83.228.93.76):
Not shown: 985 filtered ports
PORT     STATE  SERVICE
20/tcp   closed ftp-data
21/tcp   open   ftp
22/tcp   open   ssh
25/tcp   open   smtp
53/tcp   open   domain
80/tcp   open   http
110/tcp  open   pop3
143/tcp  open   imap
443/tcp  closed https
465/tcp  open   smtps
631/tcp  closed ipp
993/tcp  open   imaps
995/tcp  closed pop3s
8022/tcp open   unknown
9001/tcp open   tor-orport

Nmap done: 1 IP address (1 host up) scanned in 4.69 seconds
 

During a scan, pressing Enter prints statistics on what percentage of the scan is completed. In older Nmap releases this was not so; it is a very convenient feature, as some host scans (against specific firewalls) can run into anti port-scan rules making the scan time ultra sluggish. If this is the case nmap can be run in a different scan mode; I'm gonna say a few words on that later.

3. Nmap – Scanning only selected ports of interest and port ranges

a) Scanning only desired ports
When scanning a complete range of IPs from a class C or B network, it is handy to scan only the ports of interest, for example (Apache, SMTP, POP3, IMAP etc.).
Here is how to scan those 4:

noah:~# nmap -sT www.pc-freak.net -p 80,25,110,143

 

Starting Nmap 6.00 ( http://nmap.org ) at 2013-06-08 15:49 EEST
Stats: 0:00:00 elapsed; 0 hosts completed (0 up), 1 undergoing Ping Scan
Ping Scan Timing: About 100.00% done; ETC: 15:49 (0:00:00 remaining)
Stats: 0:00:00 elapsed; 0 hosts completed (1 up), 1 undergoing Connect Scan
Connect Scan Timing: About 100.00% done; ETC: 15:49 (0:00:00 remaining)
Nmap scan report for www.pc-freak.net (83.228.93.76)
Host is up (0.20s latency).
PORT    STATE SERVICE
25/tcp  open  smtp
80/tcp  open  http
110/tcp open  pop3
143/tcp open  imap

Nmap done: 1 IP address (1 host up) scanned in 1.00 seconds

A list of all common network services with their port numbers is located in /etc/services
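For instance, to quickly look up which service name corresponds to a given port:

$ grep -w 110/tcp /etc/services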

b) Scanning a port range

By default nmap does not scan all the ports in the low port range 1-1024. This port range is, according to RFC standards, reserved for standard, more often used and higher priority network services. Nmap's default scan does not cover all of the 1-1024 ports, and sometimes people prefer to run services on non-standard, obscure port numbers in this range. It is also common that "hacked" (cracked is the proper word here) hosts have a connect shell or connect-back shell service secretly installed on a port in this range. Thus it is worth scanning this port range on administrated servers, especially when there is suspicion of intrusion.

noah:~# nmap -sT www.pc-freak.net -p 1-1024

 

 

Starting Nmap 5.00 ( http://nmap.org ) at 2013-06-08 15:47 EEST
Stats: 0:00:04 elapsed; 0 hosts completed (1 up), 1 undergoing Connect Scan
Connect Scan Timing: About 77.44% done; ETC: 15:47 (0:00:01 remaining)
Stats: 0:00:04 elapsed; 0 hosts completed (1 up), 1 undergoing Connect Scan
Connect Scan Timing: About 84.86% done; ETC: 15:47 (0:00:01 remaining)
Interesting ports on www.pc-freak.net (83.228.93.76):
Not shown: 1011 filtered ports
PORT    STATE  SERVICE
20/tcp  closed ftp-data
21/tcp  open   ftp
22/tcp  open   ssh
25/tcp  open   smtp
53/tcp  open   domain
80/tcp  open   http
110/tcp open   pop3
143/tcp open   imap
443/tcp closed https
465/tcp open   smtps
631/tcp closed ipp
993/tcp open   imaps
995/tcp closed pop3s

4. Scanning all possible ports to make complete node port audit

As I said prior, with no extra port arguments nmap scans only a number of pre-selected high-use ports. However it is always nice to run a complete port scan. Doing a complete port scan on a host can reveal unusual ports opened by cracker backdoors or, on Windows, ports opened by viruses and trojans. As the complete number of possible remote ports to attempt to connect to is 65536, such a scan is much slower and can sometimes take literally "ages". Scanning all ports on my home router in a local 100 MBit network with my notebook takes about 23 minutes. On remote hosts it can take from 30 / 40 minutes to many hours, depending on the firewall type of the remote scanned host. Also, by scanning all ports there is a risk the remote host adds you to its FW reject rules, if it is running some kind of automated Intrusion Detection software (IDS) like Snort or AIDE.
To run a complete port scan with nmap:

noah:~# nmap -sT www.pc-freak.net -p 0-65535
 

Starting Nmap 6.00 ( http://nmap.org ) at 2013-06-08 22:28 EEST
Stats: 0:00:01 elapsed; 0 hosts completed (1 up), 1 undergoing Connect Scan
Connect Scan Timing: About 0.03% done
Stats: 0:00:01 elapsed; 0 hosts completed (1 up), 1 undergoing Connect Scan
Connect Scan Timing: About 0.05% done
Stats: 0:06:35 elapsed; 0 hosts completed (1 up), 1 undergoing Connect Scan
Connect Scan Timing: About 31.23% done; ETC: 22:50 (0:14:28 remaining)
Stats: 0:06:35 elapsed; 0 hosts completed (1 up), 1 undergoing Connect Scan
Connect Scan Timing: About 31.24% done; ETC: 22:50 (0:14:27 remaining)
Stats: 0:08:21 elapsed; 0 hosts completed (1 up), 1 undergoing Connect Scan
Connect Scan Timing: About 37.41% done; ETC: 22:51 (0:13:57 remaining)
Stats: 0:08:21 elapsed; 0 hosts completed (1 up), 1 undergoing Connect Scan
Connect Scan Timing: About 37.43% done; ETC: 22:51 (0:13:56 remaining)
Stats: 0:08:21 elapsed; 0 hosts completed (1 up), 1 undergoing Connect Scan
Connect Scan Timing: About 37.46% done; ETC: 22:51 (0:13:56 remaining)
Stats: 0:08:22 elapsed; 0 hosts completed (1 up), 1 undergoing Connect Scan
Connect Scan Timing: About 37.50% done; ETC: 22:51 (0:13:55 remaining)
Stats: 0:08:22 elapsed; 0 hosts completed (1 up), 1 undergoing Connect Scan
Connect Scan Timing: About 37.53% done; ETC: 22:51 (0:13:56 remaining)
Stats: 0:08:28 elapsed; 0 hosts completed (1 up), 1 undergoing Connect Scan
Connect Scan Timing: About 37.96% done; ETC: 22:51 (0:13:50 remaining)
Stats: 0:11:55 elapsed; 0 hosts completed (1 up), 1 undergoing Connect Scan
Connect Scan Timing: About 53.22% done; ETC: 22:51 (0:10:28 remaining)
Nmap scan report for www.pc-freak.net (83.228.93.76)
Host is up (0.0023s latency).
Not shown: 65518 filtered ports
PORT     STATE  SERVICE
20/tcp   closed ftp-data
21/tcp   open   ftp
22/tcp   open   ssh
25/tcp   open   smtp
53/tcp   open   domain
80/tcp   open   http
110/tcp  open   pop3
143/tcp  open   imap
443/tcp  closed https
465/tcp  open   smtps
631/tcp  closed ipp
993/tcp  open   imaps
995/tcp  closed pop3s
2060/tcp open   unknown
2070/tcp open   ah-esp-encap
2207/tcp closed unknown
8022/tcp open   oa-system
9001/tcp open   tor-orport

Nmap done: 1 IP address (1 host up) scanned in 1367.73 seconds

5. Scanning a network range of IPs with NMAP

It is a common thing to scan a network range in a class C network, especially as we admins usually have to administrate a number of hosts running in a local network:

 

noah:~# nmap -sP '192.168.0.*'

Starting Nmap 6.00 ( http://nmap.org ) at 2013-06-08 22:29 EEST
Stats: 0:00:01 elapsed; 0 hosts completed (0 up), 256 undergoing Ping Scan
Ping Scan Timing: About 0.98% done
Stats: 0:00:09 elapsed; 0 hosts completed (0 up), 256 undergoing Ping Scan
Parallel DNS resolution of 256 hosts. Timing: About 0.00% done
Nmap scan report for 192.168.0.16
Host is up (0.00029s latency).
Nmap done: 256 IP addresses (1 host up) scanned in 9.87 seconds

You can also scan a class C network with CIDR notation:

noah:~# nmap -sP 192.168.1.0/24

6. Obtaining network services version numbers

Nmap is capable of digging out version numbers of the remote running applications bound to ports. The option to try to guess the version number is -sV (Show Version).

noah:~# nmap -sV www.pc-freak.net

Starting Nmap 6.00 ( http://nmap.org ) at 2013-06-08 22:35 EEST
Stats: 0:00:05 elapsed; 0 hosts completed (1 up), 1 undergoing Connect Scan
Service scan Timing: About 90.91% done; ETC: 22:37 (0:00:09 remaining)
Nmap scan report for www.pc-freak.net (83.228.93.76)
Host is up (0.0083s latency).
Not shown: 985 filtered ports
PORT     STATE  SERVICE         VERSION
20/tcp   closed ftp-data
21/tcp   open   ftp             ProFTPD 1.3.3a
22/tcp   open   ssh             OpenSSH 5.5p1 Debian 6+squeeze3 (protocol 2.0)
25/tcp   open   smtp            qmail smtpd
53/tcp   open   domain?
80/tcp   open   http            Apache httpd
110/tcp  open   pop3            qmail pop3d
143/tcp  open   imap            Courier Imapd (released 2005)
443/tcp  closed https
465/tcp  open   ssl/smtp        qmail smtpd
631/tcp  closed ipp
993/tcp  open   tcpwrapped
995/tcp  closed pop3s
8022/tcp open   http            ShellInABox httpd
9001/tcp open   ssl/tor-orport?
Service Info: Host: mail.www.pc-freak.net; OSs: Unix, Linux; CPE: cpe:/o:linux:kernel

Service detection performed. Please report any incorrect results at http://nmap.org/submit/ .
Nmap done: 1 IP address (1 host up) scanned in 126.37 seconds

 

7. Checking remote server OS version

 noah:~# nmap -O www.pc-freak.net

 

Starting Nmap 6.00 ( http://nmap.org ) at 2013-06-08 22:42 EEST
Nmap scan report for www.pc-freak.net (83.228.93.76)
Host is up (0.0017s latency).
Not shown: 985 filtered ports
PORT     STATE  SERVICE
20/tcp   closed ftp-data
21/tcp   open   ftp
22/tcp   open   ssh
25/tcp   open   smtp
53/tcp   open   domain
80/tcp   open   http
110/tcp  open   pop3
143/tcp  open   imap
443/tcp  closed https
465/tcp  open   smtps
631/tcp  closed ipp
993/tcp  open   imaps
995/tcp  closed pop3s
8022/tcp open   oa-system
9001/tcp open   tor-orport
Device type: general purpose|broadband router|WAP|media device
Running (JUST GUESSING): Linux 2.6.X|2.4.X|3.X (94%), Gemtek embedded (89%), Siemens embedded (89%), Netgear embedded (88%), Western Digital embedded (88%), Comtrend embedded (88%)
OS CPE: cpe:/o:linux:kernel:2.6 cpe:/o:linux:kernel:2.4.20 cpe:/o:linux:kernel:3 cpe:/o:linux:kernel:2.4
Aggressive OS guesses: Linux 2.6.32 – 2.6.35 (94%), Vyatta 4.1.4 (Linux 2.6.24) (94%), Linux 2.6.32 (93%), Linux 2.6.17 – 2.6.36 (93%), Linux 2.6.19 – 2.6.35 (93%), Linux 2.6.30 (92%), Linux 2.6.35 (92%), Linux 2.4.20 (Red Hat 7.2) (92%), Linux 2.6.22 (91%), Gemtek P360 WAP or Siemens Gigaset SE515dsl wireless broadband router (89%)
No exact OS matches for host (test conditions non-ideal).

OS detection performed. Please report any incorrect results at http://nmap.org/submit/ .
Nmap done: 1 IP address (1 host up) scanned in 8.76 seconds

As you can see from the above output, the OS version guess is far from adequate, as my home router is running Debian Squeeze. However on some older Linux releases, where services return the OS version number, it reports properly.

8. Scanning silently with Nmap SYN (Stealth Scan)

As many servers run some kind of IDS that logs attempts to connect to multiple ports on the host and adds the scanning IP to a filtering CHAIN, it is generally a good idea to always scan with a SYN scan. A SYN scan is not a guarantee that the scanning attempt will not be captured by a well configured IDS, or by an admin sniffing the network with tcpdump, trafshow or iptraf, but the stealth scan is useful to keep an IDS from raising red flags.

noah:~# nmap -sS www.pc-freak.net

Starting Nmap 6.00 ( http://nmap.org ) at 2013-06-08 22:57 EEST
Nmap scan report for www.pc-freak.net (83.228.93.76)
Host is up (0.0075s latency).
Not shown: 985 filtered ports
PORT     STATE  SERVICE
20/tcp   closed ftp-data
21/tcp   open   ftp
22/tcp   open   ssh
25/tcp   open   smtp
53/tcp   open   domain
80/tcp   open   http
110/tcp  open   pop3
143/tcp  open   imap
443/tcp  closed https
465/tcp  open   smtps
631/tcp  closed ipp
993/tcp  open   imaps
995/tcp  closed pop3s
8022/tcp open   oa-system
9001/tcp open   tor-orport

Nmap done: 1 IP address (1 host up) scanned in 7.73 seconds

 

9. Nmap Scan Types (Paranoid | sneaky | polite | normal | insane)

Nmap has 6 modes of scanning. If no scan type is passed with the (-T) argument, it scans in normal mode. Paranoid and sneaky are the slowest, but least aggressive and less likely to be captured by automated firewall filtering rules or an IDS.

Insane mode is for people who want to scan as quickly as possible, not caring about the consequences. Usually when scanning your own hosts Insane is nice as it saves you time.

Paranoid scan is ultra slow, so in general such a scan is helpful if you're going to sleep and you want to scan, say, a competitor company's servers without being identified. A paranoid scan takes hours and, depending on where the remote scanned host is located, can sometimes take maybe 12 to 24 hours.
noah:~# nmap -T0 www.pc-freak.net

Starting Nmap 6.00 ( http://nmap.org ) at 2013-06-09 00:23 EEST
Stats: 0:15:00 elapsed; 0 hosts completed (1 up), 1 undergoing SYN Stealth Scan
SYN Stealth Scan Timing: About 0.05% done
Almost always -T3 or -T4 is reasonable.

10. Scanning hosts in verbose mode

pcfreak:~# nmap -vv localhost

Starting Nmap 5.00 ( http://nmap.org ) at 2013-06-09 01:14 EEST
NSE: Loaded 0 scripts for scanning.
Initiating SYN Stealth Scan at 01:14
Scanning localhost (127.0.0.1) [1000 ports]
Discovered open port 21/tcp on 127.0.0.1
Discovered open port 111/tcp on 127.0.0.1
Discovered open port 22/tcp on 127.0.0.1
Discovered open port 53/tcp on 127.0.0.1
Discovered open port 993/tcp on 127.0.0.1
Discovered open port 143/tcp on 127.0.0.1
Discovered open port 110/tcp on 127.0.0.1
Discovered open port 80/tcp on 127.0.0.1
Discovered open port 3306/tcp on 127.0.0.1
Discovered open port 25/tcp on 127.0.0.1
Discovered open port 783/tcp on 127.0.0.1
Discovered open port 8022/tcp on 127.0.0.1
Discovered open port 9001/tcp on 127.0.0.1
Discovered open port 465/tcp on 127.0.0.1
Completed SYN Stealth Scan at 01:14, 0.09s elapsed (1000 total ports)
Host localhost (127.0.0.1) is up (0.0000070s latency).
Scanned at 2013-06-09 01:14:27 EEST for 1s
Interesting ports on localhost (127.0.0.1):
Not shown: 986 closed ports
PORT     STATE SERVICE
21/tcp   open  ftp
22/tcp   open  ssh
25/tcp   open  smtp
53/tcp   open  domain
80/tcp   open  http
110/tcp  open  pop3
111/tcp  open  rpcbind
143/tcp  open  imap
465/tcp  open  smtps
783/tcp  open  spamassassin
993/tcp  open  imaps
3306/tcp open  mysql
8022/tcp open  unknown
9001/tcp open  tor-orport

Read data files from: /usr/share/nmap
Nmap done: 1 IP address (1 host up) scanned in 0.21 seconds
           Raw packets sent: 1000 (44.000KB) | Rcvd: 2014 (84.616KB)

 

11. Nmap typical scan arguments combinations

noah:~# nmap -sS -P0 -sV www.pc-freak.net

Stats: 0:01:46 elapsed; 0 hosts completed (1 up), 1 undergoing Service Scan
Service scan Timing: About 90.91% done; ETC: 01:22 (0:00:10 remaining)
Nmap scan report for www.pc-freak.net (83.228.93.76)
Host is up (0.0063s latency).
Not shown: 985 filtered ports
PORT     STATE  SERVICE         VERSION
20/tcp   closed ftp-data
21/tcp   open   ftp             ProFTPD 1.3.3a
22/tcp   open   ssh             OpenSSH 5.5p1 Debian 6+squeeze3 (protocol 2.0)
25/tcp   open   smtp            qmail smtpd
53/tcp   open   domain?
80/tcp   open   http            Apache httpd
110/tcp  open   pop3            qmail pop3d
143/tcp  open   imap            Courier Imapd (released 2005)
443/tcp  closed https
465/tcp  open   ssl/smtp        qmail smtpd
631/tcp  closed ipp
993/tcp  open   tcpwrapped
995/tcp  closed pop3s
8022/tcp open   http            ShellInABox httpd
9001/tcp open   ssl/tor-orport?
Service Info: Host: mail.www.pc-freak.net; OSs: Unix, Linux; CPE: cpe:/o:linux:kernel

Service detection performed. Please report any incorrect results at http://nmap.org/submit/ .
Nmap done: 1 IP address (1 host up) scanned in 106.23 seconds
 

12. Logging nmap output

Nmap can output logs in plain text (TXT), GNMAP and XML. I prefer logging to TXT, as plain text is always better (the normal text output flag is -oN):
noah:~# nmap www.pc-freak.net -oN nmap-log.txt

Starting Nmap 6.00 ( http://nmap.org ) at 2013-06-09 01:32 EEST
Stats: 0:00:01 elapsed; 0 hosts completed (1 up), 1 undergoing Connect Scan
Connect Scan Timing: About 4.60% done; ETC: 01:32 (0:00:21 remaining)
Nmap scan report for www.pc-freak.net (83.228.93.76)
Host is up (0.013s latency).
Not shown: 985 filtered ports
PORT     STATE  SERVICE
20/tcp   closed ftp-data
21/tcp   open   ftp
22/tcp   open   ssh
25/tcp   open   smtp
53/tcp   open   domain
80/tcp   open   http
110/tcp  open   pop3
143/tcp  open   imap
443/tcp  closed https
465/tcp  open   smtps
631/tcp  closed ipp
993/tcp  open   imaps
995/tcp  closed pop3s
3306/tcp closed mysql
8022/tcp open   oa-system

Nmap done: 1 IP address (1 host up) scanned in 5.23 seconds

Below is also a paste from the nmap man page (Examples section):

nmap -Pn -p80 -oX logs/pb-port80scan.xml -oG logs/pb-port80scan.gnmap 216.163.128.20/20

This scans 4096 IPs for any web servers (without pinging them) and saves the output in grepable and XML formats.

13. Other good Nmap scanning examples and arguments

One very useful Nmap option is:
-A – Enables OS detection, version detection, script scanning and traceroute

If you have a list of all the IPs administered by you and you would like to scan all of them:

noah:~# nmap -iL /root/scan_ip_addresses.txt

Another useful option is -sA (TCP ACK scan); it is a useful way to determine if a remote host is running some kind of stateful firewall. Instead of connecting to ports to check whether they are open, ACKs are sent.
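For example, against the same test host:

noah:~# nmap -sA www.pc-freak.net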

– Fast port Scan

noah:~# nmap -F www.pc-freak.net
...

-D argument (Decoy scanning)
Nmap has an option for simulating a port scan from multiple IPs, the so-called decoy scanning. Using decoys, one can hide the real IP address from which the Nmap scan is initiated.

# nmap -n -D192.168.1.5,10.5.1.2,172.1.2.4,3.4.2.1 192.168.1.5

– Scan firewall for security weaknesses

(TCP Null scan to fool the firewall into generating a response)
# nmap -sN 10.10.10.1

(TCP Fin scan to check firewall)

  # nmap -sF 10.10.10.1

(TCP Xmas scan to check firewall)

# nmap -sX 10.10.10.1

– Scan UDP ports

# nmap -sU hostname

– Scan a remote host without pinging it first (-P0)

noah:~# nmap -P0 www.pc-freak.net

Connect Scan Timing: About 96.20% done; ETC: 23:16 (0:00:00 remaining)
Nmap scan report for www.pc-freak.net (83.228.93.76)
Host is up (0.0099s latency).
Not shown: 985 filtered ports
PORT     STATE  SERVICE
20/tcp   closed ftp-data
21/tcp   open   ftp
22/tcp   open   ssh
25/tcp   open   smtp
53/tcp   open   domain
80/tcp   open   http
110/tcp  open   pop3
143/tcp  open   imap
443/tcp  closed https
465/tcp  open   smtps
631/tcp  closed ipp
993/tcp  open   imaps
995/tcp  closed pop3s
8022/tcp open   oa-system
9001/tcp open   tor-orport

Nmap done: 1 IP address (1 host up) scanned in 4.97 seconds

 

Linux: Fixing Qmail server qmail-smtpd port 25 slow (lagged) connect problem

Thursday, May 16th, 2013


After updating my Debian Squeeze to the latest stable packages from the repository with the standard:
# apt-get update && apt-get upgrade

I routinely checked if afterwards all was fine with Qmail, just to find out that connecting to port 25 was hell delayed: about 40-50 seconds before qmail responded with the standard assigned mail greeting.
I googled a long time to see if I could find a post or forum thread discussing this exact issue, but though I found similar discussions I didn't find anything that exactly matched the problem. Thus I decided to follow the good old experimental try / fail method to figure out what causes it.

Below are pastes from telnet, illustrating the delays in Qmail's SMTP greeting response:

# telnet localhost 25
Trying 127.0.0.1...
Connected to localhost.
Escape character is '^]'.

# telnet localhost 25
Trying 127.0.0.1...
Connected to localhost.
Escape character is '^]'.

# telnet localhost 25
Trying 127.0.0.1...
Connected to localhost.
Escape character is '^]'.

I spent about 2 hours checking Qmail for the standard, so-common errors that usually cause it not to work properly, following my previous article on testing qmail installation problems.

After going through all the possible causes, the only clue for problems was some slowness with spamassassin. This brought me to the idea that something was wrong with spamassassin. I tried disabling Spamassassin's Razor and Pyzor and restarting spamd (in my case done not via the standard start/stop debian script, but through daemontools with svc) and then qmailctl, i.e.:

# svc -d /service/spamd
# svc -u /service/spamd
# svc -a /service/spamd

qmailctl restart
* Stopping qmail-smtpdssl.
* Stopping qmail-smtpd.
* Sending qmail-send SIGTERM and restarting.
* Restarting qmail-smtpd.
* Restarting qmail-smtpdssl.
* Restarting qmail-pop3d.
This didn't help, so I continued trying to figure out what was wrong. One assumption for the slow qmail-smtpd response was of course slow DNS resolving. I checked /etc/resolv.conf and found the server configured to use the locally configured DJBDNS server as the first-line DNS resolver. I used djbdns as it is simple and easy to configure, however it is a bit obsolete, so it was a possible bottleneck. After commenting out the line to use localhost 127.0.0.1
and setting Google Public DNS 8.8.8.8 as primary DNS, the problem persisted, so host resolving was obviously not the problem.

I pondered for about 30 minutes, checking again all the logs and the machine's processes. Then I remembered that I had experienced similar issues before, caused by unresolvable RBL (blacklist) hosts. I checked the RBL hosts configured in the qmail-smtpd command line
(process list) and noticed the following 4 hosts:

# ps auxwwf

7190 ?        S      0:00 tcpserver -vR -l /var/qmail/control/me -c 30 -u 89 -g 89 -x /etc/tcp.smtp.cdb 0 25 rblsmtpd -t0 -r zen.spamhaus.org -r dnsbl.njabl.org -r dnsbl.sorbs.net -r bl.spamcop.net qmail-smtpd /var/qmail/control/me /home/vpopmail/bin/vchkpw /bin/true
 

I checked the hosts one by one and found out that the first two in the line no longer resolve (the blacklists are no longer accessible) as before:

 

zen.spamhaus.org, dnsbl.njabl.org
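A quick way to verify whether a given DNSBL is still alive is to query it for the conventional 127.0.0.2 test entry (which every working DNSBL is supposed to list, with the octets reversed in the query); a dead list simply won't resolve:

# host 2.0.0.127.zen.spamhaus.org
# host 2.0.0.127.dnsbl.njabl.org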

The DNSBLs (DNS blocklists) are configured on this host via /service/qmail-smtpd/run, hence to remove the two unresolvable hosts forcing the weird qmail-smtpd connect delay I had to modify in it:

RBL_BAD="zen.spamhaus.org dnsbl.njabl.org dnsbl.sorbs.net bl.spamcop.net"

to

RBL_BAD="dnsbl.sorbs.net bl.spamcop.net"

After a closer examination of the mail server config /var/qmail/control/spfrules, I found one other unresolvable SPF blacklist host configured:
# cat /var/qmail/control/spfrules
include:spf.trusted-forwarder.org

To remove that one I nulled the file:

# cat /dev/null > /var/qmail/control/spfrules

Finally, for all changes to take effect, I restarted Qmail:

# qmailctl restart
Restarting qmail:
* Stopping qmail-smtpdssl.
* Stopping qmail-smtpd.
* Sending qmail-send SIGTERM and restarting.
* Restarting qmail-smtpd.
* Restarting qmail-smtpdssl.
* Restarting qmail-pop3d.

To check all was fine afterwards, again used telnet:

# telnet localhost 25
Trying 127.0.0.1...
Connected to localhost.
Escape character is '^]'.
220 This is Mail Pc-Freak.NET ESMTP

The mail greeting now appears in about 2-3 seconds.