Posts Tagged ‘localhost’

Start Stop Restart Microsoft IIS Webserver from command line and GUI

Thursday, April 17th, 2014

Reading Time: 2 minutes

For a decommissioning project just recently I had the task to stop Microsoft IIS on a Windows Server system.
If you have been into security for a while you know well how many vulnerabilities Microsoft IIS (Internet Information Services) web server used to have. Nowadays things with IIS are better, but anyway it is better not to use it if possible …

No matter what the reason, if you need to make IIS stop serving web pages here is how to do it via the command line:

At Windows Command Prompt, type:

net stop WAS

If the command returns an error message, stop it by typing:

net stop W3SVC

Just in case you have to start it again run:

net start W3SVC

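As a side note, on most Windows Server releases IIS also ships the iisreset console utility, which (assuming it is present on your install) wraps the whole stop / start cycle into a single command:

iisreset /stop
iisreset /start
iisreset /restart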

For those who prefer to do it from a GUI interface, launch the services.msc command from Windows Run:

> services.msc

services-msc-stop-microsoft-iis-webserver

In the list of services look up
IIS Admin Service and HTTP SSL
a) Click each one with the right mouse button -> Properties
b) Set Startup type to Manual
c) Click the Stop button

You're done, IIS is now stopped. To make sure it is stopped you can run from cmd.exe:

telnet localhost 80

When it is not serving you should get 'Could not open connection to the host, on port 80: Connection failed'.
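If telnet is not installed (it often isn't by default on newer Windows versions), a quick alternative check is to query the service state directly with the standard sc tool (W3SVC being the default IIS web service name):

sc query W3SVC | findstr STATE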

How to Copy large data directories between 2 Linux / Unix servers without direct ssh / ftp access between server1 and server2, by using SSH, TAR and Unix pipes

Monday, April 27th, 2015

Reading Time: 3 minutes


In a Web application data migration project, I've come across a situation where I had to copy / transfer 500 Gigabytes of data from Linux server 1 (host A) to Linux server 2 (host B). However the two machines don't have direct access to each other (via port 22) for security reasons, and hence I cannot use sshfs to mount the remote host dir via ssh and copy the files as if they were local ones.

As this is a data migration project it was nevertheless necessary to find a way to migrate the data … The normal way companies do it is to copy the data to external hard disk storage and send it via some country postal service, or to send an employee to the data center to attach the SAN to the new server where the data is being migrated. However in my case this was not possible, so I had to do it differently.

I have access to both servers as they're situated in the same Corporate DMZ network and I can thus access both UNIX machines via SSH.

Thankfully there is a small hack, combining the SSH protocol, the TAR archiver and the default UNIX pipe capabilities, that makes it possible to easily transfer multiple (large) files and directories. The only requirement for this nice trick is to have an SSH client installed on the middle host, from which you can access via the SSH protocol both Server1 (from where data is migrated) and Server2 (to where data will be migrated).

If the hopping / jump server from which you're allowed to access the Linux servers Server1 and Server2 is not Linux and is missing an SSH client, and you don't have access on the Windows host to install anything on it, just use portable MobaXterm (as it has a Cygwin SSH client embedded).

Here is how:
 

jump-host:~$ ssh server1 "tar czf - /somedir/" | pv | ssh server2 "cd /somedir/; tar xzf -"


As you can see from the above command line example, an SSH connection is made to server1, tar is used to archive the directory / directories containing my hundreds of gigabytes, this is passed via the UNIX pipe mechanism to another ssh session opened to server2, and there tar is used a second time to unarchive the previously archived content. The pv command in the middle is not obligatory, though it is a nice way to monitor the status of the data transfer, like below:
 

500GB 0:00:01 [10,5MB/s] [===================================================>] 27%


P.S. If you don't have pv installed, install it either with apt-get on Debian:

 

debian:~# apt-get install --yes pv

 

Or on CentOS / Fedora / RHEL etc.

 

[root@centos ~]# yum -y install pv

 

Below is a small chunk of the pv manual to give you a better idea of what it does:

NAME
pv - monitor the progress of data through a pipe

SYNOPSIS
       pv [OPTION] [FILE]…
       pv [-h|-V]

DESCRIPTION
       pv  allows  a  user to see the progress of data through a pipeline, by giving information such as time elapsed, percentage
       completed (with progress bar), current throughput rate, total data transferred, and ETA.

       To use it, insert it in a pipeline between two processes, with the appropriate options.  Its standard input will be passed
       through to its standard output and progress will be shown on standard error.

       pv  will  copy  each  supplied FILE in turn to standard output (- means standard input), or if no FILEs are specified just
       standard input is copied. This is the same behaviour as cat(1).

       A simple example to watch how quickly a file is transferred using nc(1):

              pv file | nc -w 1 somewhere.com 3000

       A similar example, transferring a file from another process and passing the expected size to pv:

              cat file | pv -s 12345 | nc -w 1 somewhere.com 3000


Note that with very big file transfers using pv will slightly delay the data transfer, because everything has to pass through two more pipes; however, for file transfers of up to a few gigabytes it's really nice to include it.
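As the -s example from the manual excerpt above suggests, you can also give pv the expected transfer size so it can show a percentage and ETA. A rough sketch for the tar-over-ssh pipe (assuming GNU du on server1, and dropping tar's z compression flag so the byte count roughly matches what actually flows through the pipe):

jump-host:~$ SIZE=$(ssh server1 "du -sb /somedir/ | cut -f1")
jump-host:~$ ssh server1 "tar cf - /somedir/" | pv -s "$SIZE" | ssh server2 "cd /somedir/; tar xf -"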

If you only need to transfer a huge .tar.gz archive and you don't care about traffic security (i.e. you don't care whether the transferred traffic goes through an encrypted SSH tunnel, and you don't want to put the overhead of encrypting the data on both systems), and you have some unfiltered ports between host 1 and host 2, you can run netcat on host 2 to listen for connections and forward the .tar.gz content via netcat's port like so:
 

linux2:~$ nc -l -p 12345 > /path/destinationfile
linux1:~$ cat /path/sourcefile | nc desti.nation.ip.address 12345
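The same trick extends from a single file to whole directory trees if tar feeds the pipe on both ends; a minimal sketch with the same hypothetical port and address placeholders as above (traditional netcat -l -p syntax assumed; start the listener on linux2 first):

linux2:~$ nc -l -p 12345 | tar xzf -
linux1:~$ tar czf - /path/sourcedir | nc desti.nation.ip.address 12345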


Another way to transfer large data without a direct connection between server1 and server2, but with a connection from a third host PC to both, is to use rsync and good old SSH tunneling, like so:
 

jump-host:~$ ssh -R 2200:Linux-server1:22 root@Linux-server2 "rsync -e 'ssh -p 2200' --stats --progress -vaz root@localhost:/directory/to/copy /copy/destination/dir"

Secure your work PC internet traffic using SSH Dynamic Tunnel as Proxy to get around Corporate Spy Proxy and Site Filtering

Friday, March 20th, 2015

Reading Time: 4 minutes


If you work for some huge corporation such as IBM / Sony / Toshiba / Concentrix / HP etc., you're likely using a Windows work computer (notebook), pre-installed with custom company software which is by default configured to use a proxy server for all your browsing activities. At a certain point some of the websites you love to visit so much start being filtered because of corporate policies (limitations), and on the filtered sites you start getting empty pages or some nasty filtering messages.

Even if you don't get a filtering message, you know all your company internal network traffic is proxified; for the sake of keeping your personal privacy high, stop browsing through the company's default proxy, because all your access requests (passwords) and queries to the internet are probably logged for later review in case you ever fall under the company's paragraph of the "non-compliant employee".
If you fail to get around the default "Corporate Proxy" in time, sooner or later you will start getting filtering messages on some of the regular websites you use daily, as I did today while trying to open my personal blog (to check if there are new user comments):

Your request was denied because of its content categorization: "Hacking;Malicious Sources/Malnets;Religion"
For assistance, contact your network support team.


You see, these guys or their automated proxy filter became so prudent that my site was filtered because it contains some Proof of Concept (PoC) security tools and content related to the Christian (Faith) Religion. I guess it's time to think seriously about whether there is censorship in large corporations, how far such censorship could go, and whether, if censorship is so easily adopted in large companies, the same wouldn't also happen on a backbone ISP level in the near future.
If today my site is being filtered out and unable to open from a corporate network because it contains "Religious" content, I would not be surprised if tomorrow I were prohibited from confessing publicly my faith in the salvation power of the Cross of our Lord Jesus Christ, or were already on a blacklist because I'm trying to be a dedicated Orthodox Christian …
The fact that Religion is already perceived in the same light as Hacking and Malicious Sources / Malnet bots is also very eloquent, and shows how a very big part of people nowadays (including the person who added my site to these proxy filtering rules) think of religion, and in what a bad state our society and our understanding of freedom and respect for others have ended up.

Obviously it is time to react to this censorship and stop the evil corporation from spying on your traffic and logging everything that matches their "kilometer long" prohibited site filter lists. There are a few ways to do that, and the most straightforward is to set up and use your own proxy server, such as Privoxy / Polipo or Squid Proxy. However the proxy method requires that your company's local network doesn't have too strict (restrictive) firewall rules (e.g. you need some port opened to the Internet, such as 8080, 3128, 8118, or 1080, the standard (SOCKS) port, etc.).

Many companies are too restrictive in their outbound firewall rules, and you might be in a situation like mine, where browsers such as Internet Explorer / Opera / Firefox and Chrome are configured to use by default the company proxy host (autocache.proxy-ur-company.hp.com:80) (with a custom proxy PAC file filtering out whole ranges of useful domains and IPs), and the only firewall access allowed out of the local corporate network is on port 22 (for outside ssh session purposes).

In that case your best way across such a restrictive network configuration is to run your own home Linux / BSD / Windows server with opensshd installed, and use the OpenSSH protocol's Dynamic Tunneling (SOCKS5 proxy-like) capabilities to tunnel all your favourite web browser traffic (let's say Firefox's) through your remote-home-host.com:22.

 


In short, once you have installed plink.exe on your PC, run it manually from the command line (cmd.exe):

 

plink.exe -ssh UserName@remote-home-host.com -P 22 -pw Secret_Password -D 127.0.0.1:8080 -N


For people who use MobaXTerm it is even easier as there is an integrated SSH tunneling input interface which can be used to create the SSH tunnel.

To have a quick "Enable SSH Dynamic Tunnel" button on your Desktop, make a Shortcut to plink with the above command line set as its Target.


  • If from a Linux / *BSD / Mac OS host, to create a Dynamic SSH Tunnel to your remote home SSH server host run in a Terminal:
     

ssh -D 8080 Username@remote-home-host.com


To start tunneling all your web traffic via the just-created Dynamic SSH Tunnel to host remote-home-host.com, just set your browser's proxy options to use a SOCKS5 proxy at localhost:8080.


To test whether your traffic is going out to the Internet from remote-home-host.com, open www.myip.ru in the browser you just configured.
You should see your home SSH server's IP as the IP which made the request to www.myip.ru.
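If you'd rather verify the tunnel from the command line before touching browser settings, curl can speak SOCKS5 too; with --socks5-hostname even DNS lookups go through the tunnel (assuming a reasonably recent curl is available):

curl --socks5-hostname 127.0.0.1:8080 http://www.myip.ru/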

How to renew self signed QMAIL toaster and QMAIL rocks expired SSL pem certificate

Friday, September 2nd, 2011

Reading Time: 3 minutes


On one of the QMAIL servers, which I installed a very long time ago, I've been notified by clients that the certificate of the mail server has expired, and therefore I had to quickly renew the certificate.

On this qmail installation the SSL certificates were located in /var/qmail/control under the names servercert.key and servercert.pem.
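Before regenerating anything, you can confirm when the current certificate actually expires; openssl prints the validity end date (the path below matches this particular install, adjust it for yours):

debian:~# openssl x509 -noout -enddate -in /var/qmail/control/servercert.pem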

Renewing the certificates with a new self signed ones is pretty straight forward, to renew them I had to issue the following commands:

1. Generate a DES3-encrypted 1024-bit RSA server key

debian:~# cd /var/qmail/control
debian:/var/qmail/control# openssl genrsa -des3 -out servercert.key.enc 1024
Generating RSA private key, 1024 bit long modulus
...........++++++
.........++++++
e is 65537 (0x10001)
Enter pass phrase for servercert.key.enc:
Verifying - Enter pass phrase for servercert.key.enc:

At the Enter pass phrase for servercert.key.enc prompt I typed my key encryption password twice; any password will do, though using a stronger one here is better.

2. Generate the servercert.key file

debian:/var/qmail/control# openssl rsa -in servercert.key.enc -out servercert.key
Enter pass phrase for servercert.key.enc:
writing RSA key

3. Generate the certificate request

debian:/var/qmail/control# openssl req -new -key servercert.key -out servercert.csr
You are about to be asked to enter information that will be incorporated
into your certificate request.
What you are about to enter is what is called a Distinguished Name or a DN.
There are quite a few fields but you can leave some blank
For some fields there will be a default value,
If you enter '.', the field will be left blank.
-----
Country Name (2 letter code) [AU]:UK
State or Province Name (full name) [Some-State]:London
Locality Name (eg, city) []:London
Organization Name (eg, company) [Internet Widgits Pty Ltd]:My Company
Organizational Unit Name (eg, section) []:My Org
Common Name (eg, YOUR name) []:
Email Address []:admin@adminmail.com

Please enter the following 'extra' attributes
to be sent with your certificate request
A challenge password []:
An optional company name []:

In the above prompts it's necessary to fill in the company name and location, as each of the prompts clearly states.

4. Sign the just generated certificate request

debian:/var/qmail/control# openssl x509 -req -days 9999 -in servercert.csr -signkey servercert.key -out servercert.crt

Notice the option -days 9999; this option instructs the newly generated self-signed certificate to be valid for 9999 days, which is quite a long time. The reason the previously generated self-signed certificate expired was that it was built for only 365 days.
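One step the list above doesn't spell out is how servercert.pem itself gets refreshed from the new material; on qmail-toaster-style installs the pem is typically just the signed certificate with the key appended, something along the lines of (an assumption about this layout, verify against your own install before overwriting anything):

debian:/var/qmail/control# cat servercert.crt servercert.key > servercert.pem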

5. Fix the newly generated servercert.pem permissions

debian:~# cd /var/qmail/control
debian:/var/qmail/control# chmod 640 servercert.pem
debian:/var/qmail/control# chown vpopmail:vchkpw servercert.pem
debian:/var/qmail/control# cp -f servercert.pem clientcert.pem
debian:/var/qmail/control# chown root:qmail clientcert.pem
debian:/var/qmail/control# chmod 640 clientcert.pem

Finally to load the new certificate, restart of qmail is required:

6. Restart qmail server

debian:/var/qmail/control# qmailctl restart
Restarting qmail:
* Stopping qmail-smtpd.
* Sending qmail-send SIGTERM and restarting.
* Restarting qmail-smtpd.

Test the newly installed certificate

To test the newly installed SSL certificate use the following commands:

debian:~# openssl s_client -crlf -connect localhost:465 -quiet
depth=0 /C=UK/ST=London/L=London/O=My Org/OU=My Company/emailAddress=admin@adminmail.com
verify error:num=18:self signed certificate
verify return:1
...
debian:~# openssl s_client -starttls smtp -crlf -connect localhost:25 -quiet
depth=0 /C=UK/ST=London/L=London/O=My Org/OU=My Company/emailAddress=admin@adminmail.com
verify error:num=18:self signed certificate
verify return:1
250 AUTH LOGIN PLAIN CRAM-MD5
...

If an error is returned like 32943:error:140770FC:SSL routines:SSL23_GET_SERVER_HELLO:unknown protocol:s23_clnt.c:607: this means that SSL variable in the qmail-smtpdssl/run script is set to 0.

To solve this error, change SSL=0 to SSL=1 in /var/qmail/supervise/qmail-smtpdssl/run and do qmailctl restart

The verify error:num=18:self signed certificate / verify return:1 lines displayed are perfectly fine; they're more of a warning than an error, as they just report the certificate is self-signed.

MySQL: How to check user privileges and allowed hosts to connect with mysql cli

Wednesday, April 2nd, 2014

Reading Time: 3 minutes


On a project there were some issues with the root admin user being unable to access the MySQL server from a remote host. The most probable reason was that there is no access to the server from that host, thus it was necessary to check the mysql root user privileges and the hosts allowed to connect. Here is the SQL query to do it:
 

mysql> use mysql;
mysql> select * from `user` where user like 'root%';
+--------------------------------+------+-------------------------------------------+-------------+-------------+-------------+-------------+-------------+-----------+-------------+---------------+--------------+-----------+------------+-----------------+------------+------------+--------------+------------+-----------------------+------------------+--------------+-----------------+------------------+------------------+----------------+---------------------+--------------------+------------------+------------+--------------+----------+------------+-------------+--------------+---------------+-------------+-----------------+----------------------+
| Host                           | User | Password                                  | Select_priv | Insert_priv | Update_priv | Delete_priv | Create_priv | Drop_priv | Reload_priv | Shutdown_priv | Process_priv | File_priv | Grant_priv | References_priv | Index_priv | Alter_priv | Show_db_priv | Super_priv | Create_tmp_table_priv | Lock_tables_priv | Execute_priv | Repl_slave_priv | Repl_client_priv | Create_view_priv | Show_view_priv | Create_routine_priv | Alter_routine_priv | Create_user_priv | Event_priv | Trigger_priv | ssl_type | ssl_cipher | x509_issuer | x509_subject | max_questions | max_updates | max_connections | max_user_connections |
+--------------------------------+------+-------------------------------------------+-------------+-------------+-------------+-------------+-------------+-----------+-------------+---------------+--------------+-----------+------------+-----------------+------------+------------+--------------+------------+-----------------------+------------------+--------------+-----------------+------------------+------------------+----------------+---------------------+--------------------+------------------+------------+--------------+----------+------------+-------------+--------------+---------------+-------------+-----------------+----------------------+
| localhost                      | root | *5A07790DCF43AC89820F93CAF7B03DE3F43A10D9 | Y           | Y           | Y           | Y           | Y           | Y         | Y           | Y             | Y            | Y         | Y          | Y               | Y          | Y          | Y            | Y          | Y                     | Y                | Y            | Y               | Y                | Y                | Y              | Y                   | Y                  | Y                | Y          | Y            |          |            |             |              |             0 |           0 |               0 |                    0 |
| server737                        | root | *5A07790DCF43AC89820F93CAF7B03DE3F43A10D9 | Y           | Y           | Y           | Y           | Y           | Y         | Y           | Y             | Y            | Y         | Y          | Y               | Y          | Y          | Y            | Y          | Y                     | Y                | Y            | Y               | Y                | Y                | Y              | Y                   | Y                  | Y                | Y          | Y            |          |            |             |              |             0 |           0 |               0 |                    0 |
| 127.0.0.1                      | root | *5A07790DCF43AC89820F93CAF7B03DE3F43A10D9 | Y           | Y           | Y           | Y           | Y           | Y         | Y           | Y             | Y            | Y         | Y          | Y               | Y          | Y          | Y            | Y          | Y                     | Y                | Y            | Y               | Y                | Y                | Y              | Y                   | Y                  | Y                | Y          | Y            |          |            |             |              |             0 |           0 |               0 |                    0 |
| server737.server.myhost.net | root | *5A07790DCF43FC89820A93CAF7B03DE3F43A10D9 | Y           | Y           | Y           | Y           | Y           | Y         | Y           | Y             | Y            | Y         | Y          | Y               | Y          | Y          | Y            | Y          | Y                     | Y                | Y            | Y               | Y                | Y                | Y              | Y                   | Y                  | Y                | Y          | Y            |          |            |             |              |             0 |           0 |               0 |                    0 |
| server4586                        | root | *5A07790DCF43AC89820F93CAF7B03DE3F43A10D9 | N           | N           | N           | N           | N           | N         | N           | N             | N            | N         | N          | N               | N          | N          | N            | N          | N                     | N                | N            | N               | N                | N                | N              | N                   | N                  | N                | N          | N            |          |            |             |              |             0 |           0 |               0 |                    0 |
| server4586.myhost.net              | root | *5A07790DCF43AC89820F93CAF7B03DE3F43A10D9 | N           | N           | N           | N           | N           | N         | N           | N             | N            | N         | N          | N               | N          | N          | N            | N          | N                     | N                | N            | N               | N                | N                | N              | N                   | N                  | N                | N          | N            |          |            |             |              |             0 |           0 |               0 |                    0 |
+--------------------------------+------+-------------------------------------------+-------------+-------------+-------------+-------------+-------------+-----------+-------------+---------------+--------------+-----------+------------+-----------------+------------+------------+--------------+------------+-----------------------+------------------+--------------+-----------------+------------------+------------------+----------------+---------------------+--------------------+------------------+------------+--------------+----------+------------+-------------+--------------+---------------+-------------+-----------------+----------------------+
6 rows in set (0.00 sec)

mysql> exit


Here is the query explained:

select * from `user` where user like 'root%'; means:

select * – show all columns
from `user` – from the user table (of the mysql database, selected above with use mysql;)
where user like 'root%' – where the user column matches any string starting with 'root'
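Since the full SELECT * output is rather unwieldy, in practice a narrower query, or the vertical \G output terminator, shows the allowed hosts much more readably:

mysql> select user, host from mysql.user where user like 'root%';
mysql> select * from mysql.user where user like 'root%'\G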
 

‘host-name’ is blocked because of many connection errors; unblock with ‘mysqladmin flush-hosts’

Sunday, May 20th, 2012

Reading Time: 3 minutes

The MySQL server on my home-run machine was suddenly down as I tried to check my blog and other sites today. The error I saw while trying to open this blog, as well as the other hosted sites using MySQL, was:

Error establishing a database connection

The topology, where this error occured is simple, I have two hosts:

1. Apache version 2.0.64, compiled with support for external PHP script interpretation via libphp – this host runs FreeBSD

2. A Debian GNU / Linux squeeze running MySQL server version 5.1.61

The Apache host is assigned a local IP address 192.168.0.1 and the SQL server is running on a host with IP 192.168.0.2

To diagnose the error I've logged in to 192.168.0.2 and weirdly the mysql-server was appearing to run just fine:
 

debian:~# ps ax |grep -i mysql
31781 pts/0 S 0:00 /bin/sh /usr/bin/mysqld_safe
31940 pts/0 Sl 12:08 /usr/sbin/mysqld --basedir=/usr --datadir=/var/lib/mysql --user=mysql --pid-file=/var/run/mysqld/mysqld.pid --socket=/var/run/mysqld/mysqld.sock --port=3306
31941 pts/0 S 0:00 logger -t mysqld -p daemon.error
32292 pts/0 S+ 0:00 grep -i mysql

Moreover I could connect to the localhost SQL server with mysql -u root -p and it seemed to run fine. The error Error establishing a database connection meant that either something is messed up with the database or 192.168.0.2 Mysql port 3306 is not properly accessible.

My first guess was something is wrong due to some firewall rules, so I tried to connect from 192.168.0.1 to 192.168.0.2 with telnet:
 

freebsd# telnet 192.168.0.2 3306
Trying 192.168.0.2…
Connected to jericho.
Escape character is '^]'.
Host 'webserver' is blocked because of many connection errors; unblock with 'mysqladmin flush-hosts'
Connection closed by foreign host.

Right after the telnet was initiated as I show in the above output the connection was immediately closed with the error:

Host 'webserver' is blocked because of many connection errors; unblock with 'mysqladmin flush-hosts'
Connection closed by foreign host.

In the error, 'webserver' is the hostname set on my Apache machine. The error clearly states that the problems with the 'webserver' apache host being unable to connect to the SQL database are due to 'many connection errors', and a fix is suggested: mysqladmin flush-hosts.

To temporarily solve the error and restore normal connectivity between the Apache and the SQL server, I had to issue on the SQL host:

mysqladmin -u root -p flush-hosts
Enter password:

Though this temporary fix restored accessibility to the databases, and hence the website errors were resolved, it doesn't guarantee that I won't end up in the same situation in the future, therefore I looked for a permanent fix to the issue once and for all.

The permanent fix consists in changing the default value set for max_connect_errors in /etc/mysql/my.cnf, which by default is quite low (10 on older MySQL versions). Therefore, to raise the variable's value, I added in my.cnf in the [mysqld] config section:

debian:~# vim /etc/mysql/my.cnf
...
max_connect_errors=4294967295

and afterwards restarted MySQL:

debian:~# /etc/init.d/mysql restart
Stopping MySQL database server: mysqld.
Starting MySQL database server: mysqld.
Checking for corrupt, not cleanly closed and upgrade needing tables..
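To double check the new value is actually in effect after the restart, you can query it from the mysql prompt (and, as a bonus, the blocked-hosts cache can be flushed from SQL as well):

mysql> show variables like 'max_connect_errors';
mysql> flush hosts;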

To make sure the assigned max_connect_errors=4294967295 is never reached due to Apache-to-SQL connection errors, I've also added a cronjob:

debian:~# crontab -u root -e
00 03 * * * mysqladmin flush-hosts

In the cron job I have omitted the mysqladmin -u root -p (user / pass) input options, because for convenience I have already stored the mysql root password in /root/.my.cnf

Here is how /root/.my.cnf looks like:

debian:~# cat /root/.my.cnf
[client]
user=root
password=a_secret_sql_password

Now hopefully this will permanently solve SQL's 'failure to accept connections' due to too many connection errors for the future.

How to make a mirror of website on GNU / Linux with wget / Few tips on wget site mirroring

Wednesday, February 22nd, 2012

Reading Time: 4 minutes


Everyone who has used Linux is probably familiar with wget, and has used this handy console download tool at least a thousand times. Still, not so many desktop GNU / Linux users, like Ubuntu and Fedora Linux users, have tried using wget for something more than single file downloads.
Actually wget is not as popular as it used to be in the earlier Linux days. I've noticed the tendency of newer Linux users to prefer curl (I don't know why).

With all that said, I'm sure there are plenty of Linux users curious how a website mirror can be made with wget.
This article will briefly suggest a few ways to do website mirroring on Linux / BSD, as wget is available on both free operating systems.

1. Most Simple exact mirror copy of website

The most basic use of wget's mirror capabilities is via wget's -m / --mirror argument:

# wget -m http://website-to-mirror.com/sub-directory/

Creating a mirror like this is not very good practice, as the links of the mirrored pages will still point to the external URLs. In other words, the link URLs will not point to your local copy, and therefore if you're not connected to the internet and try to browse random links of the webpage, you will end up with many links which do not open because you don't have an internet connection.

2. Mirroring with rewriting of links to point to the local copy, and a delay between page downloads

Making a mirror with wget can put a heavy load on the remote server, as it fetches the files as quickly as the bandwidth allows. On busy servers rapid downloads with wget can significantly degrade the server's response time; on some highly loaded servers it can even cause the server to hang completely.
Hence mirroring pages with wget without explicitly setting a delay between each page download could be considered by the remote server a kind of DoS (denial of service) attack. Some site administrators have even set firewall rules, or configured web server modules like Apache's mod_security, which filter out IPs doing too frequent HTTP GET / POST requests to the web server.
To make wget wait 10 seconds between mirrored page downloads use:

# wget -mk -w 10 -np --random-wait http://website-to-mirror.com/sub-directory/

The -mk stands for -m / --mirror plus -k, the shortcut argument for --convert-links (make links point locally); --random-wait tells wget to vary the wait between requests between 0.5 and 1.5 times the -w value.

3. Mirror / retrieve website sub directory ignoring robots.txt "mirror restrictions"

Some websites have a robots.txt which restricts content download by clients like wget and curl, or even completely prohibits crawlers from downloading their website pages.

The /robots.txt restrictions are not a problem, as wget has an option to disable robots.txt checking when downloading.
Getting around the robots.txt restrictions with wget is possible through the -e robots=off option.
For instance, if you want to make a local mirror copy of a whole sub-directory with all links, with a delay of 10 seconds between each consecutive page request, without reading the robots.txt allow / forbid rules at all:

# wget -mk -w 10 -np -e robots=off --random-wait http://website-to-mirror.com/sub-directory/

4. Mirror a website which prohibits download managers like flashget, getright, go!zilla etc.

Sometimes when you try to use wget to make a mirror copy of an entire site domain subdirectory, or of the root site domain, you get an error similar to:

Sorry, but the download manager you are using to view this site is not supported.
We do not support use of such download managers as flashget, go!zilla, or getright

This message is produced by the site's dynamic generation language, PHP / ASP / JSP etc., as the website code is written to check the browser UserAgent being sent.
wget's default UserAgent sent to the remote webserver is:
Wget/1.11.4

As this is not a common desktop browser useragent, many webmasters configure their websites to only accept the well-known established desktop browser useragents sent by client browsers.
Here are a few typical user agents which identify a desktop browser:
 

  • Mozilla/5.0 (Windows NT 6.1; rv:6.0) Gecko/20110814 Firefox/6.0
  • Mozilla/5.0 (X11; Linux i686; rv:6.0) Gecko/20100101 Firefox/6.0
  • Mozilla/6.0 (Macintosh; I; Intel Mac OS X 11_7_9; de-LI; rv:1.9b4) Gecko/2012010317 Firefox/10.0a4
  • Mozilla/5.0 (Windows NT 6.1; Win64; x64; rv:2.2a1pre) Gecko/20110324 Firefox/4.2a1pre

etc. etc.

If you're trying to mirror a website which has implemented some kind of useragent restriction based on some "valid" useragent, wget has the -U option, enabling you to fake the useragent.

If you get the Sorry, but the download manager you are using to view this site is not supported message, fake / change wget's UserAgent with:

# wget -mk -w 10 -np -e robots=off \
--random-wait \
--referer="http://www.google.com" \
--user-agent="Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.8.1.6) Gecko/20070725 Firefox/2.0.0.6" \
--header="Accept:text/xml,application/xml,application/xhtml+xml,text/html;q=0.9,text/plain;q=0.8,image/png,*/*;q=0.5" \
--header="Accept-Language: en-us,en;q=0.5" \
--header="Accept-Encoding: gzip,deflate" \
--header="Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7" \
--header="Keep-Alive: 300" \
http://website-to-mirror.com/sub-directory/

For the sake of some wget anonymity – to make wget permanently hide its user agent and pretend to be a Mozilla Firefox running on MS Windows XP – use a .wgetrc like the below in your home directory.
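A minimal sketch of such a ~/.wgetrc; user_agent is the wgetrc equivalent of the -U command line flag, and the UA string here is just the Windows XP Firefox example from above:

user_agent = Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.8.1.6) Gecko/20070725 Firefox/2.0.0.6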

5. Make a complete mirror of a website under a domain name

To retrieve a complete working copy of a site with wget, a good way is like so:

# wget -rkpNl5 -w 10 --random-wait www.website-to-mirror.com

Where the arguments meaning is:
-r – Retrieve recursively
-k – Convert the links in documents to make them suitable for local viewing
-p – Download everything (inline images, sounds and referenced stylesheets etc.)
-N – Turn on time-stamping
-l5 – Specify recursion maximum depth level of 5

6. Make a static site mirror of dynamic pages, by converting CGI, ASP, PHP etc. to HTML for offline browsing

Often website pages end in .php / .asp / .cgi etc. extensions. An example of what I mean is, for instance, the URL http://php.net/manual/en/tutorial.php. You see the URL page is tutorial.php; once mirrored with wget, the local copy will also end in .php and will therefore not be suitable for local browsing, as the local browser does not know how to interpret the .php extension.
Therefore, to copy a website with non-HTML extensions and make it browsable offline as HTML, there is the --html-extension option, e.g.:

# wget -mk -w 10 -np -e robots=off \
--random-wait \
--html-extension http://www.website-to-mirror.com

A good practice in mirror making is to set a download rate limit. Setting such a rate is good for both sides, UP and DOWN (the local downloading host and the remote server). A download limit is also useful when mirroring websites consisting of many enormous files (documentary movies, some music etc.).
To set a download limit add the --limit-rate= option. Passing --limit-rate=200K to wget would limit the download speed to 200KB/s.

Another useful thing to assure wget has made an accurate mirror is wget logging. To use it pass -o ./my_mirror.log to wget.
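Putting the above tips together, a polite, logged, rate-limited mirror run could look something like this (the same placeholder domain as in the earlier examples; tune the numbers to your situation):

# wget -mk -w 10 -np -e robots=off --random-wait \
--limit-rate=200K -o ./my_mirror.log http://website-to-mirror.com/sub-directory/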
 

Create SSH Tunnel to MySQL server to access remote filtered MySQL port 3306 host through localhost port 3308

Friday, February 27th, 2015

Reading Time: 3 minutes

On our Debian / CentOS / Ubuntu Linux and Windows servers we're running multiple MySQL servers, and our customers sometimes need to access these servers.
This is usually a problem, because the MySQL DB servers are running in a DMZ zone with a strong firewall, and besides that, for security reasons the SQL servers are configured to listen only for connections coming from localhost. I mean, in the config files across our Debian Linux servers and CentOS / RHEL Linux machines (/etc/mysql/my.cnf and /etc/my.cnf) the setting for bind-address is 127.0.0.1:
 

[root@centos ~]# grep -i bind-address /etc/my.cnf 
bind-address            = 127.0.0.1
##bind-address  = 0.0.0.0


For source code developers, who access the development SQL servers only through a VPN-secured DMZ network, there are a few MySQL servers with remote access allowed from all hosts; e.g. on those I have configured:
 

[root@ubuntu-dev ~]# grep -i bind-address /etc/my.cnf 

bind-address  = 0.0.0.0


However, though clients insisted on having remote access to their MySQL databases, since this is pretty insecure we decided not to configure the MySQL servers to listen on all available IP addresses / network interfaces.
MySQL access is allowed only through PhpMyAdmin, accessible via the client's web interface, which on some servers is CPanel and on others Kloxo (an open source CPanel-like, very nice webhosting platform).

For some stubborn clients, who wanted mysql CLI and MySQL desktop client access to be able to easily analyze their databases with desktop clients such as MySQL Workbench, there is a "hackers-like" workaround: create and use an SSH tunnel to the SQL server from their local Windows PCs, using the standard OpenSSH Linux client from Cygwin, MobaXterm (which comes with the SSH client pre-installed and has an easy GUI interface to create SSH tunnels), or eventually Putty's plink (command line interface) to create the tunnel.

Anyway, the preferred and recommended (easiest) way to achieve a tunnel between MySQL and the local PC (no matter whether the client system is Windows or Linux) is to use the standard ssh client and the below command:
 

ssh -o ServerAliveInterval=10 -M -T -N -L 3308:localhost:3306 your-server.your-domain.com
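Once the tunnel is established, any local client just talks to port 3308 on 127.0.0.1 (use the IP rather than the word localhost, so the mysql CLI connects over TCP instead of the unix socket); for example, with a hypothetical database user:

mysql -h 127.0.0.1 -P 3308 -u your_db_user -p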


By default the SSH tunnel will be kept open for about 3 minutes and, if not used, it will automatically close. To get around this issue you might want to raise that (to, let's say, 15 minutes). To do so, the user has to add in the home directory, in:
 

~/.ssh/config

ServerAliveInterval 15
ServerAliveCountMax 4


Note that sometimes, even though the ssh tunnel timeout value is raised, it might not take effect if there is some NAT (Network Address Translation) with a low timeout setting on the firewall level. If you face constant SSH tunnel timeouts, you can use the few lines of bash below to auto-respawn the SSH tunnel connection (Windows users can use MobaXterm or install the cygwin bash shell package in advance):
 

while true
do
  ssh -o ServerAliveInterval=10 -M -T -N -L 3308:localhost:3306 your-server.your-domain.com
  sleep 15
done


Below is a MySQL Workbench screenshot, connected through the server where this blog is located, after establishing an ssh tunnel to the remote mysql server via port 3308 on localhost:

[Screenshot: MySQL Workbench, database analysis and management GUI tool, connected over the tunnel]

There is also another, alternative way to access remote firewall-filtered mysql servers without running complex commands to create a tunnel, which we recommend for clients (sql developers / sql designers): using HeidiSQL (a useful tool for webdevelopers who have to deal with MySQL and MSSQL hosted DBs).


To connect to the remote MySQL server through a tunnel using Heidi:

[Screenshot: HeidiSQL MySQL connection configuration]

 

In the ‘Settings’ tab:

1. In the dropdown list of ‘Network type’, select SSH tunnel

2. Hostname/IP: localhost (even though you are connecting remotely)

3. Username & Password: your MySQL user and password

Next, in the SSH tunnel tab:

1. Specify the path to plink.exe (or download it first and then point to where it's located)

2. Host + port: the remote IP of your SSH server (which should be the MySQL server as well); port 22 if you haven't changed anything

3. Username & password: your SSH username (not the MySQL user)

 

[Screenshot: HeidiSQL SSH tunnel configuration]