Posts Tagged ‘www’

Create and Configure SSL bundle file for GoGetSSL issued certificate in Apache Webserver on Linux

Saturday, November 3rd, 2018

gogetssl-install-certificate-on-linux-howto-sslcertificatechainfile-obsolete

I had a small task to configure a new WildCard SSL certificate for domains on a Debian GNU / Linux Jessie server running Apache 2.4.25.

The official documentation on how to install the SSL certificate on Linux given by GoGetSSL (the certificate itself is issued by COMODO) was obsolete as of the time of writing this article and suggested the following install instructions:
 

SSLEngine on
SSLCertificateKeyFile /etc/ssl/ssl.key/server.key
SSLCertificateFile /etc/ssl/ssl.crt/yourDomainName.crt
SSLCertificateChainFile /etc/ssl/ssl.crt/yourDomainName.ca-bundle


Adding such a configuration to the domain's Vhost and testing with apache2ctl spits out an error like:

 

root@webserver:~# apache2ctl configtest
AH02559: The SSLCertificateChainFile directive (/etc/apache2/sites-enabled/the-domain-name-ssl.conf:17) is deprecated, SSLCertificateFile should be used instead
Syntax OK

 


To make the issued GoGetSSL certificate work with Debian Linux, here are the few things done:

The files issued by Gogetssl.COM were the following:

 

AddTrust_External_CA_Root.crt
COMODO_RSA_Certification_Authority.crt
the-domain-name.crt


The webserver already had SSL support via the mod_ssl Apache module, e.g.:

 

root@webserver:~# ls -al /etc/apache2/mods-available/*ssl*
-rw-r--r-- 1 root root 3112 окт 21  2017 /etc/apache2/mods-available/ssl.conf
-rw-r--r-- 1 root root   97 сеп 19  2017 /etc/apache2/mods-available/ssl.load
root@webserver:~# ls -al /etc/apache2/mods-enabled/*ssl*
lrwxrwxrwx 1 root root 26 окт 19  2017 /etc/apache2/mods-enabled/ssl.conf -> ../mods-available/ssl.conf
lrwxrwxrwx 1 root root 26 окт 19  2017 /etc/apache2/mods-enabled/ssl.load -> ../mods-available/ssl.load


For those who don't have mod_ssl enabled, to enable it quickly run:

 

# a2enmod ssl
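
Once enabled, restart Apache so the module gets loaded:

# /etc/init.d/apache2 restart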


The VirtualHost used for the domains had an Apache config as below:

NameVirtualHost *:443

<VirtualHost *:443>
    ServerAdmin support@the-domain-name.com
    ServerName the-domain-name.com
    ServerAlias *.the-domain-name.com the-domain-name.com

    DocumentRoot /home/the-domain-namecom/www
    SSLEngine On
#    <Directory />
#        Options FollowSymLinks
#        AllowOverride None
#    </Directory>
    <Directory /home/the-domain-namecom/www>
        Options Indexes FollowSymLinks MultiViews
        AllowOverride None
        Include /home/the-domain-namecom/www/htaccess_new.txt
        Order allow,deny
        allow from all
    </Directory>

    ScriptAlias /cgi-bin/ /usr/lib/cgi-bin/
    <Directory "/usr/lib/cgi-bin">
        AllowOverride None
        Options +ExecCGI -MultiViews +SymLinksIfOwnerMatch
        Order allow,deny
        Allow from all
    </Directory>

    ErrorLog ${APACHE_LOG_DIR}/error.log

    # Possible values include: debug, info, notice, warn, error, crit,
    # alert, emerg.
    LogLevel warn

    CustomLog ${APACHE_LOG_DIR}/access.log combined

#    Alias /doc/ "/usr/share/doc/"
#   <Directory "/usr/share/doc/">
#       Options Indexes MultiViews FollowSymLinks
#       AllowOverride None
#       Order deny,allow
#       Deny from all
#       Allow from 127.0.0.0/255.0.0.0 ::1/128
#   </Directory>
SSLCertificateKeyFile /etc/apache2/ssl/the-domain-name.com.key
SSLCertificateFile /etc/apache2/ssl/chain.crt
</VirtualHost>

The config directives that actually enable and make the SSL work are:
 

SSLEngine On
SSLCertificateKeyFile /etc/apache2/ssl/the-domain-name.com.key
SSLCertificateFile /etc/apache2/ssl/chain.crt

 

The chain.crt file is actually a bundle containing the GoGetSSL CA_ROOT and RSA_Certification_Authority certificate files. To prepare that file, I've used the small bundle.sh script found on serverfault.com (I've made a mirror of bundle.sh on www.pc-freak.net).

To prepare the chain.crt  bundle, I ran:

 

sh create-ssl-bundle.sh the-domain-name.crt > chain.crt
sh create-ssl-bundle.sh COMODO_RSA_Certification_Authority.crt >> chain.crt
sh create-ssl-bundle.sh AddTrust_External_CA_Root.crt >> chain.crt
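
For the record, the same chain.crt bundle can also be produced with plain cat, concatenating the certificates leaf-first (server certificate, then intermediate, then root):

cat the-domain-name.crt COMODO_RSA_Certification_Authority.crt AddTrust_External_CA_Root.crt > chain.crt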


Then I copied the file to /etc/apache2/ssl together with the the-domain-name.com.key file generated earlier with the openssl command, as explained in my article on how to install a RapidSSL certificate on Linux.

/etc/apache2/ssl did not previously exist (on Debian Linux), so to create it:

 

root@webserver:~# mkdir /etc/apache2/ssl
root@webserver:~# ls -al /etc/apache2/ssl/chain.crt
-rw-r--r-- 1 root root 20641 Nov  2 12:27 /etc/apache2/ssl/chain.crt
root@webserver:~# ls -al /etc/apache2/ssl/the-domain-name.com.key
-rw-r--r-- 1 root root 6352 Nov  2 20:35 /etc/apache2/ssl/the-domain-name.com.key

 

As I needed to add the SSL HTTPS configuration for multiple domains, I further wrote and used a tiny shell script, add_new_vhost.sh, which accepts as an argument the domain name I want to add. The script works with a sample Skele (Template) file, which is included in the script itself and can be easily modified for the desired vhost config.
To add my multiple domains, I've used the script as follows:
 

sh add_new_vhost.sh add-new-site-domain.com
sh add_new_vhost.sh add-new-site-domain1.com


etc.

Here is the complete script as well:

 

#!/bin/bash
# Shell script to easily add new domains for virtual hosting on Debian machines
# arg1 should be a domain name
# This script takes the domain name which you type as arg1 and uses it to create
# a Docroot / cgi-bin directory for the domain and a separate site apache log directory,
# then takes a skele.com template file and substitutes skele.com with your domain name and directories
# This script's aim is to easily enable a sysadmin to add new domains in Debian
sites_base_dir=/var/www/jail/home/www-data/sites/;
# the directory where the skele.com file is
skele_dir=/etc/apache2/sites-available;
# base directory where site log dir to be created
cr_sep_log_file_d=/var/log/apache2/sites;
# owner of the directories
username='www-data';
# read arg0 and arg1
arg0=$0;
arg1=$1;
if [[ -z $arg1 ]]; then
echo "Missing domain name";
exit 1;
fi

 

# skele template
echo "#
#  Example.com (/etc/apache2/sites-available/www.skele.com)
#
<VirtualHost *>
        ServerAdmin admin@design.bg
        ServerName  skele.com
        ServerAlias www.skele.com


        # Indexes + Directory Root.
        DirectoryIndex index.php index.htm index.html index.pl index.cgi index.phtml index.jsp index.py index.asp

        DocumentRoot /var/www/jail/home/www-data/sites/skelecom/www/docs
        ScriptAlias /cgi-bin "/var/www/jail/home/www-data/sites/skelecom/cgi-bin"
        
        # Logfiles
        ErrorLog  /var/log/apache2/sites/skelecom/error.log
        CustomLog /var/log/apache2/sites/skelecom/access.log combined
#       CustomLog /dev/null combined
      <Directory /var/www/jail/home/www-data/sites/skelecom/www/docs/>
                Options FollowSymLinks MultiViews -Includes
                AllowOverride None
                Order allow,deny
                allow from all
                # This directive allows us to have apache2's default start page
                # in /apache2-default/, but still have / go to the right place
#               RedirectMatch ^/$ /apache2-default/
        </Directory>

        <Directory /var/www/jail/home/www-data/sites/skelecom/cgi-bin/>
                Options FollowSymLinks ExecCGI -Includes
                AllowOverride None
                Order allow,deny
                allow from all
        </Directory>

</VirtualHost>
" > $skele_dir/skele.com;

domain_dir=$(echo $arg1 | sed -e 's/\.//g');
new_site_dir=$sites_base_dir/$domain_dir/www/docs;
echo "Creating $new_site_dir";
mkdir -p $new_site_dir;
mkdir -p $sites_base_dir/$domain_dir/cgi-bin;
echo "Creating site's Docroot and CGI directory";
chown -R $username:$username $new_site_dir;
chown -R $username:$username $sites_base_dir/$domain_dir/cgi-bin;
echo "Creating site's Log files Directory";
mkdir -p $cr_sep_log_file_d/$domain_dir;
echo "Creating site's VirtualHost file and adding it for startup";
sed -e "s#skele.com#$arg1#g" -e "s#skelecom#$domain_dir#g" $skele_dir/skele.com >> $skele_dir/$arg1;
ln -sf $skele_dir/$arg1 /etc/apache2/sites-enabled/;
echo "All completed. Please restart apache (/etc/init.d/apache2 restart) to load the new virtual domain";

# Date Fri Jan 11 16:27:38 EET 2008


Using the script saves a lot of time compared to manually copying a vhost file and then editing it to change the ServerName directive. For vhosts whose configuration is identical and where only the ServerName listener has to change, it is perfect for creating all the necessary domains. I've created a simple text file with each of the domains and ran the script in a loop:
 

while read -r i; do sh add_new_vhost.sh $i; done < domain_list.txt
 

 

Install and use personal Own Cloud on Debian Linux for better shared data security – OwnCloud a Free Software replacement for Google Drive

Thursday, August 23rd, 2018

owncloud-self-hosted-cloud-file-sharing-and-storage-service-for-gnu-linux-howto-install-on-debian

Basically I am against the use of any Cloud type of service, but nowadays Cloud usage is almost inevitable: most of the time you need some kind of service to store and access your data remotely from multiple devices (such as DropBox, Google Drive, iCloud etc.), and the paid Private Cloud services online are booming. So I decided to research and test what is available as free software in the field of Clouding (your data) 🙂

Undoubtedly, it is a really nice fact that there are Free Software / Open Source alternatives to run your own personal Cloud and store your data from multiple locations in a single point.

The most popular and leading Cloud Collaboration service (which is Open Source but unfortunately not under GPLv2 / GPLv3 – e.g. not fully free software) is OwnCloud.

ownCloud is a flexible self-hosted PHP and Javascript based web application used for data synchronization and file sharing (its remote file access capabilities are realized by Sabre/Dav, an open source WebDAV server).
OwnCloud allows the end user to easily store / manage files, Calendars, Contacts and To-Do lists (with user and group administration via OpenID and LDAP). Public URLs can be easily created, users can interact with a browser-based ODF (Open Document Format) word processor, and there is a Bookmarking and URL Shortening service integrated, a Gallery, an RSS Feed and Document Viewer tools such as a PDF viewer etc., which makes it a great alternative to the popular Google Drive, iCloud, DropBox etc.

The main advantage of using a self-hosted Cloud is that your data is hosted and managed by you (on your server and your hard drives) and not by some God knows who third party provider such as the above-mentioned.
In other words, by using OwnCloud you manage your own data and you don't share it on demand with the Security Agencies – the CIA, MI6, Mossad … (as it is very likely that most of the publicly offered Cloud storage services keep track of the data stored on them).

The other disadvantage of Cloud Computing is that the stored data is usually spread across multiple servers and you can never know for sure where your data is physically located, which in my opinion is way worse than the option of a Self-Hosted Cloud, where you know where your data lives and you can do whatever you want with it – keep it secret, delete it or share it on your demand.

OwnCloud has clients for the most popular Mobile (Smart Phone) platforms – an Android client is available in the Google Play Store as well as one in Apple iTunes – besides the clients available for FreeBSD OS, the GNOME desktop integration package and Raspberry Pi.

For those who are looking for additional advanced features an Enterprise version of OwnCloud is also available aiming business use and included software support.

Assuming you have a homebrew server or have hired a dedicated or VPS server (such as the ones we provide), installing OwnCloud on GNU / Linux is a relatively easy task and it will take no more than 15 minutes to 2 hours of your life.
In this article I am going to give you specific instructions on how to install on Debian GNU / Linux 9, but installing on RPM based distros is a similar and straightforward process.
 

1. Install MySQL / MariaDB database server backend
 

By default OwnCloud uses SQLite as its backend data storage, but as SQLite stores its data in a file it quickly becomes slow, and it is generally speaking slower than relational databases such as the MariaDB server (or the now almost obsolete MySQL Community server).
Hence in this article I will explain how to install OwnCloud with MariaDB as a backend.

If you don't have it installed already, e.g. it is a new dedicated server, install MariaDB with:
 

server:~# apt-get install --yes mariadb-server


Assuming you're installing on a brand new, fresh Linux install, you might also want to run the following to start the service at boot and secure it:

 

server:~# systemctl start mariadb
server:~# systemctl enable mariadb
server:~# mysql_secure_installation


mysql_secure_installation is used to finalize and secure the MariaDB installation and to set the root password.
 

2. Create the necessary database and users for OwnCloud in the database server
 

linux:~# mysql -u root -p
MariaDB [(none)]> CREATE DATABASE owncloud CHARACTER SET utf8;
MariaDB [(none)]> GRANT ALL PRIVILEGES ON owncloud.* TO 'owncloud'@'localhost' IDENTIFIED BY 'owncloud_passwd';
MariaDB [(none)]> FLUSH PRIVILEGES;
MariaDB [(none)]> \q
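
Before moving on, it is worth verifying that the newly created credentials actually work:

linux:~# mysql -u owncloud -p owncloud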

 

3. Install the Apache + PHP necessary deb packages
 

As of the time of writing this article, on Debian 9.0 the required packages for a working Apache + PHP install for OwnCloud are as follows.

 

server:~# apt-get install --yes apache2 mariadb-server libapache2-mod-php7.0 \
openssl php-imagick php7.0-common php7.0-curl php7.0-gd \
php7.0-imap php7.0-intl php7.0-json php7.0-ldap php7.0-mbstring \
php7.0-mcrypt php7.0-mysql php7.0-pgsql php-smbclient php-ssh2 \
php7.0-sqlite3 php7.0-xml php7.0-zip php-redis php-apcu

 

4. Install Redis to use as a Memory Cache for accelerated / better performance ownCloud service


Redis is an in-memory key-value database similar to Memcached, so OwnCloud can use it to cache stored data files. To install the latest redis-server on Debian 9:
 

server:~# apt-get install --yes redis-server

5. Install ownCloud software packages on the server

Unfortunately, the default package repositories on Debian 9 do not provide owncloud server packages – only some owncloud-client packages are provided, perhaps because the packages issued by owncloud do not match the Debian packaging requirements.

As of the time of writing this article, the latest available OwnCloud server version package for Debian is OC 10.

a) Add the necessary GPG keys

The repositories to use are provided by owncloud.org; to use them we need to first add the necessary gpg key to verify the binaries have a legit checksum.
 

server:~# wget -qO- https://download.owncloud.org/download/repositories/stable/Debian_9.0/Release.key | sudo apt-key add -

 

b) Add the owncloud.org repositories in a separate sources.list file

 

server:~# echo 'deb https://download.owncloud.org/download/repositories/stable/Debian_9.0/ /' | sudo tee /etc/apt/sources.list.d/owncloud.list

 

c) Enable https transports for the apt install tool

 

server:~# apt-get --yes install apt-transport-https

 

d) Update the Debian apt cache list files and install the package

 

server:~# apt-get update

 

server:~# apt-get install --yes owncloud-files

 

By default the owncloud file store location is /var/www/owncloud, but on many servers that location is not really appropriate, because /var/www might be situated on a hard drive partition whose size is not big enough. If that's the case, just move the folder to another partition and create a symbolic link at /var/www/owncloud pointing to it, as sketched below.
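
A minimal sketch of such a move, assuming /home happens to be the partition with enough space (stop Apache first, so OwnCloud does not write to the folder mid-move):

server:~# /etc/init.d/apache2 stop
server:~# mv /var/www/owncloud /home/owncloud
server:~# ln -s /home/owncloud /var/www/owncloud
server:~# /etc/init.d/apache2 start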


6. Create the necessary Apache configurations to make your new self-hosted cloud accessible
 

a) Create Apache config file

 

server:~# vim /etc/apache2/sites-available/owncloud.conf

 

 

Alias /owncloud "/var/www/owncloud/"

<Directory /var/www/owncloud/>
Options +FollowSymlinks
AllowOverride All

<IfModule mod_dav.c>
Dav off
</IfModule>

SetEnv HOME /var/www/owncloud
SetEnv HTTP_HOME /var/www/owncloud

</Directory>
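
Since the file was created under sites-available, it also has to be enabled (assuming the stock Debian Apache layout):

server:~# a2ensite owncloud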

b) Enable Mod_Dav (WebDAV) if it is not enabled yet

 

server:~# cd /etc/apache2/mods-enabled
server:~# ln -sf ../mods-available/dav_fs.conf
server:~# ln -sf ../mods-available/dav_fs.load
server:~# ln -sf ../mods-available/dav.load
server:~# ln -sf ../mods-available/dav_lock.load

c) Set proper permissions for /var/www/owncloud to make upload work properly

 

chown -R www-data: /var/www/owncloud/


d) Restart the Apache WebServer (to make the new configuration effective)

 

 

server:~# /etc/init.d/apache2 restart


7. Finalize OwnCloud Install
 

Access the OwnCloud Web Interface to finish the database creation and set the administrator password for the new Self-Hosted cloud:
 

http://Your_server_ip_address/owncloud/

By default the Web interface is accessible over unencrypted (insecure) http://. It is a recommended practice (if you don't already have an HTTPS SSL certificate installed for the IP or the domain) to install one – either a self-signed certificate or, even better, use the LetsEncrypt CertBot to easily create a valid SSL certificate for free for your domain.

 

installing-OwnCloud-Web-Config-User-Pass-interface-Owncloud-10-on-Debian-9-Linux-howto

Just fill in your desired user / pass and pass on the database user / password / db name (if required you can also set a different location for the data directory than the default one, /var/www/owncloud/data).

Click Finish Setup and That's all folks!

owncloud-server-web-ui-interface

OwnCloud is now successfully installed on the server; you can go and download a Mobile App or Desktop application for whatever OS you're using and start using it as a Dropbox replacement. At a certain moment you might also want to consult the official UserManual documentation, as you will probably need further information on how to manage your owncloud.

Enjoy !

How to install / add new root certificates on Debian, Ubuntu, Mint Linux

Saturday, October 21st, 2017

add-install-new-root-ca-certificates-to-debian-ubuntu-linux-howto

How to add / install a root / CA Certificate on Debian, Ubuntu, Mint Linux

 


Because of various auditing failures and other security issues, the CAcert root certificate set is slowly disappearing from the Ubuntu and Debian 'ca-certificates' package.

That's really tricky, because if you're a system administrator, or you have a bunch of programmers who need to install a new set of root certificates for their freshly developed Application, or you have to add corporate certificates to the Debian root CA store, then the good news is that it is quite easy to install new certificates on deb based distributions.

 

Given a CA certificate file foo.crt, follow these steps to install it on Debian / Ubuntu:

    Create a directory for extra CA certificates in /usr/share/ca-certificates:
 

 

    debian:~# mkdir /usr/share/ca-certificates/extra-certificates

 

    Copy the CA .crt file to this directory:
 

 

    debian:~# cp foo.crt /usr/share/ca-certificates/extra-certificates/foo.crt

 

    Let Debian / Ubuntu add the .crt file's path relative to /usr/share/ca-certificates to /etc/ca-certificates.conf (the file lists the certificates that you wish to use or to ignore when installing them into /etc/ssl/certs):
 

 

    debian:~# dpkg-reconfigure ca-certificates

 

In case you want to include a .pem file in the list of trustable certificates on Debian / Ubuntu, it must first be converted to a .crt file; you can do that with:
 

 

    debian:~# openssl x509 -in foo.pem -inform PEM -out foo.crt
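
    To double check the conversion went fine, you can print the certificate subject and validity dates:

    debian:~# openssl x509 -in foo.crt -noout -subject -dates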

 


Let's say you want to add some custom Root certificate, for example cacert.org:

 

 

 

   debian:~# mkdir /usr/local/share/ca-certificates/cacert.org
   debian:~# cd /usr/local/share/ca-certificates/cacert.org
   debian:~# wget -P /usr/local/share/ca-certificates/cacert.org http://www.cacert.org/certs/root.crt http://www.cacert.org/certs/class3.crt

 

 

 

Then once again update the CA certificates bundle:

   debian:~# update-ca-certificates
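
A quick sanity check is to count the certificates in the regenerated bundle – the number should have grown by the two CAcert certificates just added:

   debian:~# awk '/BEGIN CERT/{n++} END{print n}' /etc/ssl/certs/ca-certificates.crt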

 

OSCommerce how to change / reset lost admin password

Monday, October 16th, 2017

reset-forgotten-lost-oscommerce-password-howto-Os_commerce-logo.svg

How to change / reset OSCommerce lost / forgotten admin password?

The password in OSCommerce is kept in the "admin" table, so to reset the password connect to MySQL with the mysql cli client.

The first thing to do is to generate the new hash string; you can do that with a simple php script using the md5() function:

 

root@pcfreak:/var/www/files# cat 1.php
<?php
$pass=md5('password');
echo $pass;
?>

 

root@pcfreak:/var/www/files# php 1.php
5f4dcc3b5aa765d61d8327deb882cf99
root@pcfreak:/var/www/files#
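
The same hash can also be produced without a script file, straight from the command line with php -r:

root@pcfreak:/var/www/files# php -r "echo md5('password');"
5f4dcc3b5aa765d61d8327deb882cf99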

 

The string we just generated (for the plain-text password password) is the hash: 5f4dcc3b5aa765d61d8327deb882cf99

Next, to update the new hash string in SQL, we connect to MySQL:

 

$ mysql -u root -p

 


And issue the following command to modify the encrypted hash string (replace DB with your database name and the admin_id value with the one of your admin account):

 

UPDATE `DB`.`admin` SET `admin_password` = '5f4dcc3b5aa765d61d8327deb882cf99' WHERE `admin`.`admin_id` = 6;
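
If you are unsure which admin_id your administrator account has, list the existing accounts first and adjust the WHERE clause accordingly:

SELECT admin_id FROM `DB`.`admin`;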

Block Web server overloading Bad Crawler Bots and Search Engine Spiders with .htaccess rules

Monday, September 18th, 2017

howto-block-webserver-overloading-bad-crawler-bots-spiders-with-htaccess-modrewrite-rules-file

In my last post, I've talked about the problem of Search Index Crawler Robots aggressively crawling websites and how to stop them (the article is here), explaining how to raise delays between Bot URL requests to a website and how to completely prohibit some bots from crawling with robots.txt.

As explained in that article, the consequence of too many badly written or aggressively behaving Spiders is "server stoning" and therefore degraded Web Server performance, or even a short-time Denial of Service, depending on how well the initial Server Scaling was done.

The bots we want to filter are not to be confused with the legitimate bots that drive real traffic to your website. Just for information, the 10 Most Popular WebCrawler Bots as of the time of writing are:
 

1. GoogleBot (the Google Crawler bots; funnily, the bots become less active on Saturdays and Sundays 🙂)

2. BingBot (Bing.com Crawler bots)

3. SlurpBot (also famous as Yahoo! Slurp)

4. DuckDuckBot (the crawler bots of the privacy-focused search engine duckduckgo.com)

5. Baiduspider (the crawler of China's most famous search engine, used as a substitute for Google in China)

6. YandexBot (Russian Yandex Search engine crawler bots, used in Russia as a substitute for Google)

7. Sogou Spider (a leading Chinese Search Engine launched in 2004)

8. Exabot (a French Search Engine, launched in 2000, crawler for the ExaLead Search Engine)

9. FaceBot (Facebook External hit; this crawler crawls a certain webpage only once a user shares or pastes a link to video, music, a blog or whatever in chat to another user)

10. Alexa Crawler (ia_archiver is the web crawler for Amazon's Alexa Internet Rankings; Alexa is a great site to evaluate approximate page popularity on the internet – the Alexa SiteInfo page has historically been the Swiss Army knife for anyone wanting to quickly evaluate a webpage's approximate ranking compared to other pages)

The legitimate bots above are known to follow most if not all of the W3C – World Wide Web Consortium (W3.Org) standards, and therefore respect the content commands for allowance or restrictions on a single site as given in robots.txt. Unfortunately, many of the so-called Bad-Bots or Mirroring scripts that are burning your Web Server CPU and Memory, mentioned in the previous article, follow the /robots.txt prescriptions either not at all or only partially.

Hence, for the bots that don't respect robots.txt, the only way to get rid of most of the webspiders that are just loading your bandwidth and server hardware is to filter / block them using Apache's mod_rewrite through a

 

.htaccess


file

Create, if not existing, in the DocumentRoot of your website a .htaccess file with whatever text editor, or create it on your Windows / Mac OS desktop and transfer it via FTP / SecureFTP to the server.

I prefer to do it directly on the server with vim (a text editor):

 

 

vim /var/www/sites/your-domain.com/.htaccess

 

RewriteEngine On

IndexIgnore .htaccess */.??* *~ *# */HEADER* */README* */_vti*

SetEnvIfNoCase User-Agent "^Black Hole” bad_bot
SetEnvIfNoCase User-Agent "^Titan bad_bot
SetEnvIfNoCase User-Agent "^WebStripper" bad_bot
SetEnvIfNoCase User-Agent "^NetMechanic" bad_bot
SetEnvIfNoCase User-Agent "^CherryPicker" bad_bot
SetEnvIfNoCase User-Agent "^EmailCollector" bad_bot
SetEnvIfNoCase User-Agent "^EmailSiphon" bad_bot
SetEnvIfNoCase User-Agent "^WebBandit" bad_bot
SetEnvIfNoCase User-Agent "^EmailWolf" bad_bot
SetEnvIfNoCase User-Agent "^ExtractorPro" bad_bot
SetEnvIfNoCase User-Agent "^CopyRightCheck" bad_bot
SetEnvIfNoCase User-Agent "^Crescent" bad_bot
SetEnvIfNoCase User-Agent "^Wget" bad_bot
SetEnvIfNoCase User-Agent "^SiteSnagger" bad_bot
SetEnvIfNoCase User-Agent "^ProWebWalker" bad_bot
SetEnvIfNoCase User-Agent "^CheeseBot" bad_bot
SetEnvIfNoCase User-Agent "^Teleport" bad_bot
SetEnvIfNoCase User-Agent "^TeleportPro" bad_bot
SetEnvIfNoCase User-Agent "^MIIxpc" bad_bot
SetEnvIfNoCase User-Agent "^Telesoft" bad_bot
SetEnvIfNoCase User-Agent "^Website Quester" bad_bot
SetEnvIfNoCase User-Agent "^WebZip" bad_bot
SetEnvIfNoCase User-Agent "^moget/2.1" bad_bot
SetEnvIfNoCase User-Agent "^WebZip/4.0" bad_bot
SetEnvIfNoCase User-Agent "^WebSauger" bad_bot
SetEnvIfNoCase User-Agent "^WebCopier" bad_bot
SetEnvIfNoCase User-Agent "^NetAnts" bad_bot
SetEnvIfNoCase User-Agent "^Mister PiX" bad_bot
SetEnvIfNoCase User-Agent "^WebAuto" bad_bot
SetEnvIfNoCase User-Agent "^TheNomad" bad_bot
SetEnvIfNoCase User-Agent "^WWW-Collector-E" bad_bot
SetEnvIfNoCase User-Agent "^RMA" bad_bot
SetEnvIfNoCase User-Agent "^libWeb/clsHTTP" bad_bot
SetEnvIfNoCase User-Agent "^asterias" bad_bot
SetEnvIfNoCase User-Agent "^httplib" bad_bot
SetEnvIfNoCase User-Agent "^turingos" bad_bot
SetEnvIfNoCase User-Agent "^spanner" bad_bot
SetEnvIfNoCase User-Agent "^InfoNaviRobot" bad_bot
SetEnvIfNoCase User-Agent "^Harvest/1.5" bad_bot
SetEnvIfNoCase User-Agent "Bullseye/1.0" bad_bot
SetEnvIfNoCase User-Agent "^Mozilla/4.0 (compatible; BullsEye; Windows 95)" bad_bot
SetEnvIfNoCase User-Agent "^Crescent Internet ToolPak HTTP OLE Control v.1.0" bad_bot
SetEnvIfNoCase User-Agent "^CherryPickerSE/1.0" bad_bot
SetEnvIfNoCase User-Agent "^CherryPicker /1.0" bad_bot
SetEnvIfNoCase User-Agent "^WebBandit/3.50" bad_bot
SetEnvIfNoCase User-Agent "^NICErsPRO" bad_bot
SetEnvIfNoCase User-Agent "^Microsoft URL Control – 5.01.4511" bad_bot
SetEnvIfNoCase User-Agent "^DittoSpyder" bad_bot
SetEnvIfNoCase User-Agent "^Foobot" bad_bot
SetEnvIfNoCase User-Agent "^WebmasterWorldForumBot" bad_bot
SetEnvIfNoCase User-Agent "^SpankBot" bad_bot
SetEnvIfNoCase User-Agent "^BotALot" bad_bot
SetEnvIfNoCase User-Agent "^lwp-trivial/1.34" bad_bot
SetEnvIfNoCase User-Agent "^lwp-trivial" bad_bot
SetEnvIfNoCase User-Agent "^Wget/1.6" bad_bot
SetEnvIfNoCase User-Agent "^BunnySlippers" bad_bot
SetEnvIfNoCase User-Agent "^Microsoft URL Control – 6.00.8169" bad_bot
SetEnvIfNoCase User-Agent "^URLy Warning" bad_bot
SetEnvIfNoCase User-Agent "^Wget/1.5.3" bad_bot
SetEnvIfNoCase User-Agent "^LinkWalker" bad_bot
SetEnvIfNoCase User-Agent "^cosmos" bad_bot
SetEnvIfNoCase User-Agent "^moget" bad_bot
SetEnvIfNoCase User-Agent "^hloader" bad_bot
SetEnvIfNoCase User-Agent "^humanlinks" bad_bot
SetEnvIfNoCase User-Agent "^LinkextractorPro" bad_bot
SetEnvIfNoCase User-Agent "^Offline Explorer" bad_bot
SetEnvIfNoCase User-Agent "^Mata Hari" bad_bot
SetEnvIfNoCase User-Agent "^LexiBot" bad_bot
SetEnvIfNoCase User-Agent "^Web Image Collector" bad_bot
SetEnvIfNoCase User-Agent "^The Intraformant" bad_bot
SetEnvIfNoCase User-Agent "^True_Robot/1.0" bad_bot
SetEnvIfNoCase User-Agent "^True_Robot" bad_bot
SetEnvIfNoCase User-Agent "^BlowFish/1.0" bad_bot
SetEnvIfNoCase User-Agent "^JennyBot" bad_bot
SetEnvIfNoCase User-Agent "^MIIxpc/4.2" bad_bot
SetEnvIfNoCase User-Agent "^BuiltBotTough" bad_bot
SetEnvIfNoCase User-Agent "^ProPowerBot/2.14" bad_bot
SetEnvIfNoCase User-Agent "^BackDoorBot/1.0" bad_bot
SetEnvIfNoCase User-Agent "^toCrawl/UrlDispatcher" bad_bot
SetEnvIfNoCase User-Agent "^WebEnhancer" bad_bot
SetEnvIfNoCase User-Agent "^TightTwatBot" bad_bot
SetEnvIfNoCase User-Agent "^suzuran" bad_bot
SetEnvIfNoCase User-Agent "^VCI WebViewer VCI WebViewer Win32" bad_bot
SetEnvIfNoCase User-Agent "^VCI" bad_bot
SetEnvIfNoCase User-Agent "^Szukacz/1.4" bad_bot
SetEnvIfNoCase User-Agent "^QueryN Metasearch" bad_bot
SetEnvIfNoCase User-Agent "^Openfind data gathere" bad_bot
SetEnvIfNoCase User-Agent "^Openfind" bad_bot
SetEnvIfNoCase User-Agent "^Xenu’s Link Sleuth 1.1c" bad_bot
SetEnvIfNoCase User-Agent "^Xenu’s" bad_bot
SetEnvIfNoCase User-Agent "^Zeus" bad_bot
SetEnvIfNoCase User-Agent "^RepoMonkey Bait & Tackle/v1.01" bad_bot
SetEnvIfNoCase User-Agent "^RepoMonkey" bad_bot
SetEnvIfNoCase User-Agent "^Zeus 32297 Webster Pro V2.9 Win32" bad_bot
SetEnvIfNoCase User-Agent "^Webster Pro" bad_bot
SetEnvIfNoCase User-Agent "^EroCrawler" bad_bot
SetEnvIfNoCase User-Agent "^LinkScan/8.1a Unix" bad_bot
SetEnvIfNoCase User-Agent "^Keyword Density/0.9" bad_bot
SetEnvIfNoCase User-Agent "^Kenjin Spider" bad_bot
SetEnvIfNoCase User-Agent "^Cegbfeieh" bad_bot

 

<Limit GET POST>
order allow,deny
allow from all
Deny from env=bad_bot
</Limit>
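
For those who prefer pure mod_rewrite over the SetEnvIfNoCase + Limit approach above, an equivalent blocking rule set would look like the following sketch (extend the RewriteCond list with the rest of the User-Agents as needed):

RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} ^WebStripper [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^EmailCollector [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^WebZip [NC]
RewriteRule .* - [F,L]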

 


The rules above are Bad bot prohibition rules and include the RewriteEngine On directive; however, for many websites this directive is enabled directly in the VirtualHost section for the domain/s. If that is your case you might also remove RewriteEngine On from .htaccess, and the bad bot prohibition rules should still continue to work.
The above rules are also perfectly suitable for WordPress-based websites / blogs in case you need to filter out obstructive spiders, though the rules will work on any website domain with mod_rewrite enabled.

Once you have implemented the above rules, you will not need to restart Apache, as .htaccess is read dynamically on each client request to the Webserver.

2. Testing .htaccess Bad Bots Filtering Works as Expected


In order to test that the new Bad Bot filtering configuration is working properly, there is a manual and more complicated way with lynx (the text browser), assuming you have shell access to a Linux / BSD / *Nix computer, or you have your own *NIX server / desktop computer running.
 

Here is how:
 

 

lynx -useragent="Mozilla/5.0 (compatible; MegaIndex.ru/2.0; +http://megaindex.com/crawler)" -head -dump http://www.your-website-filtering-bad-bots.com/

 

 

Note that lynx will provide a warning such as:

Warning: User-Agent string does not contain "Lynx" or "L_y_n_x"!

Just ignore it and press enter to continue.

Two other use cases with lynx that I have historically used heavily are to pretend with Lynx that you're GoogleBot, in order to see how Google actually sees your website:
 

  • Pretend with Lynx You're GoogleBot

 

lynx -useragent="Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)" -head -dump http://www.your-domain.com/

 

 

  • How to Pretend with Lynx Browser You are GoogleBot-Mobile

 

lynx -useragent="Mozilla/5.0 (iPhone; U; CPU iPhone OS 4_1 like Mac OS X; en-us) AppleWebKit/532.9 (KHTML, like Gecko) Version/4.0.5 Mobile/8B117 Safari/6531.22.7 (compatible; Googlebot-Mobile/2.1; +http://www.google.com/bot.html)" -head -dump http://www.your-domain.com/

 


Or, for the lazy ones that don't have Linux / *Nix at their disposal, you can use the WannaBrowser website.

WannaBrowser is a web based browser emulator which gives you the ability to change the User-Agent on each website request, so just set your User-Agent to any bot browser that we just filtered, for example set the User-Agent to CheeseBot.

Once the .htaccess rules added earlier detect that your browser client is coming in with a prohibited browser agent, the request will immediately be filtered out and you'll be unable to access the website, with a message like:
 

HTTP/1.1 403 Forbidden
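
If you have curl at hand, the same check is a one-liner – request the site with one of the filtered User-Agents and watch for the 403 in the response headers:

curl -A "CheeseBot" -I http://www.your-website-filtering-bad-bots.com/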

 

As I've talked a lot about Index Bots, I think it is worth also mentioning three great websites that can give you a lot of up-to-date information on exact Spider user-agents, commonly known Bot traits, as well as a current updated list of the Bad Bots etc.

Bot and Browser information resources on user-agents, bad-bots and odd Crawlers and Bots specifics:

1. botreports.com
2. user-agents.org
3. useragentapi.com

 

An updated list of robot user-agents (crawler-user-agents) is also available on github here, regularly updated by Caia Almeido.

There are also third-party plugins (modules) available for Website Platforms like WordPress / Joomla / Typo3 etc.

Besides the ones listed on these websites, as well as the known Bad and Good Bots, there are perhaps hundreds of others that might end up crawling your website and that might or might not need to be filtered. Therefore, before proceeding with any filtering steps, it is generally a good idea to monitor your HTTPD access.log / error.log, as if you happen to somehow mistakenly filter the wrong bot, this might be a reason for Website Indexing Problems.

Hope this article gives you some valuable information. Enjoy ! 🙂

 

Briefly unavailable for scheduled maintenance. Fix WordPress after interrupted upgrade

Thursday, March 2nd, 2017

briefly-unavaiable-for-a-scheduled-maintenance-wordpress-website-fix-howto_1

I've recently tried to Update my WordPress blog sites and, being unattentive, I selected all the plugins possible for Upgrade by checking the "Select All" check box on the Update dialogs, and in the hurry almost automatically pressed the Update button. However, all of a sudden I realized I could screw up my websites brutally, as some of the plugins to be upgraded might lack 100% compatibility with their prior versions.

I've made a mess out of my blog many times during upgrades because of choosing to upgrade the wrong, not 100% compatible plugins, and I know well how painful and hard to track a misbehaving incompatible plugin could be, or how it could cause severe sluggishness to the blog, which automatically reflects on how well the website is ranked and indexed in Google / Yahoo / Bing etc.

Thus, as an almost unconscious reaction to prevent future troubles, I tried to cancel the update request in the Firefox browser and reload the Update page, hoping I might be quick enough to catch the Apache / WP / MySQL backend Update queries before processing, but I was too slow and bang! I ended up with the following unpleasant message in my browser:

briefly-unavaiable-for-a-scheduled-maintenance-wordpress-website-fix-howto
Briefly unavailable for scheduled maintenance.
 

As you could guess, that message caused me quite a lot of worries, especially since I've already broken my sites many times by doing quick unmindful reactions, and since there are Google Adsense ads appearing which do give me some Return on Investment cents every now and then …

It took me a few minutes of research online to find out what really happened and how to fix / restore the websites' normal operations.
 


So what causes the "Briefly unavailable for scheduled maintenance." message to appear?

When WordPress does some of its integrated maintenance jobs – a plugin enable / disable or any task that has to modify crucial configurations inside the database – WordPress disables access for all end clients to itself in order to protect its sensitive data, as showing some unexpected information to an end client browser could later be used by crackers / hackers or possibly open a security hole for an attacker.

The message is a wordpress generated notice and it is pretty normal for the end user to see it during a WP site installation update. Depending on how many plugins are installed and loaded on the site, and how long it takes for the backend Linux / Windows server to fetch the archived .zips of plugins, substitute them with the new ones by extracting them to wp-content/plugins, and update the respective required SQL database / tables, it could be showing to end users from a few secs up to a few minutes.

However, under some circumstances – a browser request timeout to the remote wordpress site due to a network connectivity issue, a bad Apache configuration for request timeouts (or a slow remote server Apache response time due to server Hardware / Mem overload), or a stupid browser "Stop" / cancel request like in my case – you end up with the Briefly unavailable for scheduled maintenance message and you can no longer access your https://siteurl.com/wp-admin Admin Panel.

The message is triggered by a WP created file, .maintenance, inside /var/www/blog-site/ – e.g. the WordPress PHP scripts check for the existence of /var/www/blog-site/.maintenance and, if it is present, generate the Briefly unavailable … message.


How to resolve the "Briefly unavailable for scheduled maintenance. Check back in a minute" WordPress error ?

As you might guess, removing the maintenance "coming soon" like message in most of the cases comes down to just deleting the .maintenance file. To do so:

1. Login to the remote server via FTP or SFTP
2. Locate your WP website root folder, which should be something like /var/www/blog-site/, and issue:
 

$ rm -f /var/www/blog-site/.maintenance


Assuming that no plugin Update .zip extraction or SQL update query ended up half installed / executed, that should solve the error.
To check whether all is back to normal, just refresh your browser pointing to the "broken" site. If it appears well you can thank God for that 🙂
If not, check the apache error logs and php error logs to see which of the php scripts is failing, then try to manually fetch and unzip the WP .zip package to the wp-content/plugins folder and give it another try, and if God blesses so, it will work as before 🙂

How to prevent your WP based business from such nasty errors in the future, using a Staging (test) version of your blog?

Just run a duplicate of your website under a separate folder on your hosting, enable the same plugins as on the primary website and copy over the MySQL / PostgreSQL Database from your Live site to the Staging one. Then, before doing any crucial WordPress version updates or Plugin Updates, always try the Upgrade first on your Test Staging site. If it executes fine there, in most cases the result should be the same on the Production host, and that could save you the effort and nerves of debugging hard-to-track failure errors or faulty plugins without affecting what your end users see.

If you're not hosting the WordPress install under your own hosting like me, you can always use some of the publicly available hostings like BlueHost or WPEngine.

 

How to show country flag, web browser type and Operating System in WordPress Comments

Wednesday, February 15th, 2012

!!! IMPORTANT UPDATE: COMMENT INFO DETECTOR IS NO LONGER SUPPORTED (IT IS OBSOLETE) AND THE COUNTRY FLAGS AND OPERATING SYSTEM WILL NOT BE SHOWING ANYMORE.

!!!! TO MAKE THE COUNTRY FLAGS AND OS WP FUNCTIONALITY WORK AGAIN YOU WILL NEED TO INSTALL WP-USERAGENT !!!

I've come across a nice WordPress plugin that displays the country flag, operating system and web browser used in each posted blog comment.
It's a really nice plugin, since it adds some transparency and colorfulness to each blog comment 😉
Here is a screenshot of my blog with Comments Info Detector "in action":

Example of Comments Info Detector in Action on wordpress blog comments

Comments Info Detector, as of the time of writing, is at stable version 1.0.5.
The plugin installation and configuration is very easy, as with most other WP plugins. To install the plugin:

1. Download and unzip Comments Info Detector

linux:/var/www/blog:# cd wp-content/plugins
linux:/var/www/blog/wp-content/plugins:# wget http://downloads.wordpress.org/plugin/comment-info-detector.zip
...
linux:/var/www/blog/wp-content/plugins:# unzip comment-info-detector.zip
...

Just for the sake of preserving history, I've made a mirror of the comments-info-detector 1.0.5 wp plugin for download here.
2. Activate Comment-Info-Detector

To enable the plugin Navigate to;
Plugins -> Inactive -> Comment Info Detector (Activate)

After having enabled the plugin, as a last third step it has to be configured.

3. Configure comment-info-detector wp plugin

By default the plugin is disabled. Change it to enabled (configure it) by navigating to:

Settings -> Comments Info Detector

Next, a page will appear with various fields and web forms where stuff can be changed. Almost all of it should be left as it is; the only change should be in the drop-down menus near the end of the page:

Display Country Flags Automatically (Change No to Yes)
Display Web Browsers and OS Automatically (Change No to Yes)

Comments Info Detector WordPress plugin configuration Screenshot

After the two menus are set to "Yes" and Save Changes is pressed, the plugin is enabled and will immediately start showing inside each comment the GeoIP country location flag of the person who commented, as well as the OS type and Web Browser 🙂

Tools to scan a Linux / Unix Web server for Malware and Rootkits / Lynis and ISPProtect – clean Joomla / WordPress and other CMS for malware and malicious scripts and trojan codes

Monday, March 14th, 2016

Linux-BSD-Unix-Rootkit-Malware-XSS-Injection-spammer-scripts-clean-howto-manual

If you have been hacked, or are suspicious that someone has broken into some of the shared web hosting servers you happen to manage, you have probably already tried checking the server with rkhunter, chkrootkit and unhide – tools which give general guidance on whether a server has been compromised.

However, with the evolution of hacking tools out there and the boom of Web security issues – XSS / CSS / Database injections and PHP script vulnerabilities – catching an intruder, especially spammers, has been becoming more and more hard to achieve.

Just lately, the load average of a mail server of mine increased about 10 times, and the CPU and HDD I/O load jumped over the sky.
I started evaluating the situation to find out what exactly went wrong with the machine, starting with hardware analysis tools and a physical check-up of whether all was fine with the hardware Disks / RAM etc., just to find out the machine's hardware was working perfectly.
I also thoroughly investigated the logs of Apache, MySQL, TinyProxy, the Tor server, and the Bind and DJBDns DNS servers which were happily living there for quite some time, but didn't find anything strange.

Not in last place, I investigated the TOP processes (with the top command) and iostat, and realized the high CPU burst lay in excessive Input / Output on the Hard Drive. Checking the Qmail Mail server logs and the queue with qmail-qstat was a real surprise for me, as in the queue there were about 9800 emails hanging unsent, most of which were obviously spam, so I realized someone was heavily spamming through the server and started investigating more thoroughly. I ended up in a WordPress Blog temp folder (writable by all system users) which existed under a Joomla directory infrastructure, so I guess someone got hacked through the Joomla and uploaded the malicious php spammer script to the WordPress blog. I instantly stopped it – first a chmod 000 to stop it from being executed – and after examining, deleted view73.php, javascript92.php and index8239.php, which were full of PHP variables with binary encoded values; one was full of encoded strings which, after being decoded, turned out to be the spammed recipients' emails.
BTW, the view*.php, javascript*.php and index*.php files were owned by www-data (the user under which Apache ran), so obviously someone got hacked through some vulnerable joomla or wordpress script (the joomla there was a quite obscure version 1.5, where currently Joomla is at version branch 3.5), hence my guess is the spamming script was uploaded through a Joomla XSS vulnerability.

As I was unsure whether the scripts were not also mirrored under other subdirectories of the Joomla or WP Blog, I had to scan further to check whether there were no other scripts infected with malware or trojan spammer code, webshells, rootkits etc.
And after some investigation, I actually caught the 3 scripts being mirrored under other website folders with other numbering in the filenames: view34.php, javascript72.php, index8123.php etc.

I've used 2 tools to scan for and catch the malware / trojan scripts and to make sure no common rootkit was installed on the server:

1. Lynis (to check for rootkits)
2. ISPProtect (Proprietary but superb Website malware scanner with a free trial)

1. Lynis – Universal security auditing tool and rootkit scanner

Lynis comes from the author of the well-known rkhunter, which I've used earlier to check BSD and Linux servers for rootkits.
To have an up-to-date version of Lynis, I've installed it from source:
 

cd /tmp
wget https://cisofy.com/files/lynis-2.1.1.tar.gz
tar xvfz lynis-2.1.1.tar.gz
mv lynis /usr/local/
ln -s /usr/local/lynis/lynis /usr/local/bin/lynis

 


Then, to first make sure Lynis and its database are up to date, I ran:
 

lynis update info


Then to actually scan the system:
 

lynis audit system


Plenty of things will be scanned, but you will be asked multiple times whether you would like to scan different kinds of system services and log files, loadable kernel module rootkits, and common places to check for installed rootkits or server-placed backdoors. That's pretty annoying, as you will have to press Enter multiple times.

lynis-asking-to-scan-for-rootkits-backdoors-and-malware-your-linux-freebsd-netbsd-unix-server

Once scan is over you will get a System Scan Summary like in below screenshot:

lynis-scanned-server-for-rootkit-summer-results-linux-check-for-backdoors-tool

Lynis also suggests very good things that might be tuned to make the system more secure, so when I have time I'll use some of its output to work on hardening all the servers.

To prevent further incidents and keep an eye on the servers, I've deployed a Lynis scan via cron job once a month on all servers, placed in the root crontab to run on every first day of the month with the following command:

 

 

server:~# crontab -u root -e
0 3 1 * * /usr/local/bin/lynis audit system --quick 2>&1 | mail -s "lynis output of my server" admin-mail@my-domain.com

 

2. ISPProtect – Website malware scanner

ISPProtect is a malware scanner for web servers; I've used it to scan all installed CMS systems like WordPress, Joomla, Drupal etc.
ISPProtect is great for PHP / Python / Perl and other CMS based frameworks.
ISPProtect contains 3 scanning engines: a signature based malware scanner, a heuristic malware scanner, and a scanner that shows the installation directories of outdated CMS systems.
Unfortunately it is not free software, but I personally used the FREE TRIAL option, which can be used without registration to test it or to clean an infected system.
I first scanned the webserver locally for the infected site and then globally for all the other shared hosting websites.

As I wanted to also check the rest of the hosted websites, I've run ISPProtect over the whole bunch of installed websites.
A pre-requirement of ISPProtect is to have a working PHP CLI and the ClamAV Anti-Virus installed on the server; thus, on RHEL (RPM) based servers, make sure you have them installed, and if not:
 

server:~# yum -y install php

server:~# yum -y install clamav


Debian based Linux web hosting admins that don't have php-cli installed should run:
 

server:~# apt-get install php5-cli

server:~# apt-get install clamav


Installing ISPProtect from source is done with:

mkdir -p /usr/local/ispprotect
chown -R root:root /usr/local/ispprotect
chmod -R 750 /usr/local/ispprotect
cd /usr/local/ispprotect
wget http://www.ispprotect.com/download/ispp_scan.tar.gz
tar xzf ispp_scan.tar.gz
rm -f ispp_scan.tar.gz
ln -s /usr/local/ispprotect/ispp_scan /usr/local/bin/ispp_scan

 

To initiate a scan with ISPProtect just invoke it:
 

server:~# /usr/local/bin/ispp_scan

 

ispprotect-scan-websites-for-malware-and-infected-with-backdoors-or-spamming-software-source-code-files

I've used it as a trial

Please enter scan key:  trial
Please enter path to scan: /var/www

You will be shown the scan progress; be patient, because on a shared hosting server with a few hundred websites the tool will take really, really long, so you might need to leave it running for 1 hr or even more, depending on how many source / CSS / Javascript etc. files need to be scanned.

Once the scan is completed, the scan and infection logs will be stored under /usr/local/ispprotect, in separate files for the different Website Engines and CMSes:
 

Malware => /usr/local/ispprotect/found_malware_20161401174626.txt
Wordpress => /usr/local/ispprotect/software_wordpress_20161401174626.txt
Joomla => /usr/local/ispprotect/software_joomla_20161401174626.txt
Drupal => /usr/local/ispprotect/software_drupal_20161401174626.txt
Mediawiki => /usr/local/ispprotect/software_mediawiki_20161401174626.txt
Contao => /usr/local/ispprotect/software_contao_20161401174626.txt
Magentocommerce => /usr/local/ispprotect/software_magentocommerce_20161401174626.txt
Woltlab Burning Board => /usr/local/ispprotect/software_woltlab_burning_board_20161401174626.txt
Cms Made Simple => /usr/local/ispprotect/software_cms_made_simple_20161401174626.txt
Phpmyadmin => /usr/local/ispprotect/software_phpmyadmin_20161401174626.txt
Typo3 => /usr/local/ispprotect/software_typo3_20161401174626.txt
Roundcube => /usr/local/ispprotect/software_roundcube_20161401174626.txt


ISPProtect is really good in its results and is definitely the best malicious script / trojan / webshell / backdoor / spammer (hacking) script detection tool available, so if your company can afford it, you'd better buy a license and set up a periodic cron job scan of all your servers, like let's say:

 

server:~# crontab -u root -e
0 3 1 * * /usr/local/ispprotect/ispp_scan --update && /usr/local/ispprotect/ispp_scan --path=/var/www --email-results=admin-email@your-domain.com --non-interactive --scan-key=AAA-BBB-CCC-DDD


Unfortunately ispprotect is quite expensive, so I guess most small and middle sized shared hosting companies will be unable to afford it.
But even for a one-time run this tool is worth the try and will save you hours if not days of system investigation.
I'll be glad to hear from readers if they are aware of any available free software alternatives to ISPProtect. The only one I am aware of is Linux Malware Detect (LMD).
I've used LMD in the past, but as of the time of writing this article it doesn't seem to work any more, so I guess the tool is currently unsupported / obsolete.

 

Removing exim and installing qmail / Generate and install pseudo mta dummy package on Debian / Ubuntu etc. .deb based Linux

Thursday, March 10th, 2016

debian-dummy-mta-package-install-howto-tux-mail-nice-mascot
If you happen to be installing a Qmail Mail server on a Debian or Ubuntu (.deb) based Linux, you will notice that by default some kind of MTA (Mail Transport Agent) will already be installed (the mail-transport-agent package), and because of the Debian .deb package dependency on always having an MTA installed on the system, you will be unable to remove the Exim MTA without installing some other MTA (Postfix / Qmail) etc.

This will be a problem for those like me who prefer to compile and install Qmail from source; thus, to get around it, it is necessary to create a dummy package that will trick the deb packaging dependencies into believing that the mta-local MTA package is present on the server.

The way to go here is to use equivs (Circumvent debian package dependencies):
 

debian:~# apt-cache show equivs|grep -i desc -A 10

Description: Circumvent Debian package dependencies
 This package provides a tool to create trivial Debian packages.
 Typically these packages contain only dependency information, but they
 can also include normal installed files like other packages do.
 .
 One use for this is to create a metapackage: a package whose sole
 purpose is to declare dependencies and conflicts on other packages so
 that these will be automatically installed, upgraded, or removed.
 .
 Another use is to circumvent dependency checking: by letting dpkg
 think a particular package name and version is installed when it

Btw, creating a .deb dummy package will be necessary in many other cases, e.g. when you have to install from some third party debian repositories or some old and already unmaintained deb-src packages for the sake of resurrecting some archaic software somewhere, so sooner or later, even if you're not into Mail servers, you will certainly need equivs.

Then install equivs and proceed with creating the dummy mail-transport-agent package:
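
debian:~# apt-get install --yes equivs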
 

debian:~# cd /tmp
debian:~# cp -rpf /usr/share/doc/equivs/examples/mail-transport-agent.ctl .
debian:~# equivs-build mail-transport-agent.ctl


The above command will build the dummy package /tmp/mta-local_1.0_all.deb.
So continue and install it with dpkg, as you usually install debian packages:
 

 

debian:~# dpkg -i /tmp/mta-local_1.0_all.deb
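
You can verify the dummy package got registered in the package database with:

debian:~# dpkg -l | grep -i mta-local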


From then on you can continue your standard LWQ – Life with Qmail or any other source based qmail installation with:

 

 

./config-fast mail.yourmaildomain.net


So that's it – now the .deb packaging system consistency will be complete, so standard security package updates with apt-get and aptitude, or dpkg -i third party custom software installs, will not break up any more.

Hope that helped someone 🙂
Use mac PC built-in camera to make / take pictures on Mac OS X macbookair notebook with Photo Booth

Friday, February 26th, 2016

take-picture-or-video-with-built-in-camera-on-MacOSX-PhotoboothLOGO

It seems we lost our good high quality Digital Camera somewhere and I urgently needed to make a good quality photo (my ZTE Phone has a very bad camera), so I got the idea to use the Macbookair's camera, as it has a better resolution, to picture my present – a Tank Tort 🙂 hand made by my wife as a present for the Day of the Defender of the Fatherland, which is a major feast in Russia, Belarus and many of the ex-Soviet Union member countries.

Actually, using the built-in camera in a MacBookAir is a handy thing for people missing at the moment a good high quality digital camera; as it is thin and light, the built-in MacBook cam can be used to make Videos and Pictures exactly the same way an ordinary Tablet Computer is so commonly used nowadays by many:

In other words, I needed the Mac OS X equivalent of Cheese, the (Photo and Video) capture program for Linux.

Luckily for people interested in using their Mac OS notebook as an amateur camera, this is easy using the default shipped Mac Application called:

Photo Booth app

To launch the Photo Booth app, just look it up in Finder and double click it:

PhotoBooth-how-to-take-photos-on-macosx

Clicking the large red button underneath the preview area will take a picture after an optional countdown.

Tort Tank of Svetka

Besides being able to capture Video and Pictures from the Mac's camera, it can also add some nice effects to the taken pictures and videos (it supports basic video editing features and effects).

The effects you can choose from are: Sepia, Black and White, Glow, Comic Book, Normal, Colored Pencil, Thermal Camera, X-Ray, and Pop Art. There are also effects that change the person in the picture: Bulge, Dent, Twirl, Squeeze, Mirror, Light Tunnel, Fish Eye, and Stretch. Actually, the photographic filters of Photo Booth are very similar to Adobe Photoshop's.

make-picture-on-mac-photobooth-effects-screenshot
By default Photo Booth takes still pictures. Photo Booth saves your photos as JPEG files in a folder named Photo Booth, located in your home folder.

Choose File > Reveal in Finder

to see your picture files.

A much better way to be able to easily see and access all the taken Pictures and Videos with Photo Booth is to open a Terminal and type:
 

 

$ cd Pictures
$ ln -s Photo\ Booth\ Library/Pictures/ PhotoBoothPics


This will make the pictures easily accessible through a link in your Finder -> Pictures folder.
 
Applying custom photo backgrounds

A very useful feature of Photo Booth is that the user can apply backdrops to provide an effect similar to a green screen. When a backdrop is selected, a message appears telling the user to step away from the camera. Once the background is analyzed, the user steps back in front of the camera and is shown in front of the chosen backdrop.

For people who prefer to take photos using a console program on Mac OS, I guess you should take a look at ffmpeg.
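For example, assuming ffmpeg is installed (e.g. via Homebrew), you can list the available capture devices and then grab a single frame from the built-in camera – note that the device index may vary per machine:

$ ffmpeg -f avfoundation -list_devices true -i ""
$ ffmpeg -f avfoundation -i "0" -frames:v 1 photo.jpg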
Here is one more snapshot of the Tort Tank made with the Macbookair of Svetlana 🙂

Tort Tank


P.S. If you like the Tort Tank and you happen to live in Sofia, Bulgaria, you can order it by dropping me a comment with a request 🙂

Enjoy ! 🙂