Category Archives: Webhosting

How to check the current PHP handler and change it

[root@server ~]# /usr/local/cpanel/bin/rebuild_phpconf --current
Available handlers: suphp dso cgi none
PHP4 SAPI: none
PHP5 SAPI: cgi
SUEXEC: not installed

Change the handler by passing the default PHP version, the PHP4 handler, the PHP5 handler, and the suexec setting:

[root@server ~]# /usr/local/cpanel/bin/rebuild_phpconf 4 dso none 1

[root@server ~]# /usr/local/cpanel/bin/rebuild_phpconf --current
Available handlers: suphp dso cgi none
PHP4 SAPI: dso
PHP5 SAPI: none
SUEXEC: enabled

[root@server ~]# /usr/local/cpanel/bin/rebuild_phpconf --help
/usr/local/cpanel/bin/rebuild_phpconf [--dryrun] [--no-restart] <version> <php4 handler> <php5 handler> <suexec>
--dryrun : Only display the changes that would be made
--no-restart : Don't restart Apache after updating the php.conf link
--no-htaccess : Don't update user configurable PHP mime mapping.
--current : Show current settings
--available : Show available handlers and PHP SAPIs
<version> : Version of PHP to set as default handler for .php files
<handler> : Type of Apache module to use in serving PHP requests
<suexec> : enabled, disabled, 1 or 0

The Five Types of PHP Configuration That Are Possible:
* None – Don’t provide access to this version of PHP
* DSO – Provide this version of PHP as an Apache DSO (mod_php). This is normally the fastest possible way to serve PHP requests, but PHP will execute as the user “nobody”. If both versions of PHP are available, it is impossible to configure both to be served as DSO unless the concurrent DSO patch was applied at build time.
* SuPHP – Provide this version of PHP through mod_suphp. This is the most flexible way of serving PHP requests and tends to be very secure. PHP scripts are executed by the user who owns the VirtualHost serving the request.
* FCGI – Provide this version of PHP through mod_fcgid. This is a very fast way of serving PHP requests, but php.conf will most likely require additional tuning to perform well. If Suexec is enabled, each user will create their own PHP FastCGI server automatically and PHP scripts will be executed by the user who owns the VirtualHost serving the request. If Suexec is disabled, the “nobody” user will own all of the PHP FastCGI server processes and PHP scripts will be executed by the “nobody” user. FCGI mode is recommended only for advanced administrators who understand how to tune the performance of mod_fcgid. Userdir requests will not function correctly with the basic mod_fcgid setup provided by cPanel.
* CGI – Provide this version of PHP through mod_cgi or mod_cgid. If Suexec is enabled, PHP scripts will be executed by the user who owns the VirtualHost serving the request. If Suexec is disabled, the “nobody” user will execute all PHP scripts. Userdir requests will not function correctly with the basic CGI setup provided by cPanel. It is intended as a fallback when the other preferred methods (DSO or SuPHP) are not available. Serving PHP as CGI is not particularly secure or fast regardless of whether Suexec is enabled.


Reliable Shared Hosting

Magento error: Your web server is configured incorrectly. As a result, configuration files with sensitive information are accessible from the outside. Please contact your hosting provider.


This error is caused by /app/etc/local.xml being readable by the world. Set the permissions on that file to 551.
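A quick sketch of the fix. The demo below runs in a scratch directory so nothing real is touched; in practice you would run the chmod from your Magento root:

```shell
# demo in a scratch dir; in practice: chmod 551 app/etc/local.xml from the Magento root
mkdir -p /tmp/magento-demo/app/etc
touch /tmp/magento-demo/app/etc/local.xml
# 551 = owner r-x, group r-x, other --x, so "other" users can no longer read it
chmod 551 /tmp/magento-demo/app/etc/local.xml
stat -c %a /tmp/magento-demo/app/etc/local.xml   # prints 551
```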

Serve files with no extension as PHP

So I recently wget-mirrored a whole website and uploaded it to my web server. The PHP files from the original server had no extension, and when trying to load them, Apache did not serve them as PHP, instead loading a bunch of gibberish.

To solve this problem, I had to edit .htaccess:

<FilesMatch "^[^.]+$">
ForceType application/x-httpd-php
</FilesMatch>
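The ^[^.]+$ pattern in the FilesMatch line simply means "a filename containing no dot", i.e. no extension. You can sanity-check the regex with grep (the file names below are my own examples):

```shell
# filenames without a dot match the pattern; anything with an extension does not
printf 'index\nabout\nstyle.css\nscript.php\n' | grep -E '^[^.]+$'
# prints:
# index
# about
```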


Other handler directives I tried along the way (kept commented out for reference):

#AddHandler server-parsed .php

#SetHandler application/x-httpd-php

#AddHandler application/x-httpd-php .php

#RewriteEngine On
#RewriteRule ^[^.]+$ - [T=application/x-httpd-php,L]

Offline Browsing in Linux: wget and some tricks

Ever since I joined, I’ve been learning a lot of Linux in the hopes of switching my career to Linux, hopefully something Forensics-related.

So this new dilemma I had was to download a website for offline browsing. I went on the hunt for an offline file browser for Linux and found that I could use wget to mirror a whole website.

For example, say I want to make a copy of a site (example.com stands in for the real URL). Here’s how:

wget -m http://example.com/

Here the -m option is telling wget to mirror the website. This is the basic command. But say I need some advanced options. What do I do?

I was trying to get all the script files off of a website to save for later learning, and all it was downloading was the index.html and robots.txt.
The robots.txt file was blocking the wget user agent. To confirm this I used wget’s debug option:

wget -m -d

You’ll get something like:

Not following because robots.txt forbids it.


Rejecting path sh/eg/ because of rule 'sh'


no-follow in index.html

I tried using the option --user-agent "Mozilla" … no luck.

I tried adding the following in .wgetrc:

## Local settings (for a user to set in his $HOME/.wgetrc). It is
## *highly* undesirable to put these settings in the global file, since
## they are potentially dangerous to “normal” users.
## Even when setting up your own ~/.wgetrc, you should know what you
## are doing before doing so.

header = Accept-Language: en-us,en;q=0.5
header = Accept: text/xml,application/xml,application/xhtml+xml,text/html;q=0.9,text/plain;q=0.8,image/png,*/*;q=0.5
header = Accept-Encoding: gzip,deflate
header = Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7
header = Keep-Alive: 300
user_agent = Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv: Gecko/20070725 Firefox/
referer =

…Still no luck.

The trick is to use the option -e robots=off

So my new command became:

wget -m -k -e robots=off -w 2 --random-wait -U "Mozilla" -np

Here’s what the options do:

-m : mirror the website
-k : convert links so pages reference local relative paths instead of absolute URLs
-e robots=off : execute the command robots=off, telling wget to ignore robots.txt
-w 2 : wait 2 seconds between requests so you don’t overload the server and get IP-blocked
--random-wait : vary the wait time randomly around the -w value so requests look less automated
-U "Mozilla" : set the user agent
-np : no parent; don’t ascend to parent directories, otherwise wget might crawl the whole website

WordPress Hosting Review

Free WordPress Hosting for 3 months with Code: FREE3

They provide 1GB of space and 10GB of bandwidth: reliable shared hosting with an easy one-click WordPress install via Softaculous. They are a Linux-based hosting provider, most probably CentOS. The customer support is 24/7 and excellent. Check them out.

I was wondering, does an Android-based webhosting service exist? I haven’t seen one yet, but it would be interesting if someone started one. That would be like running an AAMP server, right?

keywords: Free WordPress Host, Reliable Linux Webhosting, Shared Hosting Plan, WordPress Hosting Review, WordPress with Fantastico

Automatic website backup without SSH enabled over FTP

Last time I backed up my website with rsync over SSH, but my new host has SSH disabled, and rsync does not work over FTP. I don’t want to do incremental backups with delta files, as rdiff-backup or duplicity do; I need an exact mirror of my site. But remember that your SQL databases won’t be backed up this way.

Curlftpfs is the key! In this tutorial I will show you how to back up from one server to a backup location, which can be your hard drive, another web host, a Dropbox folder, a WebDAV mount, etc.

Install curlftpfs, rsync and ncftp:
sudo apt-get install curlftpfs ncftp rsync

make directories to mount your ftp server:
sudo mkdir /media/hydtechblog
sudo mkdir /media/hydtechbackupserver

edit fstab to mount the ftp servers using curlftpfs:
sudo gedit /etc/fstab

add these lines and modify them according to your server (username, password and hostnames are placeholders):

curlftpfs#username:password@hydtechblog /media/hydtechblog fuse rw,allow_other,uid=root 0 0
curlftpfs#username:password@hydtechbackupserver /media/hydtechbackupserver fuse rw,allow_other,uid=root 0 0

One thing to remember is that these two will not mount automatically: when the computer restarts, fstab is processed before the network connection is up. To fix this we can just add the mount commands to our crontab.

Edit crontab:
sudo crontab -e
enter the following lines and modify accordingly:
00 09 * * * mount /media/hydtechblog
00 09 * * * mount /media/hydtechbackupserver
01 09 * * * rsync -avz --rsync-path=/usr/bin/rsync /media/hydtechblog/public_html /media/hydtechbackupserver/public_html

ctrl + O to write the file and ctrl + X to exit

This will tell cron to mount the folders at 9:00 am and start rsync at 9:01 am. You can point the backup location at another folder on your hard drive, or at your Dropbox or Ubuntu One folder. You can also mount a WebDAV share and use the same method.

For encrypted incremental backups, check out duplicity; it also works with WebDAV and FTP.

Backing up my wordpress blog and website using Ubuntu

I’ve been noticing that my webhost’s server keeps going down for a few hours every day, and it scares me that I’ll lose all my data. So I started looking for automatic backup solutions, and this is the best way I could come up with.

Backing up the wordpress database:

I’ve tried the following plugins:

1. This plugin backs up my data to their server automatically. It only backs up the database, though, and you must register for a free account. The backups occur once every few days.

2. Bei-fen
This plugin backs up my data, including images and files, to a location on my server.

3. DBC Backup
Does a cron backup automatically to any location on my server, at any time interval I set. Only backs up the database.

4. WP-DB-Backup
Can schedule the database to automatically back up to your server, or automatically email the backups to you.

5. SMEStorage Backup
Based on WP-DB-Backup. You must register for a free account, and you can back up your data to a cloud storage service like Amazon S3. You can even have the backups sent to your email.

Backing up my whole website from my webhost to my computer:

I use a tool called rsync on Linux to automatically sync my public_html directory on my webserver to a backup folder on my computer, which is synced automatically with Dropbox. You can also use Ubuntu One. For this tutorial your web host must have SSH enabled. If you can’t get SSH, then back up your WordPress website over FTP with curlftpfs on Linux.

Follow these steps:

1. Sign up for an Ubuntu One or Dropbox account and download/install the desktop client. You can get a 2GB account for free.

2. Install the necessary packages
sudo apt-get install rsync ssh

3. Set up autologin with ssh so you won’t have to enter your password each time.
ssh-keygen -t dsa
press enter each time without changing anything. this will make a key pair in your .ssh folder. (don’t run ssh-keygen with sudo, or the key ends up in root’s .ssh instead of yours)
copy the public key to your server using scp (user@yourserver is a placeholder for your own login and host):
scp /home/user/.ssh/id_dsa.pub user@yourserver:
login to your server using ssh
enter password and append the key to authorized_keys
cat id_dsa.pub >> .ssh/authorized_keys
remove the copied id_dsa.pub from the home directory on your server
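The key-generation step above can be sketched as follows. Note that modern OpenSSH rejects -t dsa, so this demo uses ed25519 instead; it also writes into a scratch directory rather than ~/.ssh, purely so nothing real is touched:

```shell
# generate a passphrase-less key pair in a scratch directory
# (the post uses -t dsa; current OpenSSH disables DSA, so ed25519 is shown)
keydir=$(mktemp -d)
ssh-keygen -t ed25519 -N "" -f "$keydir/id_ed25519" -q
# the .pub half is what gets appended to ~/.ssh/authorized_keys on the server
ls "$keydir"   # id_ed25519  id_ed25519.pub
```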

4. Set up a cron job to sync the public_html folder to your Dropbox folder
crontab -e (do not sudo)
open with an editor like nano
enter something similar to the following line (user@yourserver is a placeholder for your own login and host):
0 */5 * * * rsync -avz --rsync-path=/usr/bin/rsync -e ssh user@yourserver:public_html /media/sdawhatever/locationof/dropbox/backup/
(this tells cron to sync every 5 hours; the leading 0 matters, since a * in the minute field would fire every minute of those hours. for more help with cron, check the wiki)
press ctrl+O to save the file, enter to confirm, ctrl+X to exit.

that’s it, you’re done!