pfSense DNS Configuration

I have been running a home lab for quite some time now. As a software engineer, I wanted self-hosted, state-of-the-art tools both for continuous learning and for my own personal side projects. One thing I lacked, however, was a sufficient security posture.

To solve this problem, I chose pfSense, an open source security platform that provides a firewall, intrusion detection, intrusion prevention, and many other features. Its setup is relatively straightforward, but I did have some issues to work through regarding DNS.

I chose to run pfSense on a small form factor fanless appliance. Installing the software required writing the pfSense image to a USB drive and then following the on-screen setup. I configured the Wide Area Network (WAN) and Local Area Network (LAN) interfaces as needed and then replaced my old router with the appliance.

I have other DNS servers running on my network: an ad-blocking DNS and a DNS server running in my Windows Server instance. The idea was to have the pfSense DNS Resolver use the ad blocker as its upstream server, the Windows instance as the upstream for the ad blocker, and finally Google’s public DNS as the upstream for the Windows instance.

This configuration simply did not work. I continuously received timeouts when querying sites I knew were available. At first, I thought the appliance itself might be at fault, because replacing it with my original router made everything work again.

Ad Blocker Dashboard

As you can see from the image above, I was able to get everything working the way I wanted, but I did have to make some minor changes. I reversed the order of my DNS servers. In the DHCP settings of my appliance, I set the ad-blocking DNS as the root DNS for my network. The ad blocker uses the Windows server as its upstream, the Windows server uses pfSense as its upstream, and finally the appliance uses Google’s public DNS as its upstream. In this configuration, as client devices renew their DHCP leases they pick up the ad blocker as their DNS server and queries are routed properly. This gives me the DNS protection I want while still allowing proper queries across the network.

Piwigo an Open Source Photo Gallery

From their website, piwigo.org: Piwigo is open source photo gallery software for the web, designed for organizations, teams, and individuals. This article details the installation process for Piwigo on CentOS 7.

1. Install the LAMP Stack and Dependencies

The Linux, Apache, MySQL/MariaDB, and PHP (LAMP) stack is a basic stack for serving PHP-based web applications. On CentOS, the database is MariaDB. For our installation, we want the latest version of PHP, so we will install the EPEL repository, add the Remi repository for CentOS 7, and disable any PHP versions earlier than 7.

# yum install epel-release
# yum install http://rpms.remirepo.net/enterprise/remi-release-7.rpm
# ls -l /etc/yum.repos.d/
# vi /etc/yum.repos.d/remi-php54.repo (enabled=0)
# vi /etc/yum.repos.d/remi-php72.repo (enabled=1)
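If you would rather not edit the .repo files by hand, the yum-utils package provides yum-config-manager, which can toggle repositories from the command line. This is an equivalent alternative to the vi edits above, not an extra required step:

```shell
# Equivalent alternative to editing the .repo files by hand
yum install -y yum-utils
yum-config-manager --disable remi-php54
yum-config-manager --enable remi-php72
```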

The following command will install everything necessary for Piwigo including the LAMP stack and other dependencies.

# yum install httpd mariadb mariadb-server php php-mysqli php-gd php-fpm php-devel php-pear gcc ImageMagick ImageMagick-devel unzip wget rsync pwgen curl

2. Enable Apache and MariaDB

After installing the dependencies, you need to start and enable Apache as well as MariaDB.

# systemctl start httpd mariadb
# systemctl enable httpd mariadb

3. Create Piwigo Database and Database User

Next, configure the root password for MariaDB and create the database and user for Piwigo. In the commands below, replace <password> with a password of your choice.

# mysql_secure_installation
# mysql -u root -p
> CREATE DATABASE piwigo_db;
> CREATE USER 'piwigo_user'@'localhost' IDENTIFIED BY '<password>';
> GRANT ALL PRIVILEGES ON piwigo_db.* TO 'piwigo_user'@'localhost';
> FLUSH PRIVILEGES;
> EXIT;

4. Install Piwigo

Now we can obtain the latest version of Piwigo using curl and then move it into the Apache web folder.

# curl http://piwigo.org/download/dlcounter.php?code=latest -o piwigo.zip
# unzip piwigo.zip
# mv piwigo /var/www/html
# chown -R apache: /var/www/html/

5. Make Changes for SELinux

You can handle SELinux in one of two ways. You could simply disable SELinux, but a better approach is to set the SELinux context on the files in the Piwigo directory so Apache can write to them.

# cd /var/www/html/piwigo/
# chcon -vR --type=httpd_sys_rw_content_t .
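One caveat: chcon changes do not survive a filesystem relabel. If you want the context to persist, semanage (from the policycoreutils-python package on CentOS 7) can record the rule permanently; a sketch of that approach:

```shell
# Persistent alternative to chcon: record the context rule, then apply it
yum install -y policycoreutils-python
semanage fcontext -a -t httpd_sys_rw_content_t "/var/www/html/piwigo(/.*)?"
restorecon -Rv /var/www/html/piwigo
```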

6. Completing the Web Installation

After all of this, simply navigate to the root of your web installation (http://ip-address), enter the credentials for your database, and click Start installation.

7. Modify Time Zone

Issues may arise with the timezone if it is not set in the php.ini file. To correct them, set the date.timezone parameter in the file to the appropriate value.

# vi /etc/php.ini
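For example, the relevant line in php.ini might look like the following; the timezone identifier shown is only an example, so use the one appropriate for your server:

```ini
; /etc/php.ini -- example value; use your own timezone identifier
date.timezone = "America/New_York"
```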

In this article, I covered the basics of installing and configuring the open source image gallery Piwigo. The installation here covers an instance without SSL. In a future article, I will discuss adding SSL using a reverse proxy and the changes necessary to keep Piwigo operating properly.

OpenVAS on CentOS 7

What Is OpenVAS

From the OpenVAS website, “OpenVAS is a framework of several services and tools offering a comprehensive and powerful vulnerability scanning and vulnerability management solution. The framework is part of Greenbone Networks’ commercial vulnerability management solution from which developments are contributed to the Open Source community since 2009.” I personally want to use this tool to perform a vulnerability test of my personal network. As a software engineer, I have built a relatively extensive home network, and because security is paramount these days, I felt I needed verification that my network is as secure as possible.

For testing my network, the idea is to install the OpenVAS vulnerability scanner on an external network and then test my network by scanning a site hosted within it. The first step is installing the scanner, which I did on a CentOS 7 virtual machine (VM) hosted in the Linode cloud. CentOS was the chosen flavor of Linux due to its closeness to Red Hat Enterprise Linux (RHEL); since RHEL is used in many corporate environments, I wanted to install the scanner on a similar operating system.

Installation and Verification of OpenVAS

For the purposes of this test, the VM was not secured and only has a root account. A minimal installation of CentOS was performed by deploying the CentOS 7 image within Linode. Once the operating system was installed, the following steps were carried out to install the OpenVAS vulnerability scanner.

1) yum update -y
2) yum install wget -y
3) wget -q -O - http://www.atomicorp.com/installers/atomic |sh
4) wget http://dl.fedoraproject.org/pub/epel/7/x86_64/e/epel-release-7-10.noarch.rpm
5) rpm -ivh epel-release-7-10.noarch.rpm
6) yum install xalan-c
7) yum install openvas
8) yum install bzip2
9) openvas-setup
10) Accept default rsync
11) Enter username and password
12) echo "unixsocket /tmp/redis.sock" >> /etc/redis.conf
13) sed -i 's/enforcing/disabled/g' /etc/selinux/config
14) systemctl enable redis.service
15) shutdown -r now
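If the web interface does not come up after the reboot, checking the relevant services is a reasonable first step. The service names below are from the Atomicorp packaging of this era and may differ in other versions:

```shell
# Confirm the OpenVAS services and the GSA web UI are running
systemctl status openvas-scanner openvas-manager gsad
# Confirm something is listening on the GSA port
ss -tlnp | grep 9392
```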

Once the installation is complete and the VM has rebooted, you can verify the installation by accessing the following URL:

https://<IP Address of VM>:9392

The certificate is not trusted, so you will have to create an exception in your browser, and you will log in with the username and password created during installation. The next step is verifying the installation against a known vulnerable website. The article 15 Vulnerable Sites To (Legally) Practice Your Hacking Skills lists various sites with known vulnerabilities on which you can practice your hacking skills. I chose the site named Hack This Site (www.hackthissite.org) as the scanning target. The task wizard was used to start the scan, using the fully qualified domain name (FQDN) as the scanning target.

Initially, I thought the scanner might have had some issues because the status seemed stuck at 1%, and after some web searches this appeared to be a problem others had asked about. However, I found that I just needed to give the scanner more time to increase the percent complete. The scan of the test site took 7 hours, 51 minutes, and 12 seconds to complete, and a total of 43 vulnerabilities were identified.

Results by Severity for www.hackthissite.org

Scanning A Personal Target

Now that I knew the scanner was able to scan a target, I decided to point it at a domain I maintain. One of the software development tools I use is the Redmine issue tracker, located at redmine.theparhams.net. So, again, I used the wizard and attempted a scan using the FQDN of my site. This time, the scan took only 21 seconds to complete; however, no vulnerabilities were reported, and OpenVAS reported that my site may be dead.

Scan Report of My Tracking Site

I did some research into this finding, where OpenVAS reports the site may be dead, and found other users reporting the same thing. Because the site is being scanned from an outside network, I am going to assume the scanner couldn’t reach it. Given the techniques involved in scanning the target, my ISP may be blocking the scanning altogether. If so, this is a good thing, because adversarial scanning would also be blocked. For now, I am going to assume the site is relatively secure, but additional steps should be taken to verify this.

Scan Report Showing Findings and Severity

Additional Steps for Security and Sanity

Because the OpenVAS utility is new to me, I want to take additional steps to ensure accurate results. In my research on OpenVAS, I found other users reporting dead sites who said these problems were resolved with OpenVAS 9. It was not until I found these posts that I realized I was using an older version: I had installed OpenVAS 7 on my external network. That being said, the next steps will be performing a version 9 installation, verifying its functionality against the same two sites, and comparing the results.

I hope you have enjoyed reading this post and I look forward to any comments you may provide. If you have had some of these same experiences please let me know. If you have found this content helpful, again, please let me know. I welcome any and all feedback!

Using Non-Standard SSH Port With Gitolite

If you are running a development network from home, you are more than likely using some sort of version control. Git has become very popular in recent years, and Gitolite along with Git is what I prefer.

The standard port for SSH is port 22. However, in a home network with multiple VMs it may become necessary to use a different external port. Through port forwarding, it becomes possible to access your home version control from the outside world.

First, choose the port you want to use for external access. In your router, set up a forwarding rule which forwards the desired port to the appropriate VM. In the VM you will also need to set up a forwarding rule; for CentOS, this is done with the firewall-cmd command. Once port forwarding is set up in the firewall of the VM, you will need to create an SSH config file on the machine(s) accessing the repositories on the non-standard port.
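As a sketch of the VM-side rule, assuming external port 1234 should reach the local sshd on port 22 (the port number is only an example):

```shell
# Run as root on the CentOS VM; 1234 is an example port
firewall-cmd --permanent --add-port=1234/tcp
firewall-cmd --permanent --add-forward-port=port=1234:proto=tcp:toport=22
firewall-cmd --reload
```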

In your ~/.ssh directory, create a file named config. Here you place information for the host you wish to connect to; essentially, you are creating an alias. The following is an example of an alias in a config file.

Host gitolite
 User git
 HostName somedomain.com
 Port 1234

After creating the config file, you can access your home network repositories using the aliased host name. For example, if you previously used git@localhost:reponame, it would have connected using port 22. Now you can use git@gitolite:reponame, and it will connect using the HostName and Port from the alias. If everything else is set up correctly, port forwarding will connect everything properly and you will have outside access to your home repositories.
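Putting it together, a clone through the alias looks like this (somedomain.com, port 1234, and reponame are the example values from the config above):

```shell
# Uses Host "gitolite" from ~/.ssh/config (User, HostName, Port resolved there)
git clone git@gitolite:reponame
# Equivalent without the alias, specifying the port explicitly
git clone ssh://git@somedomain.com:1234/reponame
```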