Website Baseline Control Deficiencies

A few days ago, my team completed the audit of our third website in two weeks. The websites belong to organizations in the financial, educational, and public sectors. Some baseline audit findings across the board revealed that:

 

1. The websites did not have SSL certificates installed.

While some say SSL certificates are only needed when a website has a form or transmits sensitive information, Google has made them a security baseline: sites without SSL certificates are flagged as "Not secure" in the Chrome browser, and that label directly damages visitors' perception of the brand.

However, security is not about achieving a 100% secure environment. The goal is to make things difficult for the adversary, and installing an SSL certificate is one sure way of achieving that.
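
To verify this control quickly, here is a minimal Python sketch: it connects to a site over HTTPS, lets the standard library verify the certificate chain, and reports the expiry date. The hostname example.org is a placeholder.

```python
import socket
import ssl
from datetime import datetime

def check_certificate(hostname: str, port: int = 443) -> None:
    # create_default_context() verifies the chain and hostname; a
    # missing or invalid certificate raises ssl.SSLCertVerificationError.
    context = ssl.create_default_context()
    with socket.create_connection((hostname, port), timeout=10) as sock:
        with context.wrap_socket(sock, server_hostname=hostname) as tls:
            cert = tls.getpeercert()
    # 'notAfter' looks like 'Jun  1 12:00:00 2025 GMT'
    expires = datetime.strptime(cert["notAfter"], "%b %d %H:%M:%S %Y %Z")
    days_left = (expires - datetime.utcnow()).days
    print(f"{hostname}: certificate OK, expires {expires:%Y-%m-%d} "
          f"({days_left} days left)")

if __name__ == "__main__":
    check_certificate("example.org")  # placeholder hostname
```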

 

2. There were no tools to monitor controls and intrusion attempts.

If it cannot be measured, it cannot be managed. Just because a website or web application maintains reasonable uptime does not mean everything is fine. Defacement is not the only sign that a website has been breached; advanced persistent threats can stay hidden for months or years, for example to continually steal data as records are updated. Gone are the days when web designers and developers built websites, handed them over to the owners, and everyone forgot about day-to-day administration.

A baseline control would be to implement a web application firewall (WAF). Its exception logs act much like a honeypot, providing valuable insight into who makes intrusion attempts, and how and when; a log-scanning sketch follows at the end of this item. Additional features include protection against denial-of-service (DoS) attacks, SQL injection, and malware, as well as the management of roles and privileges.

If your organization's website is powered by a content management system (WordPress, Joomla, Drupal, etc.), acquiring a web application firewall off the shelf will be cheaper than developing one.

If your website's code was written from scratch, a cost-benefit analysis between acquisition and development is recommended.
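
To illustrate the kind of insight exception logs give, here is a minimal Python sketch; it is not a WAF, and the log path and attack signatures are illustrative assumptions. It scans an Apache/Nginx-style access log and counts suspicious requests per source IP.

```python
import re
from collections import Counter

# Illustrative signatures only; a real WAF ships far richer rule sets.
SIGNATURES = {
    "sql_injection":  re.compile(r"(union\s+select|'\s*or\s+1=1|sleep\()", re.I),
    "path_traversal": re.compile(r"\.\./\.\./", re.I),
    "xss_probe":      re.compile(r"<script|%3Cscript", re.I),
}

def scan_log(path: str) -> Counter:
    hits = Counter()
    with open(path, encoding="utf-8", errors="replace") as log:
        for line in log:
            for name, pattern in SIGNATURES.items():
                if pattern.search(line):
                    # The first field of a combined-format log line
                    # is the client IP address.
                    ip = line.split(" ", 1)[0]
                    hits[(name, ip)] += 1
    return hits

if __name__ == "__main__":
    for (attack, ip), count in scan_log("access.log").most_common(10):
        print(f"{attack:15s} from {ip}: {count} attempts")
```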

 

3. Website backups were not automated, created periodically, or tested.

Having a replica of your website is important for resilience. Security is never guaranteed; hence the need for a test environment where backups can be tested and restored. CMS users can use XAMPP or WAMP to create the test environment locally.

A limitation reported by an auditee during an interview was slow internet bandwidth, which hindered the creation of successful backups. If you experience a related issue and run a content management system (CMS), consider integrating an automated backup plugin. Schedule your cron jobs and that's it; a sketch of the idea follows after this item.

If the site was built from scratch, kindly draw the developer's attention to the backup module. Review the backup process flow and function, then create backups and test them to ensure their functionality and reliability should a disruption occur.
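
As a sketch of the automation described above, the Python script below archives the web root and dumps a MySQL database, assuming a WordPress-style layout. The paths, database name, and the use of credentials from ~/.my.cnf are placeholder assumptions; the script itself would be scheduled with cron.

```python
import subprocess
import tarfile
from datetime import date
from pathlib import Path

WEB_ROOT = Path("/var/www/html")        # assumed site directory
BACKUP_DIR = Path("/var/backups/site")  # assumed backup destination
DB_NAME = "wordpress"                   # assumed database name

def backup_site() -> None:
    BACKUP_DIR.mkdir(parents=True, exist_ok=True)
    stamp = date.today().isoformat()

    # 1. Archive the web root (themes, plugins, uploads, etc.).
    archive = BACKUP_DIR / f"files-{stamp}.tar.gz"
    with tarfile.open(archive, "w:gz") as tar:
        tar.add(WEB_ROOT, arcname=WEB_ROOT.name)

    # 2. Dump the database; mysqldump reads credentials from ~/.my.cnf.
    dump = BACKUP_DIR / f"db-{stamp}.sql"
    with open(dump, "w") as out:
        subprocess.run(["mysqldump", DB_NAME], stdout=out, check=True)

if __name__ == "__main__":
    backup_site()
```

A cron entry such as "0 2 * * 0 /usr/bin/python3 /opt/scripts/site_backup.py" would run it weekly, and restoring one of these archives into the XAMPP/WAMP environment mentioned above doubles as the backup test.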

 

4. File and folder permissions were not adequately set.

Some files and folders were set to 777, meaning that all user groups (owner, group, and everyone else) can read, write, and execute them. This is a serious weakness in a system.

By default, file and folder permissions should be set to 644 and 755 respectively. A file transfer protocol (FTP) client like FileZilla can set these for you, or you can log in to your hosting control panel. A web application firewall, as described in item 2, should also have the capacity to set permissions in bulk.
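
For illustration, the Python sketch below resets an entire web root to that 644/755 baseline. The path is a placeholder assumption; run it with care, as a user that owns the files.

```python
import os
from pathlib import Path

WEB_ROOT = Path("/var/www/html")  # assumed site directory

def reset_permissions(root: Path) -> None:
    for dirpath, dirnames, filenames in os.walk(root):
        os.chmod(dirpath, 0o755)  # directories: rwxr-xr-x
        for name in filenames:
            os.chmod(Path(dirpath, name), 0o644)  # files: rw-r--r--

if __name__ == "__main__":
    reset_permissions(WEB_ROOT)
```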

 

5. Websites were not updated regularly. 

Open-source CMS websites are prone to bugs. Best practice requires that updates are applied as soon as the update manager sends a prompt or notification, but only after reading and understanding what's included in the update.

If a vendor or web developer sends an update notification, read and understand what's included in the update, revert to your test environment, create a replica of the production website, and apply the update there first. On successful completion of the update, review the code for backdoors and other malicious lines; the sketch below shows one way to do this systematically.
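
Here is a minimal Python sketch of such a review: hash every file in the updated site and compare against a trusted replica, flagging anything new or modified. Both paths are placeholder assumptions.

```python
import hashlib
from pathlib import Path

def hash_tree(root: Path) -> dict[str, str]:
    # Map each file's path (relative to the root) to its SHA-256 digest.
    digests = {}
    for path in root.rglob("*"):
        if path.is_file():
            rel = str(path.relative_to(root))
            digests[rel] = hashlib.sha256(path.read_bytes()).hexdigest()
    return digests

def compare(trusted: Path, updated: Path) -> None:
    before, after = hash_tree(trusted), hash_tree(updated)
    for rel in sorted(after):
        if rel not in before:
            print(f"NEW FILE:  {rel}")  # prime candidate for review
        elif before[rel] != after[rel]:
            print(f"MODIFIED:  {rel}")

if __name__ == "__main__":
    compare(Path("/srv/replica"), Path("/var/www/html"))  # assumed paths
```

Any new or modified file that the vendor's update notes do not account for is the first candidate for a manual code review.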

 

Auditors are required to understand and perform most of these tasks themselves, to gain first-hand experience of the state of the website's controls and security rather than relying on the judgement of practitioners. These processes should be thoroughly reviewed by the auditor before digging deeper into other parts of the system.
