A Distributed Denial of Service (DDoS) attack is one in which many geographically disparate, compromised hosts attack a single target. The flood of data sent to the target overwhelms one or more system resources (CPU, bandwidth, available connections, etc.), thereby denying service to legitimate users of the targeted system. These attacks can quickly bring a target network or system to its knees, are almost impossible to prevent and are becoming increasingly common.
In line with Net Logistics’ goal of continually improving the service we provide to our customers, we have tested and deployed a comprehensive DDoS protection and mitigation solution.
If Net Logistics detects a DDoS targeting a system within our network, the traffic to the target IP is diverted to our DDoS protection system. The system discards the DDoS traffic, and the “clean” traffic is allowed to continue to the destination server, which remains online despite the attack. This mitigation is typically enabled for 24 hours at a time; if the attack is still ongoing, it is repeated in further 24-hour blocks until the attack is over. Once the DDoS ends, routing is automatically restored to normal.
All of this is done completely transparently and without any action on the part of the customer.
As an interesting aside, shortly after deployment, this system underwent a “trial by fire” whereby a multi-gigabit DDoS was directed at our official cPanel update server. The attack was quickly mitigated without interruption to the service of Net Logistics customers or the target system!
Net Logistics will be listed under ISC, APMG and JAS-ANZ as an ISO 20000 certified organisation.
ISO/IEC 20000 is an international standard that allows organisations to demonstrate excellence and prove best practice in IT service management. The standard allows IT service providers to demonstrate conformance to a service management system that requires them to continually improve their delivery of IT services. It aligns with the IT Infrastructure Library (ITIL) best practice framework.
Further information can be found at the following links:
Even after all this careful planning, occasionally circumstances conspire to thwart these efforts and regardless of the cause, we understand the need to be transparent with our clients when issues do arise. We need to keep them updated with what is happening and what we are doing to rectify the problem. In doing this, we help to retain the confidence of our clients whilst at the same time minimising load on our support staff. This helps to avoid increased response times on phone calls and support tickets unrelated to the service interruption.
Until now, all service announcements were placed in an “Announcements” subforum at https://forum.netlogistics.com.au. In time, it became apparent that a number of issues existed with this approach, with the most obvious being that the forum is hosted within our own infrastructure and could potentially be unavailable to our clients if we ever suffered some kind of large scale service interruption.
To mitigate the issues we identified with our previous system, we’ve implemented a new Service Status website at http://www.netlogistics.info/
This website provides announcements for any service interruptions across Net Logistics’ infrastructure and is completely independent of all Net Logistics systems and so will not be affected by any outages we may be experiencing. We recommend that all Net Logistics clients bookmark this page.
Now to the meat of this post: the updates being performed across our various platforms.
On the billing side of things, some of you may have noticed the rollout of our new system, which at the moment is used primarily for domains. This new system is under active development, and we will soon be rolling out updates to bring existing billing accounts and packages across. Migrating everyone does, however, require a large amount of time and planning to ensure a smooth transition.
On the technical side of things, many of our Kinetic, Momentum and Dedicated cPanel clients will start to see the rollout of the new WHM interface, which has changed rather more drastically than in previous releases. The new interface has been redesigned with usability in mind, particularly for tablet devices; you can read more in the cPanel documentation (http://docs.cpanel.net/twiki/bin/view/AllDocumentation/WHMDocs/WHMUIChanges).
There are also some updates coming to the main Net Logistics website, including an updated service status page that will provide information publicly, rather than via the existing method of forum announcements. We are also looking into mailing list announcements, both for technical notices and for hosting offers to our existing clients.
Now to get back to work and look into more up-and-coming tech.
Disclaimer: no techs were hurt in the production of these team days (much).
The server provided to Net Logistics came with thirty-two 8GB RAM modules and four AMD Opteron 6180 SE processors, each with twelve cores. So the question was: what were we going to do with this beast of a server with 48 CPU cores and 256GB of RAM? As you can imagine, our technical team were just itching to get their hands on it and put it to work! We decided that we would use this server primarily for testing rather than deploying any live systems on it, and that we would test three different layers: hardware, virtualisation and applications.
Net Logistics is primarily an Intel environment, with the vast majority of our servers running on Intel CPUs, so we were keen to see how the AMD CPUs in this system fared. We were obviously interested in their raw performance, but our primary concern was their heat output and how well the server could dissipate the heat produced. Given the density of the server componentry and the minimal space it left for good airflow, we certainly had our doubts.
We employed a variety of benchmarking and stress-testing software across both Linux and Windows, and we maxed out all 48 cores for up to 48 hours at a time. No matter what we did, we could not get CPU temperatures to rise above 61°C, and all other chassis temperature readings were well within acceptable limits. We were pleasantly surprised!
Since this server was built by Dell from the ground up with virtualisation in mind, we wanted to test a variety of virtualisation software to see how it performed. We ended up testing VMware ESXi, Microsoft Hyper-V, Citrix XenServer and KVM (Kernel-based Virtual Machine), the default virtualisation technology in a number of major Linux distributions. The testing we did with Hyper-V on this server played a direct role in our eventual adoption of that technology for our new “Ascend” Windows VPS packages: http://www.netlogistics.com.au/hosting/vps/windows/
For at least the near future, this server will be used for testing operating systems and applications. Once our virtualisation testing was complete, we decided it would run Citrix XenServer for our ongoing in-house testing. It currently runs numerous virtual machines across a variety of operating systems, and we are using it to test many different classes of software and services: hosting control panels, database servers, HTTP servers, high availability, clustering and more.
This ongoing testing will allow Net Logistics to keep abreast of the latest software developments and to assess their potential both to provide our clients with a wider array of services and to improve the services we have already implemented.
Please let me take this opportunity to remind you to use the helpdesk when submitting tickets, and to avoid submitting tickets via email. In the near future we will be upgrading our helpdesk software and disabling email submission. Email notification will remain, so you will still receive replies via email, but you will need to log in to the helpdesk interface to submit a ticket. This removes any confusion as to whether we have received a ticket, and it is also more secure than email, as we force the helpdesk to load over SSL.
When we tried to implement similar field types in two separate forms within the same domain, what we found completely surprised us. Most browsers use the text description within the <form> tags to identify each field. This means that if you have two forms asking for the same type of data by text description, for example a username and password, the browser will attempt to autofill both forms under that domain with data that may have been saved for the other. Most browsers completely ignore the fact that the field name/id differs between the two forms.
Why is this relevant? Say you have a billing system that allows the user to log in and check their invoices. At the same time, you offer that user the option of signing up for a new service, asking them to provide a username and password for the new account. Most browsers will fill those username and password fields with the saved login details, even though the HTML names of the fields differ from those in the login form. Not only is the data irrelevant, it can certainly become a security issue if the multiple forms on a site are designed incorrectly.
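As a minimal illustration (the form actions and field names here are hypothetical), most browsers will offer the credentials saved from the first form when autofilling the second, despite every name and id attribute being different:

```html
<!-- Login form: the browser offers to save the credentials entered here -->
<form action="/login" method="post">
  <input type="text" name="account_username">
  <input type="password" name="account_password">
</form>

<!-- Signup form: despite the different name/id attributes, most browsers
     will still autofill these fields with the saved login credentials -->
<form action="/signup" method="post">
  <input type="text" name="new_service_username" id="new_user">
  <input type="password" name="new_service_password" id="new_pass">
</form>
```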
There is a workaround, though. It is possible to tell the browser not to offer the “remember password” feature on certain forms. To achieve this, simply add the following code to the opening form tag:
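The attribute in question is HTML5’s `autocomplete`; a minimal sketch (the form action and field names are illustrative):

```html
<!-- autocomplete="off" asks the browser not to save or prefill this
     form's fields; it can also be set on individual input elements -->
<form action="/signup" method="post" autocomplete="off">
  <input type="text" name="new_service_username">
  <input type="password" name="new_service_password">
</form>
```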
This is, however, only an official standard as of HTML5, although in our testing most browsers will co-operate regardless of the document type. The same technique can be used to stop the browser from saving sensitive information such as credit card numbers.
On the client end, it is never a good idea to save passwords in the browser anyway. The data is not encrypted and can easily be viewed by anyone with access to the browser. Instead, use a tool such as KeePass (open source) or 1Password (commercial) to store sensitive information such as login details.