We all knew it was coming. On July 14, 2015, Microsoft ends support for Windows Server 2003. That doesn’t mean Windows 2003 will stop running – it means that any security holes, vulnerabilities, or exploits found on the platform will no longer be patched or fixed by Microsoft.
I know that change is always difficult, and no one likes going through the ordeal of moving a site and testing it. At DiscountASP we took a proactive approach. We began planning for this two years ago and we invested a great deal of time and care into helping our customers move to either a Windows 2008 or Windows 2012 server. The vast majority of the migrations were performed by the DiscountASP migration specialists, trained professionals with expertise in moving web site files to a new server.
We had over 60 Windows 2003/IIS 6 servers to retire, which amounted to over 6,000 sites.
Every site migration began with pulling up the site on the browser to ensure it was functioning and pointing to our web servers (and of course everyone was emailed prior to migration to let them know when it would be taking place). Then file migration from one server to another took anywhere from 40 minutes to well over an hour depending on the size and number of files. Once transfer to the new server was complete, we viewed and tested the site on the new server. If any code modifications or account configuration needed to be set, we took care of that, then emailed the site owner letting them know that their migration was complete.
Our migration specialists took the extra step of modifying account settings, application settings and connection strings for older .NET 1.1 applications that displayed an error after migration. Because of that kind of attention to detail, the vast majority of our members weren’t even aware of the actual move, and the entire process was transparent to them.
For those who took advantage of our temporary IIS 7 and IIS 8 testing platforms, we worked closely with them to move a copy of their site to the testing environment where they were able to fully test the compatibility of their web applications. Any code modifications, server or account settings that were needed were done in the test environment. When testing was completed we manually moved the site to the new platform. This eliminated any possible application disruptions that could have occurred during the migration to the new platform.
This was all done manually and free of charge, one site at a time. When everything was said and done, it took over 6,000 hours – or a full 250 days – to migrate the sites and finally retire the Windows 2003 servers. And finally, a statistic we’re very proud of: more than 99% of our customers did not experience any disruption of their websites.
DiscountASP.NET customers now have the peace of mind that their sites and web applications are being hosted on safer servers, and they also have greater server resources available to their busy sites.
Joomla is a Content Management System (CMS) that can be installed on a web site. It isn’t part of your hosting account by default, so if you haven’t installed it, you are not at risk for this particular security issue.
Joomla already has a filtering mechanism to prevent files with certain extensions from being uploaded through the application. However, the bug allows files with a trailing dot ( . ) to bypass the filter mechanism. Therefore, someone can create a file with a .php extension and include a period at the end, and the malicious file will be uploaded, after which the server will process the file as a PHP application. For example, normally Joomla will not allow a file called somefile.php to be uploaded. However, somefile.php. with a trailing dot, can be uploaded to the affected versions.
Versafe, an online fraud protection company, discovered this serious exploit. If your Joomla application becomes compromised, it can be turned into a phishing site that redirects visitors to a malicious page designed to steal personal information, or into a repository for malware and Trojan programs that infect anyone who calls on that page.
To solve this problem you should upgrade to the latest version of Joomla. If upgrading is not an option for you, you can add a line of code that strips the trailing dot ( . ) from the file name before the upload begins, so the upload cannot bypass the media management filtering mechanism.
The file you will need to modify is file.php, located under Libraries/Joomla/FileSystem. Within the function makeSafe, add the line:
// Remove any trailing dots, as those aren’t ever valid file names.
$file = rtrim($file, '.');
If this line already exists in file.php, then the exploit has been closed and your Joomla application should be immune to this security hole.
What DiscountASP is doing
On our end, as of today we have updated our Web App Gallery to the latest version of Joomla that already has this patch. Therefore, if you downloaded and installed your Joomla application through our Web App Gallery today, your Joomla application should be protected from this exploit.
However, this security hole is considered a “zero-day exploit,” which means it was being used in the wild before the vendor of the application could react, leaving millions of Joomla applications susceptible to the security threat. If your site runs an affected version of Joomla, the chances that your web application is vulnerable to this threat are high and you should take immediate action.
I also encourage you to read these articles for more details on the Joomla exploit.
When people want to start incorporating e-commerce activities in their site, they must be PCI compliant to do so. There are many different companies/organizations out there that can help you determine if your web site is PCI compliant.
Many criteria must be met to be PCI compliant. One of those criteria is to set up a custom error page. Custom error pages are important because they hide the true errors your application may display. Raw error messages can be used to dissect and infer the back-end design and structure of your web application – information that can be used to find weaknesses and exploit it.
In IIS, there are two types of custom error handling that you can – and will need to – set on your site. One is the traditional IIS custom error, which is processed at the IIS level. The other is ASP.Net custom error handling, which is processed at the ASP.Net application level. ASP.Net custom error handling is a bit tricky because of how a request passes through the handlers; where in that pipeline the request bombs out ultimately determines whether your ASP.Net custom error is triggered.
In most cases setting up your ASP.Net custom error handling is straightforward. You simply add the “customErrors” element to your application’s web.config file. The key attribute is the “mode” attribute within the “customErrors” element. You have three distinct choices: “On”, “Off”, or “RemoteOnly”. You want to set it to either “On” or “RemoteOnly”.
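As a sketch, a web.config with custom errors enabled might look like this (the page names are placeholders for whatever error pages exist in your own application):

```xml
<configuration>
  <system.web>
    <!-- RemoteOnly shows detailed errors only on the local machine;
         remote visitors are sent to the defaultRedirect page instead -->
    <customErrors mode="RemoteOnly" defaultRedirect="~/Error.aspx">
      <error statusCode="404" redirect="~/NotFound.aspx" />
    </customErrors>
  </system.web>
</configuration>
```

“RemoteOnly” is handy during development because you still see the full error details when browsing from the server itself.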
Now here’s a caveat to being PCI compliant: some services will scan your site and test specific conditions to determine whether you have custom error handling enabled. One of the conditions they may throw at your application is passing a value/URL to your site that is similar to:
which will not display your custom error page. However, if you call a file that you know does not exist on your site:
you will find that your custom error page is displayed.
Confusing? Take a short breather and let me quickly explain.
Your ASP.Net custom error handling does work, and the way your site reacts when someone calls it with the condition of “/[fakefile]?aspxerrorpath=/” is actually by design from Microsoft. This condition in the URL request stops processing before it reaches the custom error handler, so the request bombs out before your custom error page can be displayed.
So what’s the solution? The only workaround we could find is to set up a RequestFiltering rule that filters the string “/[fakefile]?aspxerrorpath=/” out of the URLs being requested from your site.
In your web.config file, add this code to your RequestFiltering element:
<requestFiltering>
  <denyQueryStringSequences>
    <add sequence="aspxerrorpath=" />
  </denyQueryStringSequences>
</requestFiltering>
Another way to implement this rule is through IIS 7 or IIS 8 Manager: open the RequestFiltering module and set up the rule under the Query Strings tab, with the query string “aspxerrorpath=” set to Deny.
This will block the URL string “/[fakefile]?aspxerrorpath=/”, thereby forcing the request to call the file directly.
We have seen an influx of attacks against WordPress sites. The attack is an old method called a brute force attack. The main targets are WordPress sites that still use the default administrative login “Admin.” With half of the credentials already known, the attacker repeatedly tries passwords until they finally find the right one.
This lapse in security has been well known in the WordPress community. Tony Perez has asked why WordPress itself has not offered stronger password restrictions and required that the Admin login be changed; the response he and the WordPress community received was that “it’s just not a relevant issue.”
The fix for this is fairly simple. First, change the administrative credential from the default “Admin” user name to something more personal. Second, update the password to something more sophisticated. It is recommended that you use a minimum length of 8 characters, including letters, numbers, and special characters such as “#”, “$”, or “%”. Mixing lower case and upper case characters in your password will also help strengthen it.
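Purely as an illustration (this is not WordPress code, and the function name is made up for this sketch), the criteria above can be expressed as a small check:

```python
import re

def is_strong_password(password: str) -> bool:
    """Check the guidelines above: at least 8 characters, with
    lowercase, uppercase, a digit, and a special character."""
    return (
        len(password) >= 8
        and re.search(r"[a-z]", password) is not None
        and re.search(r"[A-Z]", password) is not None
        and re.search(r"\d", password) is not None
        and re.search(r"[#$%!@&*]", password) is not None
    )

print(is_strong_password("admin123"))     # weak: no uppercase or special character
print(is_strong_password("Bl0g#Keeper"))  # meets all of the criteria
```

A password manager that generates long random passwords satisfies these rules automatically and is easier than inventing them yourself.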
The exploit has had a substantial impact on web hosting companies like DiscountASP.NET. When a personal computer is compromised, the bandwidth available to that computer is limited, but with a web hosting company the bandwidth is nearly unlimited. When a WordPress site is compromised, the hacker then uses that site to send out attacks on other servers and hosting companies.
With that nearly unlimited bandwidth at their disposal, the effects can be devastating. The owner of the account is affected as well: with high bandwidth consumption, they may be charged extra for the bandwidth their compromised WordPress site uses.
Another security measure that can be employed to mitigate this attack is WordPress two-step authentication. This is an optional new feature you can enable for your WordPress site, and it uses the Google Authenticator app.
It adds a second verification input on top of the password: a randomly generated code from the Google Authenticator app. The verification code changes every 30 seconds, making it extremely difficult to guess. You may want to read more about this new security feature on this WordPress link.
Make no mistake, WordPress themselves are taking this attack seriously, and its effects have been widespread among many hosting companies.
If you want to find out more about this widespread attack against WordPress sites, here are a couple of links that you might find helpful:
Coincidentally, this attack targets not only WordPress but Joomla web applications as well. I did not research any Joomla attacks, but if you have a Joomla site and you are using its default administrative login “Admin”, you may want to change the login name and give it a more complex password, just in case.
There are two ways to do this. The easiest way is through IIS 7 Manager. You can use this Knowledge Base article for details on how to connect to our web server with IIS 7 Manager.
Once connected, go to the “Handler Mappings” module and click on “Add Module Mapping.”
In the dialog, set the request path to *.js and the module to iisnode. Now you have your application set to push all *.js files to Node.js.
You do not have to use IIS 7 Manager to create this mapping; you can code it directly in your application’s web.config file. Look for the “system.webServer” element within the configuration of the web.config file and add the handler line:
<configuration>
  <system.webServer>
    <handlers>
      <add name="iisnode" path="*.js" verb="*" modules="iisnode" />
    </handlers>
  </system.webServer>
</configuration>
The catch with a wildcard mapping is that every .js file in the site – including ordinary client-side scripts – will be handed to Node.js. There are a couple of workarounds to this:
1. Upload the .js file you want to be processed by node.js to its own subfolder. Then add the web.config setting to that folder.
2. You can specify which .js file gets processed by the node.js engine. In the path field, rather than using a wildcard, you can set: path="somefile.js".
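For example, to send only a single file to the iisnode module while leaving every other .js file to be served as ordinary static content, the handler entry would look like this (server.js here is a hypothetical name – use your own application’s entry file):

```xml
<configuration>
  <system.webServer>
    <handlers>
      <!-- Only server.js is processed by Node.js; all other *.js
           files are still downloadable as static client-side scripts -->
      <add name="iisnode" path="server.js" verb="*" modules="iisnode" />
    </handlers>
  </system.webServer>
</configuration>
```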
It is often said that if you do not want your information to be stolen, don’t put it on the Internet. However, the Internet has become an integral part of our lives, and we can’t help but post some kind of web site, blog, or forum. Even if you don’t tell anyone about your web site, once it is published it will eventually be discovered.
How, you ask? By robot indexing programs, A.K.A. bots, crawlers and spiders. These little programs swarm out onto the Internet looking up every web site, caching and logging web site information in their databases. Often created by search engines to help index pages, they roam the Internet freely crawling all web sites all the time.
Normally this is an acceptable part of the Internet, but some search engines are so aggressive that they can increase bandwidth consumption. And some bots are malicious, stealing photos from web sites or harvesting email addresses so that they can be spammed. The simplest way to block these bots is to create a simple robots.txt file that contains instructions to block the bots:
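A minimal robots.txt that asks every crawler to stay out of the entire site looks like this:

```
User-agent: *
Disallow: /
```

The first line matches all bots, and the second tells them not to crawl anything under the site root.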
However, there is a fundamental problem with this approach: robots.txt is purely advisory. A bot can still hit the site, ignoring your robots.txt file and your wish not to be indexed.
But there is good news. If you are on an IIS 7 server, you have another alternative: the RequestFiltering rule that is built into IIS 7. It is enforced by the web server itself, so it cannot be bypassed by a bot.
The setup is fairly simple, and the easiest and fastest way to create your RequestFiltering rule is to code it in your application’s web.config file. The RequestFiltering element goes inside the <system.webServer><security> elements. If these do not exist in your application’s web.config file, you should be able to create them. Once that is done, add this schema to set up your RequestFiltering rule:
<requestFiltering>
  <filteringRules>
    <filteringRule name="BlockSearchEngines" scanUrl="false" scanQueryString="false">
      <scanHeaders>
        <clear />
        <add requestHeader="User-Agent" />
      </scanHeaders>
      <appliesTo>
        <clear />
      </appliesTo>
      <denyStrings>
        <clear />
        <add string="YandexBot" />
      </denyStrings>
    </filteringRule>
  </filteringRules>
</requestFiltering>
<authentication>
  <basicAuthentication enabled="true" />
  <anonymousAuthentication enabled="true" />
</authentication>
You can name the filtering rule whatever you’d like and in the “requestHeader” element you will need to make sure you define “User-Agent.” Within the “add string” element you’ll need to specify the User Agent name. In this example I set it to YandexBot which blocks a search engine originating from Russia. You can also block search engines such as Googlebot or Bingbot.
If you want to see whether this rule is actually blocking these bots, you will need to download your raw HTTP logs from the server and parse them, looking at the User-Agent field. In the sc-status (status code) field you should see a 404 HTTP response, and the sc-substatus field will carry a substatus code that qualifies the primary HTTP response code.
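As a sketch of that log check, here is one way to scan a W3C extended log for denied requests. The field names are read from the #Fields header line, so this adapts to however your log happens to be laid out; the sample log lines below are invented for illustration:

```python
def find_blocked_requests(log_text: str, user_agent: str = "YandexBot"):
    """Return (user-agent, status, substatus) tuples for log entries
    whose User-Agent matches and whose status code is 404."""
    fields = []
    hits = []
    for line in log_text.splitlines():
        if line.startswith("#Fields:"):
            fields = line.split()[1:]  # field names come from the header
            continue
        if line.startswith("#") or not line.strip():
            continue  # skip other comments and blank lines
        row = dict(zip(fields, line.split()))
        if user_agent in row.get("cs(User-Agent)", "") and row.get("sc-status") == "404":
            hits.append((row["cs(User-Agent)"], row["sc-status"], row.get("sc-substatus")))
    return hits

# IIS replaces spaces in the User-Agent with '+', so each entry is one token
sample = """#Fields: date time cs-uri-stem cs(User-Agent) sc-status sc-substatus
2013-05-01 10:00:01 /index.html Mozilla/5.0+(compatible;+YandexBot/3.0) 404 19
2013-05-01 10:00:05 /index.html Mozilla/5.0+(Windows+NT+6.1) 200 0
"""
print(find_blocked_requests(sample))
```

In this sample the blocked request shows substatus 19, which IIS uses for requests denied by a filtering rule.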
Here is a list of potential substatus codes you may see when you impose your RequestFiltering rule.
I would like to introduce you to our staff members so you can see who is on the other side of the support tickets and forum posts. Today we have a support staff member who has been with us for almost a year now, Ray.
My name is Ray, and I am a member of the DiscountASP.NET technical support staff. I was promised booty, bounty, and a pony to work here so I accepted the offer. 😉
But on a more serious note, I graduated from UCLA with a Bachelor’s Degree and worked as a systems/network administrator for the UCLA Fowler Museum for over 3 years before I decided to pursue some other interests of mine. I hold a number of technical certifications from both UCLA and Microsoft and have experience working with different technologies including Microsoft, Novell, and Unix/Linux. I also have some experience in designing websites and programming in languages such as Pascal, C++, C#, Ruby, and Unix/Linux shell scripting.
My former hobbies included watching anime, attending conventions, and gaming (please don’t mention MMORPG). My new hobbies include pumping iron at the gym and playing the guitar (acoustic).
My technical interests are Microsoft SQL Server, ASP.NET, and virtualization, and I hope to use all my skills and experience to provide the best technical support to our customers here at DiscountASP.NET.
My name is Tonny, with two n’s. I’m the newest member of the DiscountASP.NET tech support team.
I have an Associate Degree in computer science from Santa Monica College, and I have worked as an associate on countless projects with a friend who is a senior software engineer.
I have a background in Web design and programming, experience in E-commerce, and am interested in computer graphics and iPhone mods. I like playing PC games, listening to music, and cooking. I enjoy good food, good music, and of course a good companion. I’m very happy to contribute my experience and knowledge to help our customers have the best web hosting experience.