Update on Apache NonSSL and Share Servers Solution

Some months ago, I wrote an article about the difficulties I was having hosting an SSL site on a shared provider server, where a single SSL certificate serves multiple domains belonging to many different customers. Visit this LINK for the details.

I coined my own term, the “Jekyll and Hyde SSL Bug,” to describe the issue with these shared servers and the SSL problems that occur on hosted sites. This resulted in a new web site documenting the problem in further detail and calling out the industry and Apache for having created this issue in the first place.

The upshot is that while my first solution works for preventing my site from being visited via other domains’ URLs, it does not stop search-engine indexers from cataloging the content, effectively bypassing my fix. I was still seeing my site’s pages show up in some results and theorized that indexing engines may be able to index content even with the SSL mismatch in place. My original solution had solved MOST of the issues, but not all of them.

The content snapshot below illustrates the continuing issue:

[Screenshot: search results showing the site’s pages indexed under another domain]
Again, to reiterate: my theory is that Google and other search engines may completely ignore SSL mismatch warnings and index the content regardless of the mismatch. That is unfortunate, and it illustrates yet another potential problem with how the web indexes things. I’m not completely sure this is a fact, but I realized that a solution might be in order if I could tell Apache, via .htaccess, to serve an alternate robots.txt file for domains that are not my own! I came up with the following .htaccess code to solve the issue:

# Serve a deny-all robots.txt for requests arriving under any domain other than
# afterburner1, telling the relevant crawlers to stop indexing anything OTHER
# than afterburner1. – Rule installed 3/8/2018 – check again in one quarter for results.
RewriteEngine On
RewriteCond %{HTTP_HOST} !^(www\.)?afterburner1\.com$ [NC]
RewriteRule ^robots\.txt$ noindex-afterburner.txt [L]
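The heart of the rule is the HTTP_HOST pattern, and it is worth anchoring it with a trailing $ so that look-alike hosts such as afterburner1.com.evil.example do not slip through. As a sanity check, the same pattern can be exercised outside Apache; a minimal Python sketch (the hostnames are made-up examples):

```python
import re

# Mirror of the RewriteCond pattern: hosts that do NOT match are served
# the deny-all robots.txt. re.IGNORECASE mirrors Apache's [NC] flag,
# and the $ anchor prevents prefix-only matches.
OWN_HOST = re.compile(r"^(www\.)?afterburner1\.com$", re.IGNORECASE)

def serves_normal_robots(host: str) -> bool:
    """True if this host gets the real robots.txt (i.e. it is our own domain)."""
    return OWN_HOST.match(host) is not None

for host in ("afterburner1.com", "WWW.AFTERBURNER1.COM",
             "other-customer.example", "afterburner1.com.evil.example"):
    print(host, serves_normal_robots(host))
```

Only the first two hosts should print True; everything else falls through to the deny-all file.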

I created the noindex-afterburner.txt file and populated it with the following:

User-agent: *
Disallow: /
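Those two lines tell every crawler to stay away from the entire site. That behavior can be confirmed with Python’s standard-library robots.txt parser, fed the same file contents:

```python
from urllib.robotparser import RobotFileParser

# The exact contents of noindex-afterburner.txt: deny everything to everyone.
rules = "User-agent: *\nDisallow: /\n"

parser = RobotFileParser()
parser.parse(rules.splitlines())

# No URL should be fetchable by any user agent under these rules.
print(parser.can_fetch("Googlebot", "http://example.com/"))             # False
print(parser.can_fetch("*", "http://example.com/any/page.html"))        # False
```

Any well-behaved crawler that receives this file for a foreign hostname should therefore drop the site from its index.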

Voila! I tested the site and all appears well. Now to wait and see how it goes.

I’ll report on how well this works; if the indexing issues disappear over the next few months, I’ll consider this a final success.