Setting Up Network Sites

Typically there are two setup styles: Wayback archive rebuilds and builds on some form of content management system (CMS). The most popular CMS for network site building is WordPress.

A lot of sellers will try to earn your business by offering “diversity signals,” noting they use a CMS like Drupal. Don’t fall victim to this garbage ideology. The same person using this as a selling tactic will usually set the site up as a skeleton-style build because they don’t know how to use the platform.

WordPress continues to grow in popularity and usage, so just use WordPress as your CMS. Codeinwp does a solid breakdown of this for 2018. If you’re worried about WordPress itself being a footprint, consider Hide My WP to mask it.

The Setup Styles

Setting Up with Archive.org or “Wayback Rebuilds”

Chances are you have heard of archive.org. Most people refer to this style as a “Wayback Rebuild.” Essentially, you go to archive.org, pick a period in time to rebuild the site from, wait for the rebuilt site to index, and then insert your link into the HTML.

There aren’t a lot of services that offer this. The ones that are “decent” usually require you to pay $10-20 per scrape/rebuild. I’ll take a hard pass. 

Archive Builds

If you opt for archive.org rebuilds, use this scraper and stop paying for individual downloads:

https://github.com/hartator/wayback-machine-downloader

You will need Ruby, download here: https://www.ruby-lang.org/en/downloads/

Download the WB scraper: https://github.com/hartator/wayback-machine-downloader/archive/master.zip
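
As a side note, if you would rather not manage the zip at all, the project README also documents installing the scraper as a Ruby gem (the steps below assume the zip route):

gem install wayback_machine_downloader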

    1. Once Ruby is installed, open “Start Command Prompt with Ruby”
    2. Next, change directory (CD) to the directory where the script is located. This is the \bin folder inside the extracted download. Your prompt will look something like the example below.
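
For example, assuming you extracted the zip to the root of C:\ (adjust the path to wherever you actually extracted it), the change-directory step is:

cd C:\wayback-machine-downloader-master\bin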

Once you’re in the right directory, navigate to https://archive.org to find your desired timestamp for scraping.

Let’s say your network site was https://www.pcmag.com/ and you wanted their setup from 2009. You would pick the date on the calendar and land with a URL like this: https://web.archive.org/web/20090211134535/https://www.pcmag.com/

The 14-digit string in the URL (20090211134535) is the timestamp. You will need it for the scrape.

Go back to the command prompt window you changed directory in earlier and run the following command:

wayback_machine_downloader https://www.pcmag.com/ -t 20090211134535

It will then detail the number of items to download and start pulling them down. The files land in the \bin directory under a new “websites” folder. Once that’s complete, you simply upload the files to your server under the public_html directory.
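
Two flags from the scraper’s README are worth knowing here: -c sets how many files download concurrently (the default is one at a time), and -d overrides the default websites output folder. For example, to speed up the pcmag scrape and drop it straight into a staging folder:

wayback_machine_downloader https://www.pcmag.com/ -t 20090211134535 -c 5 -d C:\sites\pcmag

The -d path is just an example; point it wherever you stage files before uploading.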

Typically I’ll let this setup “marinate” and index for a week, and then I’ll edit the homepage with my link.

Once everything is indexed, edit the page (index.html) and drop your link somewhere contextually on the page.
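
As a purely illustrative fragment (the copy and target URL are made up), a contextual placement inside an existing paragraph of index.html might look like:

<p>We cover the latest hardware, including the <a href="https://example.com/">best budget laptops</a> on the market.</p>

The goal is to weave the anchor into text that already exists on the page rather than bolting a link onto the end.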

Live Example

The domain being used is: “lightmandalas.co.uk”

(Screenshot: the rebuilt homepage with the inserted bit of text “Hatred from Hatred.io”.)

Taking This Setup a Step Further

The whole point of archive builds is they are quick and easy to throw up and get hosted. You don’t need to pay for setup or even waste time with content. You simply run a tool, upload some files, and edit some HTML. 

This is basically as far as most people go with archive builds. The next bit is just food for thought: ways to improve the setup that most aren’t opting to do. Possibly escape sandboxes before they are even formed?

You hear a lot of people mention hosting when it comes to PBNs. What if we could emulate the old host that was used? Same hosting, and possibly the same range of IPs? 

(Screenshot: hosting history report from HosterStats.com)

The screenshot above is from HosterStats.com, which is probably the best site for finding DNS/hosting history. I actually use this tool to check how many times a site has dropped when filtering domain purchases.

Our archive above was from 2013. In HosterStats we can see the host was “web-hosting.com,” which is now Namecheap. We can also see that the site kept Namecheap as a host from 2013 until August 2016, when the domain was dropped.

(Screenshot: IP history report from Viewdns.info)

Above, from http://viewdns.info/iphistory/, we can see records for 2015 and 2016. We can’t see 2013, when the archive was taken, but we know the site kept the same host from 2013-2016: Namecheap.

So it would be safe to assume we could sign up for Namecheap shared hosting and get our site hosted with them, possibly even requesting an IP from one of the same data centers.
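
Once the rebuilt site is live on the new host, it’s worth confirming the IP actually belongs to the provider you targeted. A minimal check, assuming a unix-style shell with the standard dig and whois tools (the IP below is a placeholder for whatever dig returns for your domain):

dig +short lightmandalas.co.uk
whois 192.0.2.10 | grep -iE "netname|orgname"

If the IP’s WHOIS record names the host you were trying to emulate, you’re at least in the right range.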

Why Even Bother With This?

The idea is that it looks better than ripping an archive and re-uploading it on some new shared host. Truth be told, webmasters come back to re-register domains with the same website quite often. Some even forget that their site dropped. What hosting would they opt for? They would probably work with something they are familiar with, in this case Namecheap.

I’m not saying that simply ripping an archive and re-uploading it to a new host won’t work. I’m saying that everyone does that. Why be like everyone else? The cool kids suck.

Can You Do More?

There are more applications of this, but I’m going to leave them for you to think about. Possibly faking/emulating the previous WHOIS? Same registrar? MX records?

How about dropped domains? I guarantee you have been in a situation where a registration expires before the hosting is suspended. Usually that information is kept for some time before the host trashes it. Social engineering the host to get the same IP/server/etc.?

WordPress Builds

This is fairly straightforward. You are going to host your site and install the WordPress CMS.
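
If you have shell access, a minimal sketch of that install using WP-CLI looks like the following; every credential, URL, and title here is a placeholder to swap for your own:

wp core download
wp config create --dbname=netsite_db --dbuser=netsite_user --dbpass=changeme
wp core install --url=https://example.co.uk --title="Example Site" --admin_user=editor --admin_password=changeme --admin_email=editor@example.co.uk

Most shared hosts also offer one-click WordPress installers, which work just as well for network sites.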

There are a couple of different strategies to use here. The first is to set up full posts on the homepage; the second is to send links from pages. Patrick from serpchampion discusses this in more detail.

Before having these sites set up I make sure they aren’t toxic, mainly because a toxic domain is a waste of money. Once they pass toxicity tests I send them over to my guys to be set up. There are a number of footprints to avoid as well; consider these when making your sites. Typically, unique hosting and unique setups are more than enough.

Rather than try to write a setup guide on how to create a single network site, I’d rather just show you what your end goal should look like. 

What I Decided

At the start of this year, I decided during my toxicity testing across the network that I wanted to revamp every single site. As sites passed, they were sent to my team to receive a makeover.

Hatred’s PBN Builds?

I’ve had several people message me about building them a network. Truthfully, I don’t have time for this. I have my own network and a shared network that I actively build with a buddy. Instead, I decided to make a service, since my team doesn’t constantly have an influx of my own sites to set up, and their builds truly outmatch what other people offer at a very competitive price point.

Samples

I instructed my guys to create 5 sites in 5 different niches. These sites have no history, so no Majestic TTF URLs can be created (they do that too). All I gave them was a niche and a note that I wanted full posts on the homepage (excerpts are fine too).

Truth be told, not a lot of people even know what a PBN or a network site looks like, so I also thought this would be a good chance to demonstrate that.

Finance: http://finance.hatredio.design/

Business: http://business.hatredio.design/

Health: http://health.hatredio.design/

Tech: http://tech.hatredio.design/

Travel: http://travel.hatredio.design/