Web pages in separate repo

02-15-2015, 10:25 AM
What is that robots.txt for, exactly?
02-15-2015, 07:51 PM
The robots.txt file bans robots from sniffing around the website.
02-15-2015, 09:20 PM
And why would we want to ban robots?
02-15-2015, 09:22 PM
Because bad robots use a huge amount of bandwidth.
02-15-2015, 10:13 PM
They use bandwidth in order to bring more people to us by showing us in their search results. I don't see anything bad about that; the bandwidth usage is minimal, especially for our single-page site that has almost no links. Besides, the bandwidth is provided by GitHub, so they'd be the ones to worry, not us.
02-15-2015, 11:47 PM
He's not talking about bots like the Googlebot, which doesn't generate that much traffic. He's talking about bad bots that scan the page so heavily that they slow it down.
02-15-2015, 11:53 PM
Oh yeah, and a text file is gonna stop those bots, yeah, right
02-16-2015, 02:17 AM
(This post was last modified: 02-16-2015, 02:18 AM by sphinxc0re.)
Yes... it's like the 5-second rule, where all the bacteria are held up for 5 seconds by the chief bacterium saying: "Burrgrgr.. we have to ... burggrg ... wait 5 ....burggrg ... seconds .. bugrgr..." before they start trying to infect what you just dropped on the floor
I just researched a bit. In fact, GitHub has its own robots.txt for the whole site, plus other mechanisms for rejecting bots.
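For reference, a robots.txt is just a plain-text file of hints that well-behaved crawlers voluntarily follow; it doesn't actually block anything. A minimal sketch (the path here is made up for illustration):

```
# Ask all crawlers to skip one directory but allow everything else.
# Malicious bots simply ignore this file.
User-agent: *
Disallow: /builds/
```

That's also why it's no defense against bad bots, as pointed out above: only cooperative crawlers like the Googlebot respect it.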
I have moved the pages to a separate repo: https://github.com/mc-server/mc-server.github.io
The change didn't even need to involve FakeTruth setting anything up for the domain; it just worked out of the box.
03-04-2015, 07:05 AM
I'm wondering if mc-server.org supports PHP. If so, we might be able to make it show the actual MC version of the build instead of manually changing it. It's not a big deal, but it would avoid confusion if we forgot to update the site after a new Minecraft release.
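Since the site now lives on GitHub Pages, which only serves static files, server-side PHP wouldn't run there. One alternative would be to have the build publish the version as a small static JSON file and read it client-side. A minimal sketch, where the `version.json` file name and the `minecraft_version` field are assumptions, not an existing endpoint:

```typescript
// Hypothetical: the build could write a version.json like
//   {"minecraft_version": "1.8.3"}
// next to the static pages. The field name is an assumption.
interface BuildInfo {
  minecraft_version?: string;
}

// Pull the version out of the parsed JSON, with a safe fallback
// so a missing or malformed file doesn't break the page.
function extractVersion(info: BuildInfo): string {
  return info.minecraft_version ?? "unknown";
}

// On the page, something like fetch("version.json") would retrieve
// the file, and extractVersion() would fill in the displayed text.
```

This keeps the site fully static while still letting the displayed version update automatically with each build.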