Web pages in separate repo
#11
What is that robots.txt for, exactly?
#12
The robots.txt bans robots from sniffing around the website.
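For illustration, a minimal robots.txt that asks every compliant crawler to stay out of the whole site would look like this (just a sketch, not necessarily what our actual file says):

    User-agent: *
    Disallow: /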
#13
And why would we want to ban robots?
#14
Because bad robots use a huge amount of bandwidth.
#15
They use bandwidth in order to bring more people to us by showing us in their search results. I don't see anything bad about that; the bandwidth usage is minimal, especially with our single-page site that has almost no links. Additionally, the bandwidth is provided by GitHub, so they should be the ones to worry, not us.
#16
He's not talking about bots like the Google bot, which doesn't generate that much traffic. He's talking about bad bots that scan the webpage so aggressively that they make it slow.
#17
Oh yeah, and a text file is gonna stop those bots, yeah, right :P
#18
Yes... it's like the 5-second rule, where all the bacteria are held up for 5 seconds by the major bacteria saying: "Burrgrgr.. we have to ... burggrg ... wait 5 ....burggrg ... seconds .. bugrgr..." before they start trying to infect what you just dropped on the floor :D

I just researched a bit. In fact, GitHub has its own robots.txt for the whole site, plus other mechanisms for rejecting bots.
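You can fetch it yourself if you're curious, e.g.:

    curl https://github.com/robots.txt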
#19
I have moved the pages to a separate repo: https://github.com/mc-server/mc-server.github.io

The change didn't even need to involve FakeTruth setting anything up for the domain; it just worked out of the box :)
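For anyone curious, such a move can be done roughly like this (a sketch only; the source repo and branch names here are my assumptions, relying on the fact that user/org pages are served from the master branch of a <user>.github.io repo):

    git clone https://github.com/mc-server/MCServer.git pages
    cd pages
    git checkout gh-pages        # branch assumed to hold the old project pages
    git remote add pages https://github.com/mc-server/mc-server.github.io.git
    git push pages gh-pages:master   # org pages are served from master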
#20
I'm wondering if mc-server.org supports PHP. If so, we might be able to make it show the actual MC version of the build instead of manually changing it. It's not a big thing, but it could avoid confusion if we forget to update the site upon a new Minecraft release.
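Something like this hypothetical snippet is what I have in mind (the file name mc-version.txt is made up, and note that GitHub Pages itself only serves static files, so this would need actual PHP hosting):

    <?php
    // Hypothetical: read the supported Minecraft version from a one-line
    // text file, so only that file needs editing on each MC release.
    $version = trim(file_get_contents('mc-version.txt'));
    echo 'Supports Minecraft ' . htmlspecialchars($version);
    ?>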