MySQL and File System Backup to FTP Server

Posted by reto on 15 May, 2011 23:45

I've put some shell scripts together to automate the backup process of my root server (basically a LAMP system).

You can read the how-to and get the download links to the backup scripts at mycodedump wiki.

For those who just want the code right away, here you go:

Any comments or suggestions for improvements etc. are welcome! 
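For a rough idea of what the scripts do before you head over to the wiki, the core backup flow can be sketched like this. This is a minimal sketch, not the actual scripts: all names (`DB_*`, `FTP_*`, the paths) are placeholders, the database dump is stubbed out, and the FTP upload is shown commented.

```shell
#!/bin/sh
# Hypothetical sketch of the backup flow: dump MySQL, archive everything, upload via FTP.
# All variable names and paths are placeholders, not taken from the real scripts.
DATE=$(date +%Y-%m-%d)
WORKDIR=/tmp/backup-demo
mkdir -p "$WORKDIR"

# 1. Dump the database (needs a live MySQL server, so it is stubbed out here):
# mysqldump -u "$DB_USER" -p"$DB_PASS" "$DB_NAME" > "$WORKDIR/db-$DATE.sql"
echo "-- dummy dump" > "$WORKDIR/db-$DATE.sql"

# 2. Bundle the dump (and, in the real scripts, the web root) into one archive:
tar czf "/tmp/backup-$DATE.tar.gz" -C "$WORKDIR" .

# 3. Push the archive to the FTP server (curl handles ftp:// URLs, too):
# curl -T "/tmp/backup-$DATE.tar.gz" "ftp://$FTP_HOST/backups/" --user "$FTP_USER:$FTP_PASS"

echo "created /tmp/backup-$DATE.tar.gz"
```

The real scripts add rotation and error handling on top of this; see the wiki for the full how-to.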

Back again?

Posted by reto on 08 January, 2010 21:59

Hefty! It's been quiet here for almost 2 years, which feels like 20 on the internet, but then Google still looks the same as on the 24th of April 2008, right?

Well, I've started using Google Code together with Mercurial repositories to manage some of the code I've written. Most of it is probably not very useful, but then again, it was useful for me at some point in time and could be a starting point for others. Furthermore, a Google Code project is a great way of backing up stuff that needs no privacy ;).

My code dump has two repositories so far:

  • CMS Made Simple stuff (like plugins, tools etc.), my CMS of choice for small websites.
  • Shell scripts I wrote, some to integrate with Nautilus, some to be used on the command line.
BTW: I'll post important updates regarding mycodedump on this blog.


Googlebot and Site Redirects

Posted by reto on 02 October, 2003 19:45

At first glance it seems like there is nothing on the web that can hide from being indexed by Google. Not only HTML but twelve(!) other file types are being indexed at the moment.
But Google is much pickier than one might assume, and its reasons are evident and reasonable:

1. Redirects
Googlebot (Google's spider) doesn't follow the "HTTP/1.1 302 Found" status code (resource temporarily moved). Instead you should use an "HTTP/1.1 301 Moved Permanently" header to make Google follow the redirect.
To make a long story short: if you're using PHP to do the redirect (and many are using PHP these days), you should add the status code header manually, because PHP sends a 302 Found status code by default.

This stops Google and is therefore only useful if your site really is under maintenance at the moment:

<?php
    header('Location: http://www.foo.com/bar/');
?>


This makes Google follow the redirect and index the site:

<?php
    header('HTTP/1.1 301 Moved Permanently');
    header('Location: http://www.foo.com/bar/');
?>


If you prefer to do the redirects within an .htaccess file (on Apache, of course), you could do it like this. Every request to foo.com/ is redirected to foo.com/bar/:

 #Redirect (this will result in a 301 permanently moved status code)
 RedirectMatch permanent ^/$ http://www.foo.com/bar/


I expect it's faster and less resource-intensive to set up an .htaccess file because there is no need to parse any PHP code at all, though it won't matter in most cases anyway. (untested assumption)

2. Sessions
Google doesn't follow links with a session ID attached. If you've enabled session.use_trans_sid in your php.ini, you should check whether Google is requesting the page. If your site displays fine without sessions, simply don't start one when Google is visiting. ;-)

<?php
    // the session is not started when serving Google
    if( stristr($_SERVER['HTTP_USER_AGENT'], 'google') === false )
    {
        session_start();
    }
?>


Add as many search engine bots as you like. A more sophisticated method (like regular expressions) is not needed here, but would of course work, too.

SpamPoison

Posted by reto on 29 July, 2003 19:27

SpamPoison generates virtually infinite numbers of bogus e-mail addresses to "poison" the e-mail databases of spammers using e-mail harvester tools to gather e-mails over the web.
It can be used to help reduce the problem of spam on the Internet in general, and at sites using SpamPoison in particular.

Each randomly generated page in turn generates random e-mail addresses and content and links to itself with pseudo-hyperlinks. This traps badly engineered address-harvesting web crawlers and fools them into adding enormous quantities of completely bogus e-mail addresses to the spammers' databases, polluting those databases so badly that they become essentially useless.

SpamPoison is a port of the Perl script 'Wpoison' by Ronald F. Guilmette.
Please visit monkeys.com/wpoison/ for detailed information on what SpamPoison is and why it could be useful to you.
The original Wpoison site seems to have disappeared; at least I can't find anything but broken links to the original wpoison script. For more information on how to use the script, please download the zipped package below. The included install.txt has some instructions on how to set up your own spam-harvester trap.

SpamPoison Demo
Download SpamPoison (zip, 759kb)

SmileWare for EPOC ER5 Devices

Posted by reto on 21 August, 2000 19:42

SmileWare is freeware for EPOC32 devices. All my SmileWare is tested on a Psion Series 5 and on the ER5 SDK, which means that SmileWare should be compatible with all EPOC machines. SmileWare is absolutely free, but I would like you to send me a :-) or any comments and bug reports!

SmileWare is written in OPL, which was recently open-sourced! Please visit opl-dev.sourceforge.net for further information. (More)
