MySQL and File System Backup to FTP Server

Posted by reto on 15 May, 2011 23:45

I've put some shell scripts together to automate the backup process of my root server (basically a LAMP system).

You can read the how-to and get the download links for the backup scripts on the mycodedump wiki.

For those who just want the code, fast - here you go:
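To give a rough idea of what the scripts do, here is a minimal sketch (not the actual code from the wiki; host, credentials and paths are placeholders): dump the databases with mysqldump, tar up the web root, and push both files to the FTP server.

#!/bin/sh
# Minimal backup sketch: dump MySQL, archive the web root, upload via FTP.
# Everything below (host, user, password, paths) is a placeholder.

DATE=$(date +%Y-%m-%d)
BACKUP_DIR=/tmp/backup
FTP_HOST=ftp.example.com
FTP_USER=backupuser
FTP_PASS=secret

mkdir -p "$BACKUP_DIR"

# Dump all MySQL databases into one gzipped file
mysqldump --all-databases -u root -pSECRET | gzip > "$BACKUP_DIR/mysql-$DATE.sql.gz"

# Archive the web root
tar czf "$BACKUP_DIR/www-$DATE.tar.gz" /var/www

# Upload both archives to the FTP server
ftp -n "$FTP_HOST" <<EOF
user $FTP_USER $FTP_PASS
binary
put $BACKUP_DIR/mysql-$DATE.sql.gz mysql-$DATE.sql.gz
put $BACKUP_DIR/www-$DATE.tar.gz www-$DATE.tar.gz
bye
EOF

The real scripts linked on the wiki handle more than this (rotation, logging etc.), so treat the above purely as an illustration of the approach.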

Any comments or suggestions for improvements etc. are welcome! 

Back again?

Posted by reto on 08 January, 2010 21:59

Hefty! It's been quiet here for almost 2 years, which feels like 20 on the internet, but then Google still looks the same as on the 24th of April 2008, right?

Well, I've started using Google Code together with Mercurial repositories to manage some of the bits of code I've written. Most of them are probably not very useful, but then again, they were useful for me at some point in time and could be a starting point for others. Furthermore, using a Google Code project is a great way of backing up stuff that needs no privacy ;).

My code dump has two repositories so far:

  • CMS Made Simple stuff (like plugins, tools etc.), my CMS of choice for small websites.
  • Shell scripts I wrote, some of them integrating with Nautilus, others meant for the command line.
BTW: I'll post important updates regarding mycodedump on this blog.

 

PHPSecInfo

Posted by reto on 24 October, 2006 20:02

The PHP Security Consortium has released v0.1.1 of their PHPSecInfo tool. From their website:
The idea behind PHPSecInfo is to provide an equivalent to the phpinfo() function that reports security information about the PHP environment, and offers suggestions for improvement. It is not a replacement for secure development techniques, and does not do any kind of code or app auditing, but can be a useful tool in a multilayered security approach.
As PHPSecInfo doesn't provide any new information, at least with this release, I see it as a useful tool for those who are not very familiar with PHP and just want to set up some downloaded scripts on their own web server. What I'd like to see in upcoming versions is a LOT more verbosity: explaining the settings in depth and giving advice on secure programming linked to some of the settings they test (like input validation without magic_quotes_gpc, handling globals with register_globals turned off, etc.).

Speed up your Website

Posted by reto on 11 February, 2006 13:05

After reading the discussion about PHP-based gzip compression of the pages LifeType is serving, I thought I should dive deeper into the server-side compression options for websites. Read on for a short comparison between compression with PHP and compression with Apache2.
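As a quick, unscientific way to see what compression saves, you can compare the number of bytes actually transferred with and without an Accept-Encoding header (assuming curl is installed; the URL is just an example, and the numbers only differ once the server really compresses its responses):

# Bytes transferred without compression
curl -s -o /dev/null -w '%{size_download} bytes\n' http://hugi.to/

# Bytes transferred when the client announces gzip support
curl -s -o /dev/null -w '%{size_download} bytes\n' -H 'Accept-Encoding: gzip' http://hugi.to/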


Why I hate Internet Explorer

Posted by reto on 02 February, 2006 23:19

I spent about 1.5 hours trying to get a list-based menu to display correctly in IE, googling myself to death and trying just about every hack, until I found out that:
This is called the IE Whitespace-in-Lists Bug, and it's a result of placing a block level anchor in the list item. The simple fix is to set the <li> to display:inline. This cures the bug without any side effects in compliant browsers.
This basically means that if you have something like:
<ul>
 <li><a href="#">Link 1</a></li>
 <li><a href="#">Link 2</a></li>
 <li><a href="#">Link 3</a></li>
 <li><a href="#">Link 4</a></li>
</ul>
and define a { display: block; }, IE will render an extra blank line between the list items.
Yes, the fix was indeed simple; I had just been querying for the wrong keywords for too long... Thank you, IE!

www. is deprecated!

Posted by reto on 18 June, 2005 18:10

"www. is deprecated" is what it's all about at no-www.org, and I think they are absolutely right: "www." is four characters too many to type just to reach a domain. My hoster did a reasonable setup of my domain by allowing http://www.hugi.to as well as http://hugi.to. The only thing I had to do to get the no-www.org Class B validation was to put these lines into my .htaccess file:
# no-www.org setup
RewriteCond %{HTTP_HOST} ^www.hugi.to$ [NC]
RewriteRule ^(.*)$ http://hugi.to/$1 [R=301,L]
If you'd like to do the same on your site, be sure to do it in the top-level folder (aka the web root). The rule will be inherited by all subfolders, with one exception: subfolders which already have their own rewrite rules in place (like pLog's) will not inherit it. Make sure to update those .htaccess files as well, and don't forget the slightly different rewrite code:
RewriteCond %{HTTP_HOST} ^www.hugi.to$ [NC]
RewriteRule ^(.*)$ http://hugi.to/blog/$1 [R=301,L]
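To check that the rule actually works, you can request the www host and look at the response headers (assuming curl is available); you should get a 301 with a Location header pointing at the bare domain:

# Expect "HTTP/1.1 301 Moved Permanently" and "Location: http://hugi.to/"
curl -sI http://www.hugi.to/ | grep -iE '^(HTTP|Location)'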

Changed to another Weblog Software

Posted by reto on 30 April, 2004 20:25

Some may already have noticed: I switched my blog from Serendipity to pLog. I'm still using one of their standard templates/skins (a Movable Type clone), but I'm planning to work on my own template really soon. The first thing I did after switching to pLog, though, was to write a mod_rewrite engine to get search-engine-friendly URLs, much like those possible with Serendipity.

Some might wonder why I switched to pLog in the first place if I had to implement a feature S9y already supported. Well, as usual, things are not that black and white, with blogging tools or anything else. ;-p
pLog has a much more modular framework, is consistently object-oriented, and has a strict separation of presentation logic, business logic and data. It uses the well-known Smarty template engine for its output and a lot of other state-of-the-art PHP classes: phpMailer for all mailing purposes, ADODB as the database abstraction layer, getID3 for metadata extraction from resources, and MagpieRSS for parsing RSS feeds. Plus there are tons of features in the admin panel, most of which I haven't tested so far. And it has a very nice, though still under heavy development, plugin framework. I simply feel much more comfortable with a tool that can easily be modified without the need to change core code all the time. It makes updates so much more fun. ;)

The mod_rewrite request generator will make it into the next version of pLog, so I decided to wait for the next official update before switching to the new URLs. As usual, the old URLs shouldn't be broken after the update.
