Yesterday, there was a special Lotto game in Belgium: 7.000.000 EUR as a Christmas gift. It seems that too many visitors tried to check the results online. Game over! 😉
The RTBF website was in a strange state this afternoon: after their “End-of-Belgium” false announcement yesterday, they were for sure slashdotted… I’m still trying to download the video 😉 [Update] ~4 hours later, same issue 😉
Today, I was scanning a web server with nikto. System admins really show great creativity when picking server tags 🙂

# telnet www.xxxx.be 80
Trying xx.xx.xx.xx…
Connected to www.xxxx.be.
Escape character is ‘^]’.
HEAD / HTTP/1.1
Host: www.xxxx.be

HTTP/1.1 200 OK
Date: Tue, 12 Dec 2006 14:34:44 GMT
Read on CNET News: Google is testing new search engine features via an unbranded website: searchmash.
I found a nice tool: the URL Checker (not to be confused with my URL watcher 😉 ). It scans the provided URL and displays its structure “as a search robot” would see it.
For those who like ’90s music: Frequence3 Only 90’s.
For years, webmasters have faced a major problem: how to be sure that data submitted via an online form comes from a “human” and not a bot? (e.g. bots that extract contacts from databases or flood forms with false data)
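The usual answer is a challenge-response test. As a minimal sketch of the idea (not a real CAPTCHA, which would use distorted images; the function names here are hypothetical), a form could ask a question that is trivial for a human but not for a dumb bot, and reject submissions with a wrong answer:

```python
import random

def make_challenge():
    """Generate a simple arithmetic question to embed in the form.

    Returns the question text and the expected answer. A real CAPTCHA
    would render a distorted image instead of plain text, since plain
    text is trivially parsed by a smarter bot.
    """
    a, b = random.randint(1, 9), random.randint(1, 9)
    return f"What is {a} + {b}?", a + b

def check_answer(expected, submitted):
    """Return True when the submitted form value matches the challenge."""
    try:
        return int(submitted) == expected
    except (TypeError, ValueError):
        # Non-numeric or missing answer: treat as a bot.
        return False

question, answer = make_challenge()
print(question)
print(check_answer(answer, str(answer)))   # a correct human answer passes
print(check_answer(answer, "spam-bot"))    # garbage is rejected
```

The expected answer would of course be kept server-side (e.g. in the session), never in the form itself.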
“Google” officially became a verb in the Webster dictionary!
WordPress has a nice feature called “Permalinks”. It can improve the visibility and efficiency of your links (http://blog/monthly/post/13 is much nicer than http://blog?p=13). To achieve this, WordPress generates a .htaccess file to upload in your blog root directory, which processes all HTTP requests via the mod_rewrite module. Problem: all URLs
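For reference, the rewrite block WordPress generates typically looks like the following (the exact content depends on your permalink settings): any request that does not match an existing file or directory is handed to index.php, which maps the pretty URL back to the right post.

```apache
RewriteEngine On
RewriteBase /
# Serve real files and directories directly...
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
# ...and route everything else through WordPress.
RewriteRule . /index.php [L]
```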