Google has recently implemented site previews on its search results pages.
- Great feature!
- Implemented wrong. The behaviour feels completely off.
I wonder how long it will take them to fix it.
Seriously?
What exactly were they thinking?
No wonder I’m always confused by all these Order, Allow, and Deny directives.
The documentation reads:
Note that all Allow and Deny directives are processed, unlike a typical firewall
Isn’t it like reinventing the wheel but making it square?
Could anyone explain why someone would complicate such a simple thing as an access list? Please …
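To illustrate the semantics the documentation is describing, here is a sketch of Apache 2.2-style access control (the directory path is made up, not a recommended configuration):

```apache
<Directory "/var/www/example">
    # With "Order Deny,Allow", ALL Deny lines are evaluated, then ALL
    # Allow lines; the Allow group is processed last, so a matching
    # Allow overrides a matching Deny, and unmatched requests are allowed.
    Order Deny,Allow
    Deny from all
    Allow from 192.168.1.0/24
</Directory>
```

So "Deny from all" does not short-circuit the way a firewall rule would: the later Allow still wins for 192.168.1.x.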
Most sites use “click here for/to …” wording. That implies that:
Put the following in your .vimrc file and you are set:
au BufNewFile,BufRead *.yaml,*.yml setlocal et ts=2 sw=2
When editing YAML files, you’ll automatically get the following behaviour:
- The Tab key inserts spaces instead of a tab character (et, i.e. expandtab)
- Tabs display as 2 columns wide (ts=2, i.e. tabstop=2)
- Indentation commands shift by 2 spaces (sw=2, i.e. shiftwidth=2)

Debian testing just got kernel 2.6.30. The previous version was 2.6.26.
I will summarize the new features of this upgrade here, but only the ones I find interesting.
Hi.
Some time ago I started using clive. Using it without a GUI was very frustrating, and I couldn’t find one. So here is my hack, a “clive GUI”, which I originally wasn’t going to publish. It has only been tested on my system.
Use at your own risk!
Here is a checklist of knowledge and abilities that I consider a must for a good web programmer.
If the person is expected to work with Linux, the following applies:
I’ll be adding to the list whenever I remember anything important enough.
Suggestions for additional points are welcome.
SMTP is a text-based protocol. I’ve already mentioned that text protocols are evil. It was phrased more nicely in previous posts; well, it can’t be phrased nicely any more. This is stupid! In SMTP it means that binary attachments are encoded using base64, which is part of MIME. Every binary attachment (image, presentation, document, …) you’ve ever sent takes up more space in the email than the file itself. That means it’s slower to send, slower to receive, and wastes more space on the server. Space that some people still pay for. And I have to remind the reader that such encoding requires additional CPU cycles, which in turn increases electricity bills.
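The overhead is easy to demonstrate (a minimal sketch using Python’s standard library; the 4/3 ratio follows from base64 mapping every 3 input bytes to 4 output characters):

```python
import base64

# base64 maps every 3 bytes of input to 4 ASCII characters of output,
# so an attachment grows by a factor of 4/3 (about 33%).
payload = bytes(3000)               # pretend this is a 3000-byte attachment
encoded = base64.b64encode(payload)

print(len(payload), "->", len(encoded))   # 3000 -> 4000
```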
And the additional bonus: ever heard “I have a 10M mailbox but I can’t receive an email with an 8M attachment from my friend. What’s the problem?”? The correct answer would be: stupidity. An 8M attachment grows to roughly 10.7M after base64 encoding (8M × 4/3), which simply doesn’t fit in a 10M mailbox. Wasted tech support time: they have to explain exactly that. Try explaining it to someone sometime. Have fun. Extra bonus: someone pays for that wasted tech support time. Exercise for the reader: figure out who’s paying.
If I had the power, I would make it unlawful to use FTP. It is one of the most troublesome protocols: never mind that it’s text based, the semantics are totally screwed. Active and passive modes. Yeah, that totally solves all the problems, right? Especially the two separate sockets (network connections), one for control and one for data transfer. Is it intentionally so f*cked up, just to make firewall software much harder to get right? In short, it’s broken. Don’t use it. Let it die slowly.
Use SFTP wherever you can. If you are a system administrator, do the world a favour: never enable FTP on your servers.
Why in the world would one want to use a text-based protocol? Really. WTF, dudes?
Yes, you can telnet to a server on port 80 and debug… maybe. That’s about it.
Wikipedia says: “Binary protocols have the advantage of terseness, which translates into speed of transmission and interpretation”.
Lower costs would come from using less electricity, cheaper hardware at the endpoints and along the way, and less bandwidth.
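A hypothetical comparison (the values are mine, just to make the terseness point concrete): encoding the same four 32-bit integers as comma-separated text versus a fixed-width binary layout.

```python
import struct

values = [1234567890, 987654321, 1122334455, 42]

# Text form: decimal digits plus separators, variable width.
text_form = ",".join(str(v) for v in values).encode("ascii")

# Binary form: four unsigned 32-bit integers in network byte order.
binary_form = struct.pack("!4I", *values)

print(len(text_form))    # 34 bytes
print(len(binary_form))  # 16 bytes
```

The binary form is not only smaller; its fixed-width fields can be read without scanning for delimiters, which is where the “speed of interpretation” part comes from.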
I would also expect programs to be written better simply because they’d have to handle a binary protocol: a dedicated library would (I hope) always be used. There would probably be fewer stupid Perl scripts, each implementing its own parsing of the query string, HTTP headers, and MIME POST body instead of using existing libraries, because hand-rolling that would be much harder. There wouldn’t be fewer stupid people, though… I mean that the same people who wrote those scripts would just write some other stupid scripts.
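For the query-string case, the existing library really is right there. A Python illustration (the query string is made up):

```python
from urllib.parse import parse_qs

# Hand-rolled parsers typically split on "&" and "=" and forget about
# percent-encoding and repeated keys; the standard library handles both.
query = "name=John%20Doe&tag=a&tag=b"
params = parse_qs(query)

print(params)   # {'name': ['John Doe'], 'tag': ['a', 'b']}
```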
HTTP does not support two-way communication in the way current internet applications require. Wake up! The internet is mostly about applications these days and much less about documents.
Unfortunately, I guess we are stuck, because of the cost of upgrading to something better. I predict that we will continue to see an increasing number of clever hacks to overcome the limitations of this prehistoric protocol.