Chilli’s Indian Restaurant and Takeaway

Just about to launch a new website for our friends at Chilli’s Indian Restaurant and Takeaway in St Annes, Bristol.

MX, SMTP and Clever Email

This is meant as a basic guide to MX, DNS and SMTP. By following this guide, you should be able to manually send email directly to a mail server. Of course, this is pointless 99% of the time, but as an academic exercise it’s useful to know.

This guide is tailored to Gentoo, because that’s what I use. Most distributions include a tool “nslookup” by default. Gentoo doesn’t, so to install nslookup run:

emerge net-dns/bind-tools

Domain names configured to accept email have something called an “MX” record. MX records list, in order of preference (0 being the most preferred), the mail servers that accept email for the domain you’ve looked up. Let’s send an email to trash@marcgray.co.uk. First, we need to find out what its MX records are:

nslookup
> set type=mx
> marcgray.co.uk
marcgray.co.uk mail exchanger = 0 marcgray.co.uk.
> exit

Looks like we only have one mail server, and it’s marcgray.co.uk. Now we need to connect to it and communicate using SMTP. You’re going to want a telnet program (this can also be done with a variety of other tools, like nc) and again, Gentoo doesn’t install one by default, so run:

emerge net-misc/netkit-telnetd

Now you’re all set up to communicate with the SMTP server at marcgray.co.uk. There are a handful of commands you need to know:

HELO name – This identifies you to the server and is the first thing you type. It’s best to use a real domain name here as some servers can reject the email if this doesn’t make sense.
MAIL FROM: email – This specifies the email you’re sending from. This should be a valid address, if only to receive a “failed” notification.
RCPT TO: email – This specifies the email you’re sending to. Most email servers won’t send to anything but what their MX records suggest they will. This is to prevent spam and abuse. There’s no point trying to send an email to wikipedia.org from the marcgray.co.uk server, for example.
DATA – This begins your email. You type your headers, then a blank line, then the content, and finally a line containing only a “.” to tell the mail server you’re finished.

Let’s look at a real exchange; > denotes sent, < denotes received:

# telnet marcgray.co.uk 25
< Trying 109.109.251.242...
< Connected to marcgray.co.uk.
< Escape character is '^]'.
< 220-server01.lamped.co.uk ESMTP Exim 4.77 #2 Wed, 27 Jun 2012 22:44:23 +0100
< 220-We do not authorize the use of this system to transport unsolicited,
< 220 and/or bulk e-mail.
> HELO marcgray.co.uk
< 250 server01.lamped.co.uk Hello cpc10-hawk13-2-0-cust173.aztw.cable.virginmedia.com [62.30.157.174]
> MAIL FROM: YOUR-EMAIL-ADDRESS
< 250 OK
> RCPT TO: trash@marcgray.co.uk
< 250 Accepted
> DATA
< 354 Enter message, ending with "." on a line by itself
> Subject: Sample email
>
> This is a test email
> .
< 250 OK id=1Sk03D-0004P7-Rr
> QUIT
< 221 server01.lamped.co.uk closing connection
< Connection closed by foreign host.

Let's shorten that to what we actually sent:

# telnet marcgray.co.uk 25
> HELO marcgray.co.uk
> MAIL FROM: YOUR-EMAIL-ADDRESS
> RCPT TO: trash@marcgray.co.uk
> DATA
> Subject: Sample email
>
> This is a test email
> .
> QUIT

Assuming you didn’t send anything too spam-like, the message will be received. Obviously trash@marcgray.co.uk isn’t accepting emails, if you’re using that to test.

We can get clever with this…

DOMAIN=marcgray.co.uk; RECIPIENT=marc; FROM="YOUR-EMAIL"; SUBJECT="Test Subject"; EMAIL="Some test email content"; HOST=$(printf "set type=mx\n$DOMAIN\nexit\n" | nslookup | grep $DOMAIN | grep -oP '[a-z0-9-\.]+\.$'); HOST=${HOST:0: -1}; printf "HELO marcgray.co.uk\nMAIL FROM: $FROM\nRCPT TO: $RECIPIENT@$DOMAIN\nDATA\nSubject: $SUBJECT\n\n$EMAIL\n.\nQUIT\n" | telnet $HOST 25;

Let’s split this up by semicolon and explain it, since a one-line bash command of this complexity isn’t generally too readable:

DOMAIN=marcgray.co.uk
This sets the domain name we’re looking up, the part after @
RECIPIENT=marc
Section before @ in the email
FROM="YOUR-EMAIL"
Your email address
SUBJECT="Test Subject"
The email subject
EMAIL="Some test email content"
Email contents. You can supply this in a number of ways. An easy one-line way of doing a multi-line email is with EMAIL=$(printf "Line1\nLine2")

HOST=$(printf "set type=mx\n$DOMAIN\nexit\n" | nslookup | grep $DOMAIN | grep -oP '[a-z0-9-\.]+\.$')
This does the MX lookup and captures the result for the above DOMAIN
HOST=${HOST:0: -1}
This strips the terminating period from the end of the nslookup output
printf "HELO marcgray.co.uk\nMAIL FROM: $FROM\nRCPT TO: $RECIPIENT@$DOMAIN\nDATA\nSubject: $SUBJECT\n\n$EMAIL\n.\nQUIT\n" | telnet $HOST 25
This combines the SMTP commands with the above options, connects to an MX record SMTP server and sends the email to it
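The fiddliest part of that one-liner is extracting the MX host, so here it is as a small reusable function. This is only a sketch: the name extract_mx is mine, and it assumes the “mail exchanger = preference host.” output format nslookup produces above.

```shell
#!/bin/bash
# Sketch: the MX-extraction step of the one-liner as a reusable function.
# Reads nslookup output on stdin and prints the most-preferred MX hostname.
extract_mx() {
  grep 'mail exchanger' \
    | sort -n -k5 \
    | head -n1 \
    | awk '{print $NF}' \
    | sed 's/\.$//'   # strip the trailing dot nslookup appends
}

# Demo on canned output with two MX records; preference 5 wins:
printf 'example.com mail exchanger = 10 mx2.example.com.\nexample.com mail exchanger = 5 mx1.example.com.\n' \
  | extract_mx   # prints mx1.example.com
```

With that in place, the lookup step becomes HOST=$(printf "set type=mx\n$DOMAIN\nexit\n" | nslookup | extract_mx), and the ${HOST:0: -1} trimming is no longer needed.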

This single-line solution won’t work for every email server out there, but it will work for most. More importantly, it’s a fascinating look at the simplicity of SMTP and the power of bash.

Bristol and Bath Linux User Group

It’s been a while coming, but the Bristol and Bath Linux User Group (LUG) that I’m a part of now has a website.

http://www.bristol.lug.org.uk/

Thanks to everyone who helped bring it about.

libtool version mismatch error

I’ve recently switched from Xubuntu to Gentoo, and I couldn’t be happier. I’d recommend it for anyone sufficiently technical with strong knowledge of Linux. I’ve found it to be fast, efficient and bloat-free. There are a few glitches to overcome, but they can almost always be handled with a few config file changes that are well documented across the internet.

As a PHP developer, tools such as xdebug are important to me and I really want to make the most of vld – thanks Derick, you’ve written some great tools. I had issues installing xdebug and vld in Gentoo and the solution wasn’t as clear-cut as usual. To help others in the same situation, I’ll document the solution.

If you’re getting:


libtool: Version mismatch error. This is libtool 2.4, but the
libtool: definition of this LT_INIT comes from an older release.
libtool: You should recreate aclocal.m4 with macros from libtool 2.4
libtool: and run autoconf again.

Try running:


phpize
aclocal
libtoolize --force
autoheader
autoconf

You should now be fine to run configure and make.

Opera

Once again I was checking StatCounter – it’s my source for knowing which browsers I must test in. On that note, woo hoo! Internet Explorer 6 is at about 1% in the UK! RIP, please. The point, however, is Opera.

There are a few trends in the statistics and on popular sites such as ZDNet: it’s all about Internet Explorer, Firefox and Chrome. Safari gets a look-in from time to time, but largely because it’s the forced browser on the iPad and iPhone, which inflates its popularity (say, anyone remember the Microsoft anti-trust case surrounding Internet Explorer a while back?). What about Opera?

Let me tell you a few things about Opera, from a guy who uses every rendering engine, and a number of their UI shells*:

Opera posts average performance across the cutting-edge browser benchmarks. “Only average?!” I hear you cry. The tests I refer to have the browsers constantly swapping places, because each is optimised for certain types of work and so excels at certain tests. Opera sits in the middle every time. This tells us that Opera is a mature, capable browser: it’s very fast, and beats Chrome, Firefox and Internet Explorer 9 in some areas.

Opera displays websites as flawlessly as any browser does. In fact, I’ve had fewer issues with Opera than I have with Chrome, and I’m sure that’s partly due to Opera’s diligence in ensuring this – they’ve gone to the extraordinary step of including fixes for some badly written popular websites. It’s as accurate as any browser.

It’s secure. If you read my blog, you’ll remember the DigiNotar certificate issue, and that Opera was the only browser that didn’t actually need updating. I’m never going to pull the security-through-obscurity card with proprietary software, because I believe the opposite to be true, but one truth is exposure: attackers target whatever will give them the best results. Opera, at 1% usage in the UK, is not the target of anyone serious. Every application has weaknesses; Opera’s just aren’t being targeted.

I would pitch Opera’s developer tools against Firebug and Chrome’s any day. Have you ever included a script via jQuery’s .append() function? At the time of writing, Opera is the only browser that handles it correctly AND gives you a hint as to which file was included.

Opera:

  • It’s fast
  • It displays websites properly
  • It’s secure
  • It has arguably the best developer tools

http://www.opera.com/

As a sidenote, Opera also comes with a mail reader, a news reader, an IRC client, an interesting social-esque server and works on almost everything: Windows, Mac, Linux, Android, and probably that old phone you have shoved in a drawer somewhere.

* A rendering engine is what does the actual work of displaying a page. Internet Explorer uses Trident. Firefox and Fennec (mobile) use Gecko. Chrome and Safari use Webkit**. Opera uses Presto.

** Webkit began life in the KDE project, as the engine behind their browser Konqueror. Apple liked it and repackaged it. Google liked Apple’s work and continued it. Chrome is Safari. Safari is Konqueror. It all started with the Linux KDE project. Remember that when you next load your iPad, iPhone, Safari, Chrome or any of the plethora of Webkit derivatives. The respect is due to Linux’s KDE.

Google Ownership

So, I hear Larry Page has taken over management of Google from Eric Schmidt. I can only assume Sergey Brin can see what I see.

In my view, Google is an idealistic company. Page and Brin can see what’s good for humanity, in a vaguely socialist way, and want to push ahead with it. Schmidt has no doubt been the quiet voice in their ear saying “This isn’t a good idea…”.

You only need to look at PR disasters such as Street View and its privacy issues to see that a little more careful consideration isn’t a bad thing where Google is concerned. Let’s get something straight here for you readers thinking “Hey, they stole my Wi-Fi info”: if your network wasn’t encrypted, Google reading your data is the least of your problems.

I have a bad feeling this change in management will cause Google to make more badly informed decisions, and upset the general public even more. I’ve said before I’m a big fan of theirs, and believe in their “Don’t be Evil” policy, but this isn’t a case of “trust is earnt”. A company with the size and power of Google will be perceived to have done evil regardless of their intentions.

I can only pray Mr Page will put much more thought into his new product launches than the company as a whole has done until now. Without such consideration, they’ll end up in the newspapers every other day with some new sensational issue.

I wish you luck Mr Page, but I don’t have confidence.

EU vs Google: Anti-Trust Probe

This post relates to the article found at the BBC website, and doubtlessly thousands of other places.

I openly admit, I fall into the Pro-Google group. I believe Brin & Page do ultimately have our best interests at heart, if they are a little naive sometimes. I use Google for search, I use iGoogle for my homepage and I own a Google Android phone. I use Google for adverts (you know, that thing on the bottom right none of you ever click on). I also appreciate their motto “Don’t be Evil” cannot be an absolute.

The anti-trust investigation focuses on Google’s search results, and alleges that the results are manipulated in Google’s favour. I personally think this is unlikely, as it goes against the core principles of Google and their founders. As a point of interest, I searched Google for the term “search engine”. Here’s the results in order:

  1. Dogpile Web Search
  2. Wikipedia entry on Search Engines
  3. Microsoft Bing Search Engine
  4. Altavista UK Search Engine
  5. Altavista International Search Engine
  6. Google
  7. Ask Search Engine
  8. Yahoo Search Engine

If anything, these results are biased against Google: they have by far the dominant position in worldwide usage, so shouldn’t they be higher up that list?

I’m going to watch this case with interest, but I strongly suspect the companies complaining will end up looking like sore losers.

The Rename Program

The Nerd Generation days: Back in 1999 I needed some software to rename thousands of files, and after spending hours scouring the internet for something suitable, I gave up. I already had 7 years’ experience in Turbo Pascal and Delphi by this time, so I decided to write my own. A fortnight later, I was done. I found the program so useful that I uploaded it to a few websites and forgot about it for a year. Next time I checked, it had been included in a French free newspaper and had dozens of great reviews and references all over the internet. The software had a few bugs and a lot I wanted to improve upon, but the web was fast becoming the way to go. I abandoned the project and the domain it was hosted on (nerdgeneration.com), and moved on.

Fast forward 11 years: Links and references to the software are still spread around the internet. The domain has been taken and abused by some cyber-squatters.

Partly due to the overhaul of Lamped, and somewhat due to the level of existing links and references to the software, I’ve decided to re-host it.

Please bear in mind, this software is 11 years old. It’s completely unsupported. It may or may not work in Vista or 7; or XP as part of a Windows Domain. In short: This blog entry mainly exists to help old users find it again.

Download it here: http://marcgray.co.uk/files/therenameprogram104.exe

I’ll be monitoring the downloads. If there’s still interest after all this time, I may rewrite it from scratch. Comments to this blog will help me decide.

Browser Versions

Every couple of months, I have a look at StatCounter’s Global Stats page. It’s probably one of the most balanced internet usage statistics services in the world, as their counter widgets are installed on a wide variety of websites. Most statistics services are based on access to specific servers, like the W3C’s. I have a great deal of respect for the W3C, but I don’t believe statistics based on users visiting a technical web developer site are balanced.

As a web developer, browser and operating system usage trends are of great importance to me, though often depressing. I feel I need to know what browsers are most relevant for me to test sites against (if only more people did this, the internet would be a better place…). This philosophy has brought up a few questions about geography and intended demographics.

If the site is of a technical nature, should I instead focus on statistics from the W3C site? Should this blog be better optimised for those the W3C suggests? If so, perhaps I should consider Linux fonts and some more obscure browsers.

Does a shopping site primarily designed for European delivery really care what browsers the Asian market is using? In Europe, Internet Explorer 6 had 3.32% of the usage share in September 2010, but in Asia it had 16.26%!

The best solution, of course, is to optimise every site for every browser, but there comes a point where hours (or even days) of extra work would be done for a browser no one will ever use. Time is money, and money comes from the clients. I feel it’s my responsibility to advise clients as best I can to save them money. I guess it comes down to percentages. Do I or my clients care about a browser with 5% usage share? How about 3%? 2%? Where is the line drawn?

The situation is even more complex when you consider your clients’ needs in more detail. I’ve recently completed a website for a client who owns an iPad and an iPhone – these devices occupy a marginal share of browser use, but in this case special care was needed to ensure the site worked flawlessly on both. On a similar note, though most people begrudge fixing websites for Internet Explorer 6 or 7, if the client is a business that still has either of those browsers installed on their workstations, the fixes and optimisations suddenly become a lot more important.

A lot of my musings on this matter are courtesy of Internet Explorer. I desperately want to drop IE6 from my testing cycle, but I can’t. I kept telling myself “if it’s under 5% share, who cares?”, but such a cavalier attitude doesn’t make clients happy.

I think it boils down to: Write it for Firefox, fix it for Internet Explorer 8, test it in Chrome, Safari and Opera… Then apply whatever fixes are necessary for Internet Explorer 6 and 7 in a separate stylesheet. Lately this approach has been working well – sites work in everything, but don’t have (as a previous client once put it) “Razzle Dazzle” for the lower share browsers.
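For the “separate stylesheet” step, the standard trick of the era is Internet Explorer’s conditional comments. A sketch only – ie-fixes.css stands in for whatever you name your fix file; every other browser treats the whole thing as an ordinary HTML comment:

```html
<!-- Main stylesheet, loaded everywhere -->
<link rel="stylesheet" href="style.css">
<!-- IE 7 and below additionally load the fixes -->
<!--[if lte IE 7]>
  <link rel="stylesheet" href="ie-fixes.css">
<![endif]-->
```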

Crackers

Let me define a few terms in the IT world:

Hackers: Used to mean the seedy, dodgy guys in films who break into computers. Nowadays, hackers are the elite programmers who can turn lead into gold and such.

Crackers: Always considered “bad”, and now carrying the old film definition of “hacker”. Crackers break things for their own purposes (spam, taking down servers, etc.).

I’ve had a run-in with some crackers lately which is starting to annoy me. As a server administrator, I wish I could do more about it. PCs in Saudi Arabia, Korea and China have been doing their best to take over my server through SSH, and in separate attempts, turn my server into a spam “bot”.

The widespread abuse of PCs and servers worldwide is becoming a serious issue. I have enough experience to keep these guys out (though I should have changed my SSH port before today…) and haven’t been seriously affected by it, but what about the other guys?

I’m a strong believer in security above all else, but I’ve been somewhat upset recently by a large company not using basic security consistently in their own card processing example code.

I think the point I’m making is twofold:

1. People need to consider security more carefully. You never know what will happen or when. Change default SSH and Remote Desktop ports. Sanitise all your data. Use passwords that no one could possibly ever guess. Use blacklists and blacklist data providers. Implement brute force limitations.

2. Governments need to consider cracking and PC abuse more seriously. If someone breaks into 1000 houses and steals a little money from each, they’d be in prison. If you steal 1000 bank records and commit minor fraud on each, you stand a fair chance of getting away with it. Hell, if you’re in the right country, no one will care.
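As a sketch of the SSH advice in point 1, the relevant sshd_config lines look like this – the values here are illustrative examples, not a definitive hardening guide:

```
# /etc/ssh/sshd_config -- illustrative hardening snippet
Port 2222                  # anything but the default 22 cuts automated scans
PermitRootLogin no         # attackers must guess a username as well
PasswordAuthentication no  # keys only; removes the brute-force surface
MaxAuthTries 3             # limit guesses per connection
```

Brute-force limiting beyond that is usually handled by a separate tool watching the auth log.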

The world needs to sit down and enforce specific laws around cracking, proxy servers without sufficient logging and, in fact, any service that doesn’t maintain reasonable IP logging. Server owners worldwide need to be held responsible for continuously allowing (willingly or through negligence) cracking or other unlawful activities on their systems. If you’re not logging it and willing to pass the logs on to the relevant authorities as required, you should be held partially responsible.

You do something illegal on my server, I’m rollin’ over on ya.