Posts

  • www. is NOT deprecated

  • Update: This was an opinion piece written back in 2005 to provoke discussion from the other side of the argument. Read the comments for the most up-to-date information and feel free to add your opinion below.

    Having no www. is wrong. Why?

    • Having www. helps clarify the purpose of the URL. ie: www is the World Wide Web, webmail is for web-based email, etc.
    • Using the www. subdomain allows your website to be more versatile in terms of DNS and changes of IP address.
    • The major websites still use the www. prefix. See http://www.google.com/search?q=www
    • www. has been used since the dawn of the World Wide Web, so why would we stop using it now?
    • example.com is often not detected as a URL; however, www.example.com is.
    • When communicating URLs it’s shorter, easier and quicker to say www.example.com than http://example.com/
    • Removing the www. would not make sense; it would make more sense to change the prefix subdomain to web., allowing this to be the new standard.
    • People still use and want the www. prefix the same way they would use the ftp. prefix or wap. prefix.
    • Using the www. prefix helps readers of printed advertisements (TV, newspapers, flyers, etc.) identify what it is.
    • Domains that use common TLDs (.com .net .org .co.uk) can be identified as domains, however domains with unusual TLDs may not be identified (eg: example.be, example.to, example.at, example.tv, etc).
    • Sometimes the content of example.com is different than the content of www.example.com
    • Name Server hosts (eg: ns1.example.com, ns2.example.com) must be unique, there must be at least two of them, and they must have different IPs. So why shouldn’t web hosts work the same way?

    Why can’t we just use both?

    You can use both as long as you redirect your traffic from one to the other (eg: example.com redirects to www.example.com). Why?

    Well, that is simple: it affects the way your website is optimised.

    As search engines see a subdomain as a separate site, running your website on both www.example.com and example.com effectively means you have two copies of the same website, splitting your focus from one site across two. If you wish to build up traffic, it is best to make things as simple as possible for your visitors: let them know that there is only one URL for your website. This will considerably help when people link to you from their own websites or bookmark your pages, and it will also help with how search engines index and rank your website.

    It’s quite important that when you start developing a website you decide whether you’re going to direct your visitors to www.example.com or example.com, as you will need to focus on one.

    Redirection

    These redirection methods are used to redirect your domain (eg: example.com) or wildcard subdomain (ie: *.example.com) to www.example.com.

    mod_rewrite method:

    Add a rule along the lines of the sketch below to your .htaccess file (providing the mod_rewrite module is installed in your Apache). (See Apache mod_rewrite)
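
    A minimal sketch of such a rule, assuming mod_rewrite is enabled and that www.example.com stands in for the host you have chosen to keep:

    RewriteEngine On
    RewriteCond %{HTTP_HOST} !^www\.example\.com$ [NC]
    RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]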

    PHP method:

    Place this at the top of your index.php, before any other output is sent to the browser.

    if ($_SERVER["HTTP_HOST"] != "www.example.com") {
        // Permanent redirect to the www. host, keeping the requested path
        header("Location: http://www.example.com" . $_SERVER["REQUEST_URI"], true, 301);
        exit;
    }

    Hostnames, protocols and examples

    • ftp.example.com – Used in conjunction with the ftp:// protocol (See rfc0959)
    • irc.example.com – Used in conjunction with the irc:// protocol (See rfc1459)
    • mail.example.com – The mail protocol is different: a mail server’s IP must resolve back to the mail server’s hostname. (See rfc2821)
    • ns?.example.com – These form part of the DNS infrastructure; you may NOT use the bare domain name as a Name Server. (See rfc2782)

    As you can see from these examples, as long as example.com points to the same IP as the host in question, you could use the domain name instead of one of the hosts above.
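
    For illustration, a minimal sketch of zone records that point the bare domain and the service hosts at the same address (the IP is just a documentation placeholder):

    example.com.       IN A     203.0.113.10
    www.example.com.   IN CNAME example.com.
    ftp.example.com.   IN CNAME example.com.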

    The Spamcalc website explains the original concept and usage of subdomains/hostnames. It explains that the original purpose of subdomains is to show a hierarchical structure (eg: computername.subdomain.domain.topleveldomain).

    This makes sense when looking at the above protocols and services. Originally, and still today depending on demand, each service is run on a separate machine (or separate machines) in an attempt to disperse heavy usage loads. However, unless you run a large international network with many thousands of users, chances are you do not need this kind of structure. It is also apparent that, as computers get faster and faster, they are more than capable of running all these services on one machine.

    As your website and services expand, it makes sense that you might need to shift services onto different machines, and to achieve this each machine will need a different IP address. They will therefore require different hosts: both servers cannot use the bare domain name, so each will need to be assigned a specific hostname, which will ultimately depend on the service it runs.

    Although it sounds complicated, it’s actually very simple, which is why it makes sense to use the prefix.

    Arguments for the other way

    Apart from the points made at no-www.org, the other reasons given for why www. is deprecated…

    • Browsers will understand a domain name is a website and add the http:// protocol to reach the website.
    • The “World Wide Web” phrase has become somewhat outdated, and is now simply known as “The Web” or web.
    • The youth of today can identify that a domain (eg: example.com) is a website.
    • You don’t send email to [email protected]

    Resources

    • www.yes-www.org – An odd website for the www prefix
    • no-www.org – A website against the www prefix
    • no-www – Wikipedia entry
    • rfc2606 – Explains the usage of example.com throughout this document

  • Remove Spyware

  • Sorry, this article had become outdated and was removed.

    If you still need help you should try here:

    Still unsure? Contact an IT expert.

  • Txt Spk iz Gr8

  • From a very early age I have been taught to write the English language correctly, and I try to carry this over when I’m on the internet.

    However, over the years a new language based on English has developed, which I’m sure everyone is familiar with these days; it’s commonly known as SMS language.

    It appears that it originally came from bulletin boards and chatrooms, particularly the ones popular on the AOL network, and is therefore sometimes called AOL speak.

    The main reason this language came about is that kids began sending text messages to each other via their mobile phones. The problem arises when they wish to send a message longer than 160 characters: it spans two or more messages, meaning more cost. As you can probably imagine, kids soon got the idea that this language lets you type a message more quickly and keep costs down by keeping the message within the 160 character limit.

    Although this is probably information you already know, questions have recently come up about whether using this type of language should be acceptable.

    To begin with I agreed with the majority of people who said English is English and we should use it correctly. However, I began to think about where English actually came from, and how evolution has turned English into the language it is today.

    I do agree that when people write articles, reports and such it would probably be best if true English was used. However, who are we to say that this new language is not acceptable in today’s modern world? After all, it is us who created the need for it, and it is us who encouraged it.

    I think over the next few years we will be forced to accept and embrace this new language as part of everyday life, as an extension of the English language; that is, of course, unless it has already happened.

  • SEO Tips

  • Quick List of SEO Tips

    • You must get indexed by the search engines. Most offer a method to submit your site, but it’s FAR better to simply get a linkback from another website and let them find you naturally.
    • Keep the text between the title tags under 60 characters – this will help you get the most out of your keywords [Google truncates titles on the SERPs at about 60 characters].
    • Avoid using dynamic URLs (eg: http://www.example.com/index.php?id=123) use static looking links instead (eg: http://www.example.com/example-articles.html) — Also avoid PHP session IDs and unnecessary variables.
    • If your site has multiple (sub)domains, and they have the same content, point them to one domain.
    • Register your domains for longer than a year – spammers usually only buy domains for 1 or 2 years.
    • Larger, older sites are better, as they score higher in search engines.
    • In search engines “the-word” usually registers as “the word”, so where possible use - (hyphen) instead of _ (underscore) in your URLs.
    • Choose a domain that will be most relevant to you, either your name (product, company, business, etc) or descriptive (eg: search-engine-optimisation-example.com)
    • Dashes in URLs (eg: http://www.search-engine-optimisation-example.com/the-home-page.html) — stick to no more than 4 per URL.
    • Keep your URLs short; less than 100 characters is best.
    • Unique description meta tags per page – under 200 characters; although search engines don’t rely on this, it is often displayed in results (see the sketch after this list).
    • Unique keyword meta tags per page – fewer than 10 words; the keywords must appear in the body text of the page, with no repeats.
    • Make use of the heading tags (H1, H2, H3); these help identify what the page is about and push your page further towards the top.
    • Place your keywords in your body text near the top, and try to use b (bold), strong (bold) or em (italics) on them.
    • By internet standards all images must have alt text – describe the image using short, relevant text.
    • Use keywords for your link anchor text instead of using “click here” or such.
    • External links per page should be limited to 100 where possible.
    • Choose your domain name TLD extension carefully, .edu and .org appear to get higher status than .com due to spam.
    • Pages should not be larger than 100K; it’s preferable to have lots of smaller pages.
    • Search engines like updated pages, such as news pages; they also like additional pages, so make sure you keep your old static pages too.
    • Site consistency — although it’s not proven to be an SEO factor, it does help usability.
    • Search engines like blog sites as they carry lots of content and many external links, reflecting current trends.
    • Use text instead of images where possible. If you have to use images, use alt text. – Search engines don’t read text on images.
    • Get listed in directories such as DMOZ and Yahoo, avoid link farms and FFAs.
    • Avoid using meta redirect to redirect users to other pages.
    • Avoid excessive cross-linking to sites hosted on the same IP range (C class).
    • DO NOT copy content from elsewhere. Not only does this violate copyright law, search engines will also penalise you for it.
    • Search engines find it difficult to extract relevant text from Flash, if they can at all; avoid constructing your page with Flash unless it’s purely for presentational purposes.
    • Search engines don’t follow links in forms, image maps, JavaScript links or frames very well, if at all.
    • Avoid putting your email address in text form on a page, as this will appear in search engines and generate unwanted emails
    • Avoid any “black hat” SEO methods; they don’t work in the long run. Stick to the real methods and you’ll get far better results than risking your site getting banned.
    • Use correct HTML markup where possible; although this doesn’t make much difference to the search engines, it will help you avoid other problems.
    • Don’t use gateway pages or intro pages; they are useless, tend to disappear from search engines, and push the main page down. — This is also bad for usability, as visitors aren’t interested in your gateway, just your website.
    • Try to get backlinks from as many relevant websites as possible. Avoid bad sites. — The anchor text on your links is important here; make sure you use a relevant keyword.
    • Try to get people to stay on your site: avoid annoying the user and provide them with reasons to stick around.
    • Suggest that users bookmark your page, or create pages that are bookmark-worthy.
    • Use a stable web server; there is nothing worse than having your site down.
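
    As a rough illustration of the on-page points above (title length, unique description and keyword meta tags, heading tags, alt text and keyword anchor text), a page might look something like this minimal sketch; the text, keywords and filenames are placeholders:

    <head>
      <title>Search Engine Optimisation Tips and Examples</title> <!-- under 60 characters -->
      <meta name="description" content="A quick list of practical search engine optimisation tips."> <!-- unique per page, under 200 characters -->
      <meta name="keywords" content="seo, search engine optimisation, tips"> <!-- words that also appear in the body -->
    </head>
    <body>
      <h1>SEO Tips</h1> <!-- heading tags describe the page -->
      <p>A quick list of <strong>search engine optimisation</strong> tips.</p> <!-- keywords near the top, emphasised -->
      <img src="seo-diagram.png" alt="Diagram of how a search engine indexes a page">
      <a href="seo-tips.html">More search engine optimisation tips</a> <!-- keyword anchor text instead of "click here" -->
    </body>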

    Also see:

    For more help with SEO, visit #SEO at EFnet

  • Guide to regular expressions

  • A “regular expression” (or regex) is essentially a pattern used to match and manipulate strings.

    Regular expressions are used in many scripting and programming languages as they are a very powerful tool, allowing you to match and replace strings based on the supplied pattern.

    Regular expressions aren’t always that simple; however, there are things you can do to make working with them as easy as possible.

    Some people, when confronted with a problem, think “I know, I’ll use regular expressions.” Now they have two problems.

    Jamie Zawinski, in comp.lang.emacs

    • If you can use native string functions instead, use them. Regular expressions should be your last port of call.
    • Don’t be lazy, TRY.
    • Read about what regular expressions actually are on Wikipedia.
    • Don’t reinvent the wheel, you may find the solution is already out there, google for it or try the Regular Expression Library.
    • Read Mastering Regular Expressions by Jeffrey Friedl. This book is not only a story, but it’s also a good resource for examples and reference.
    • Testing regular expressions?
      • The Regex Coach is a tool that comes highly recommended for testing your patterns.
      • Try the Regex Powertoy, it’s a fantastic AJAX based matching tool.
      • RegExR – a FREE desktop application for Mac, Linux and Windows.
      • RegexBuddy – Not free, but meant to be good.
      • For Python there’s Kodos.
      • jsregex.com
      • regexpal.com
      • Your favourite scripting language. PHP, Perl, Javascript, TCL, MRC, VBS, almost any of them!
    • Cheat sheets make good references; there are ones at RegExLib.com and ilovejackdaniels.com.
    • Regex in Javascript?
    • In PHP, PCRE regex is quicker than the old ereg functions, but you should always see if you can use quicker native functions such as strncasecmp, strpbrk and stripos instead (see the sketch after this list).
    • For performance reasons it is recommended that you always try to use the native functions instead of regex, especially in PHP.
    • Stop parsing HTML with regexes and read how to parse HTML.
    • When parsing XML in PHP try xml2array, which makes use of the PHP XML functions; for HTML you can try PHP’s DOMDocument, or DOM XML in PHP4. Also try the Simple HTML DOM Parser.
    • Try getting help; if you do, tell us:
      • What goes in?
      • What do you want out?
      • Which language are you using?
      • Have you tried anything yet?
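
    As a minimal sketch of the PHP points above (using stripos where a plain substring check will do, and DOMDocument instead of a regex for HTML); the text, pattern and markup below are just made-up examples:

    <?php
    $text = "I know, I'll use regular expressions.";

    // PCRE match
    if (preg_match('/regular expressions/i', $text)) {
        echo "Matched with preg_match\n";
    }

    // Quicker native alternative for a plain, case-insensitive substring check
    if (stripos($text, 'regular expressions') !== false) {
        echo "Matched with stripos\n";
    }

    // For HTML, use a parser rather than a regex
    $html = '<p><a href="http://www.example.com/">Example</a></p>';
    $dom = new DOMDocument();
    $dom->loadHTML($html);
    foreach ($dom->getElementsByTagName('a') as $link) {
        echo $link->getAttribute('href'), "\n";
    }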

    Hope this helps!

subscribe via RSS