Sunday, March 21, 2010

Why we need HTTP Compression


HTTP compression isn't something you put in your code. Instead, check with your sys admin to see if it's installed (or can be) on the server that's dishing out your site's pages.
What does it do? HTTP compression accelerates the transmission of pages from server to surfer. It allows for server-side HTML compression so server apps (like Apache or Microsoft's IIS) can compress the source code of your page before sending it out over the wires.

HTML compression works with almost every browser these days, and the server is savvy enough to dish out uncompressed files to the few browsers that can't handle it. Compression alone can shrink a page's download to a fraction - often a third or less - of its original size. (Exclamation mark.)
Other nuggets in the HTTP/1.1 nougat include persistent connections (between server and client) and pipelining (which allows the server to rattle off files without waiting for a client's "uh-huh, got that one, next please" response). Crackin' good stuff, all of it. The only negative thing we can say about HTTP compression is that it's probably already running on your server, seeing how well-adopted it is now (for obvious reasons). Still, if you suspect your hosting company is slow-with-the-program, it couldn't hurt to ask.
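If your host runs Apache 2 with the mod_deflate module, for instance, switching compression on can be as simple as a couple of config lines like these (a hedged sketch - module names and config layout vary from server to server):

# Compress text-based responses before they leave the server
AddOutputFilterByType DEFLATE text/html text/css application/javascript
# Some ancient Netscape 4.x builds choke on compressed CSS and JavaScript,
# so limit them to gzipped HTML only
BrowserMatch ^Mozilla/4 gzip-only-text/html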



Link Prefetching

Link prefetching is a feature in some browsers that makes thrifty use of the browser's idle time by downloading files that aren't on the current page but might be needed a page or two down the road. Don't worry - link prefetching doesn't slow pages down. The extra downloads don't kick in until after the current page has finished loading and the browser doesn't have any pressing matters to attend to.
If you had a photo gallery, for example, with Previous and Next navigation links by each picture, you could add prefetch link tags pointing to both of those destinations. That way, while the user is staring at one amazing travel picture, two other pages are downloading in the background. If the user clicks either the Previous or Next link, that page will already have been downloaded, and the content will display instantaneously. More technically speaking, "Guy clicks the link, see, Bada-bing, Bada-boom, it's the next page, already. Aow!"
Link prefetching isn't automatic - it relies on you, the developer, to encode hints as to what files are likely to be needed next. (Browsers aren't prescient, but good webmasters understand a site's traffic flow.) You can encode prefetch hints in either the HTTP header or the page's HTML. There are a number of allowed variations on how to code a prefetch, but a simple HTML <link> tag with a relation type of "next" is small, easy, and our favorite. Like these:
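(A minimal sketch - the gallery URLs are stand-ins for your own Previous and Next pages; rel="prefetch" is the explicit hint, and prefetch-capable browsers treat rel="next" the same way.)

<link rel="next" href="/gallery/photo09.html">
<link rel="prefetch" href="/gallery/photo07.html">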

Cache In

Network (or Proxy) Caching
We previously discussed how browser-side caches store commonly-used images on the users' hard drives, but it's important to note that similar caches exist all along the highways and byways of the Internet. These "network caches" make websites appear more responsive because information doesn't have to travel nearly as far to reach the user's computer.
Some webmasters are leery of network caches. They worry that remote caches might serve out-of-date versions of their site - an understandable concern, especially for sites like blogs that update frequently. But even with a constantly-updating site, there are images and other pieces of content which don't change all that often. Said content would download a lot faster from a nearby network cache than it would from your server.
Thankfully, you can get a site "dialed in" pretty nicely with just a basic knowledge of cache controls. You can force certain elements to be cached for days on end while keeping other elements from being stored at all. META tags in your document won't cut it; you'll need to configure some HTTP settings on the server to make caching really work for you.
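For example, here's a rough sketch of what those settings might look like in an Apache config, assuming the mod_expires and mod_headers modules are available (the file types and lifetimes are purely illustrative):

ExpiresActive On
# Let rarely-changing images sit in browser and network caches for a month
ExpiresByType image/png "access plus 1 month"
ExpiresByType image/jpeg "access plus 1 month"
# Keep the frequently-updated front page out of caches entirely
<Files "index.html">
    Header set Cache-Control "no-store"
</Files>

The Expires headers tell caches how long they may hang onto a file; the Cache-Control header tells them not to keep a copy at all.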
Every Bit Counts
Alrighty. So you've done the big stuff - dropped the bit depth on every PNG, cranked up the HTTP compression, and taken a (metaphorical) weedwhacker to your old, convoluted table layouts. Yet you're still obsessed with how small, how fast, and how modem-user-friendly you can make your site. Ready to jump into some seriously obsessive-compulsive optimization?
You know those TV commercials where they zoom in on a supposedly "clean" kitchen counter, only to reveal wee anthropomorphic germ-creatures at play?
Well, you can similarly clean every extraneous detail from a site's layout and still have some nasty, nasty cruft living in the source code. What's the point of novel-length meta keyword lists and content tags? C'mon, do you still believe that search engines care about that stuff? Not in this millennium. You'll get better search referrals by thinking carefully about the real content on your pages and building an authoritative site that's linked to widely.
Stripping the <head> section of unneeded meta keyword/author/description content, and likewise junking giant scripts, makes a bigger impact, kilobyte-per-kilobyte, than sacrifices made elsewhere on the page. Having a short <head> to your document ensures the initial chunks of data the user receives contain some "real" content, which gets displayed immediately. That's another notch for "perceived speed" improvements.
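Just to illustrate, a trimmed-down <head> needn't hold much more than this (the title and stylesheet name are invented for the example):

<head>
<title>Travel Gallery</title>
<link rel="stylesheet" type="text/css" href="/style.css">
</head>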
Of course, there are plenty of regular <body> bytes still worth tossing. Start with HTML comments, redundant white space, and returns. Stripping all these invisible items from your source code yields extra kilobytes of space savings on the average.


URL Abbreviation

Ever spot how links on the Yahoo frontdoor are generally just a few characters long? Go to the site and move your mouse over some of the news links near the top. You'll see they all start with http://yahoo.com/s/ and then end in a string of six numbers. Links generally run much longer than that, especially if they include redirect codes or CGI variables. Put enough normal-length links on a page (the Yahoo frontdoor, again, has plenty) and you add a sprinkling of kilobytes (and seconds of download time) to the code.
So what's Yahoo doing with those funny links, anyway? They're abbreviating their URLs, using the mod_rewrite Apache module, so that a link like "/s/882142" redirects to "mysite.com/content/unregistered/News". Implementing this requires getting your hands dirty with some server configuration. Specifically, you need to get mod_rewrite installed and poke around with the srm.conf file. Dirty work for many of us, but the payoff is worth several solid kilobytes on a link-heavy page.
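Here's a rough sketch of how such a rule might look in Apache's configuration (the map file and paths are hypothetical, and on newer Apache setups these directives usually live in httpd.conf or a virtual host block rather than srm.conf):

RewriteEngine On
# A plain-text map pairs each short code with its real destination
RewriteMap shortlinks txt:/usr/local/apache/conf/shortlinks.txt
# Send "/s/882142" and friends off to the long URLs the map returns
RewriteRule ^/s/([0-9]+)$ ${shortlinks:$1} [R,L]

Here shortlinks.txt would contain lines such as "882142 /content/unregistered/News".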



Monday, March 1, 2010

Keyword Effectiveness Index

The Keyword Effectiveness Index (KEI) compares the Count result (the number of times a keyword has appeared in the search data) with the number of competing web pages to pinpoint exactly which keywords are most effective for your campaign.
In a nutshell: Look for the keywords near the top. The higher the KEI, the more popular your keywords are and the less competition they have - which means you have a better chance of getting to the top.
The article below is a much more detailed look at the KEI and why we have decided to use it.


DETAILED EXPLANATION

The KEI is a measure of how effective a keyword is for your web site. The derivation of the formula for KEI is based on three axioms:

1) The KEI for a keyword should increase if its popularity increases. Popularity is defined as the number of times the keyword appears in the search data (the Count figure above). This axiom is self-explanatory.

2) The KEI for a keyword should decrease if it becomes more competitive. Competitiveness is defined as the number of sites which a search engine (e.g. AltaVista) displays when you search for that keyword using an exact match search.

Exact match search means that a search engine searches for only those sites which use the keyword exactly as typed in by the user. It is done by surrounding the phrase with quotation marks, the equivalent of entering:

“beach wedding dresses”


Partial match search means that a search engine also searches for sites which contain the individual words of the keyword but not necessarily occurring together or in the order typed in by the user. It is the equivalent of entering:
beach wedding dresses


Partial match search presents a distorted picture of the competitiveness of a keyword because when you optimize your site for a particular keyword, you are actually competing with sites which have used the keyword exactly as typed in by the user.

So to clarify, competitiveness is defined as the number of sites which a search engine displays when you search for that keyword using an exact match search - that is, with quotes surrounding the term - rather than the number of pages returned when the phrase is entered as a partial match, without quotes.

Note: When you select KEI Analysis, quotes will be added temporarily to each of your search terms for the purposes of the search.


3) If a keyword becomes more popular and more competitive at the same time such that the ratio between its popularity and competitiveness remains the same, its KEI should increase. The rationale behind this axiom requires a more detailed explanation. The best way to do this is to take an example:
Suppose the popularity of a keyword is 4 and AltaVista displays 100 sites for that keyword. Then the ratio between popularity and competitiveness for that keyword is 4/100 = 0.04.
Suppose that both the popularity and the competitiveness of the keyword increase. Assume that the popularity increases to 40 and AltaVista now displays 1000 sites for that keyword. Then the ratio between popularity and competitiveness for that keyword is 40/1000 = 0.04.

Hence, the keyword has the same ratio between popularity and competitiveness as before. However, as is obvious, the keyword would be far more attractive in the second case. If the popularity is only 4, there's hardly any point in spending time trying to optimize your site for it even though you have a bigger chance of ending up in the top 30 since there are only 100 sites which are competing for a top 30 position. Each hit is no doubt important, but from a cost-benefit angle, the keyword is hardly a good choice. However, when the popularity increases to 40, the keyword becomes more attractive even though its competitiveness increases. Although it is now that much more difficult to get a top 30 ranking, spending time in trying to do so is worthwhile from the cost benefit viewpoint.
A good KEI must satisfy all 3 axioms. Let P denote the popularity of the keyword and C the competitiveness.
The formula that we have chosen is KEI = P^2/C, i.e. KEI is the square of the popularity of the keyword divided by its competitiveness. This formula satisfies all 3 axioms:

i) If P increases, P^2 increases and hence KEI increases. Hence, Axiom 1 is satisfied.
ii) If C increases, KEI decreases and hence, Axiom 2 is satisfied.
iii) If P and C both increase such that P/C is the same as before, KEI increases, since KEI can be written as
KEI = P^2/C = (P/C) * P. Since P/C remains the same and P increases, KEI must increase. Hence, Axiom 3 is satisfied. Returning to the earlier example, the first keyword scores 4^2/100 = 0.16 while the second scores 40^2/1000 = 1.6, so the KEI rises tenfold even though the P/C ratio stays at 0.04.
Note that the formula for KEI is not unique. In fact, this is one of the nice things about the KEI. If, instead of using 2, you use any power of P greater than 1, the resultant formula will also satisfy the 3 axioms. For example, P^1.5/C and P^3/C both satisfy the 3 axioms. The exact power of P that you choose depends on how much emphasis you want to give to the popularity of a keyword vis-a-vis its competitiveness. The higher the power of P in the formula, the greater the emphasis on popularity. If you are very confident about your search engine positioning skills, choose a higher value for the power of P. If you are not that confident, choose a lower value (but the power should still be more than 1). Thus, the KEI can be adapted to your skill level! Confused as to which power you should choose? Stick with 2. It maintains a nice balance between popularity and competitiveness.
 