PR 10 Website List
1. W3.org
2. USA.gov
3. Adobe Flash Player
4. Adobe Reader
5. Hhs.gov
6. Europeana.eu
7. miibeian.gov.cn
8. addthis.com
9. go8.edu.au
10. whitehouse.gov
11. facebook.com
12. cnn.com
13. google.com
It delivers the richest library of visitor statistics in the industry and has a highly intuitive, easy-to-use interface. The client application is built as a framework for expansion, complete with an open API, plug-in capability, and a wide range of additional features.
The most popular free web analytics tool available today. It is extremely useful and comprehensive, which explains the popularity!
This tool shows real-time statistics and has its own version for the iPhone.
Here we have a downloadable open-source web analytics tool. It provides you with detailed reports on your website visitors (e.g. the search engines and keywords they used, the language they speak, the most popular pages).
Yahoo provides a free, full-featured enterprise analytics suite, which now gives insights into the demographics and category interests of your website visitors. There are also campaign management features to help you understand each audience and align with their needs.
WordPress presents its users with basic statistics and graphs – an easy-to-use tool for basic analysis.
A free version with basic but quick and useful information about your site.
This platform combines statistical information including visitors, referring URLs and even search engine traffic. It also provides a lot of information for search engine optimization by graphing keyword positions over time.
It is an invisible tracker that will count your blog visits and display other statistics. The tool is extremely easy to use and very useful for bloggers who only need basic information about their blog performance.
This tool presents real-time statistics and alerts you when something “different” is happening on your site. Like Google Analytics, it lets you see the number of clicks every link on your website gets, so you can correct dead spots and improve your site’s traffic flow.
So now you have 10 tools that are made just for analyzing your website performance. Let us know if you can add to the list!
SEO starts with keywords. And if you’re planning to market your site in the search engines, you should know what keywords you want to rank for before you even start building the site. Make sure this is done FIRST.
Here are some other posts that talk about how to properly do keyword research:
It amazes me how many sites load with and without the www in the URL. The problem with this is that it creates an automatic duplicate of your site, and can waste a lot of link value as people link to both versions. Decide which version of your URLs you want to use, then 301 redirect everything else to the preferred version.
Dynamic URLs can cause a lot of problems if not handled right. So rather than going through all of the headache that they cause us SEO-types, just set your site up with good URL rewrites so that you don’t have dynamic URLs in the first place.
More posts about URLs:
Even if a product or service can be found multiple ways on the site, make sure that there is only one unique URL for each product or service your company offers. This helps to eliminate unnecessary duplicate content problems.
You never know when you’re going to want to take a page down and redirect it to something else. The mistake a lot of sites make is that they just take a page down when they don’t need it any more. When this happens you lose the link value that page may have gained while it was live. So do yourself a favor: make sure you can 301 redirect that old page to a new page that can use the juice.
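One common way to keep those redirects manageable is a simple map of retired URLs to their replacements. The paths below are hypothetical; the point is that a removed page should answer with a 301 rather than disappearing:

```python
# Hypothetical map of retired pages to their replacements; each entry
# preserves the link value the old page earned while it was live.
REDIRECTS = {
    "/old-widget": "/widgets/new-widget",
    "/spring-sale-2009": "/current-promotions",
}

def handle_request(path: str) -> tuple[int, str]:
    """Return (status, location-or-path) for an incoming request path."""
    if path in REDIRECTS:
        return (301, REDIRECTS[path])  # permanent redirect, passes link juice
    return (200, path)                 # serve the page normally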
Having a custom 404 page makes it so that if someone lands on a 404 page, they at least know they’ve reached the right site. Without a custom 404 in place, they may just assume the site is down and move on to your competitor’s site.
Here is an example of a custom 404 page:
Alt attributes are very easy to overlook. But if you use them right, they can be another signal to the search engines telling them what a page is about. One quick tip on this one: don’t abuse this attribute by using a keyword phrase on every single bullet point image or stuffing a bunch of keywords into the attribute.
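Because alt attributes are so easy to miss, an automated check can help. A rough sketch with Python's standard-library HTML parser (the image paths are made up for illustration):

```python
from html.parser import HTMLParser

class AltChecker(HTMLParser):
    """Collect <img> tags that are missing a non-empty alt attribute."""
    def __init__(self):
        super().__init__()
        self.missing_alt = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            if not attrs.get("alt"):
                self.missing_alt.append(attrs.get("src", "(no src)"))

page = """
<img src="/img/hero.jpg" alt="Blue widget, front view">
<img src="/img/bullet.gif">
"""
checker = AltChecker()
checker.feed(page)
# checker.missing_alt now lists the images a reviewer should look at
```

Decorative images like bullet points can legitimately have an empty alt; the check just flags candidates for a human to review rather than enforcing a rule.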
Sometimes designers and developers get carried away with the look and feel of the page and forget to include room for text-based content. That’s what the search engines read, so you have to make sure there is a logical place for that content. Ideally, plan on having at least 150-200 words of optimized content on any page you want to rank well.
You should also make sure that your content is structured right. Have one H1 tag at the top of the main content, and then break out other sub topics with H2-H6 tags as appropriate. Make sure to use your keywords in these headings and in the content, but once again don’t overdo it.
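The one-H1-then-subheadings rule can also be checked mechanically. A minimal sketch using the standard-library parser, with a made-up page snippet:

```python
from html.parser import HTMLParser

class HeadingAudit(HTMLParser):
    """Record h1-h6 tags in document order so the structure can be checked."""
    def __init__(self):
        super().__init__()
        self.headings = []

    def handle_starttag(self, tag, attrs):
        if tag in {"h1", "h2", "h3", "h4", "h5", "h6"}:
            self.headings.append(tag)

audit = HeadingAudit()
audit.feed("<h1>Main topic</h1><h2>Subtopic</h2><h2>Another subtopic</h2>")
has_single_h1 = audit.headings.count("h1") == 1  # True for a well-structured page
```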
I think that internal linking is one of the most commonly overlooked things for most sites. In fact, Ken Lyons wrote a great post about it that goes into more detail than I can in this post: Want More Link Juice? Here’s an Easy Way to Get It
A site should use the same title structure throughout the site. Pick your convention and stick with it. A good format to follow is to have a phrase that includes main keywords for the page and describes what the page is about, followed by a separator (- or | are common), and then your brand name. For example, “Professional SEO Services for Organic Website Optimization | SEO.com”. Keep these titles to under 65-70 characters so they don’t get truncated in the search results.
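The title convention above is easy to encode once and reuse site-wide. A sketch, using the article's own example and assuming a 70-character limit; the fallback of dropping the brand when the title runs long is one possible choice, not a rule:

```python
def build_title(page_phrase: str, brand: str = "SEO.com",
                sep: str = " | ", limit: int = 70) -> str:
    """Combine the keyword phrase and brand; prefer the phrase alone if too long."""
    title = f"{page_phrase}{sep}{brand}"
    # If the full title would get truncated in the search results,
    # keep the descriptive phrase and drop the brand suffix.
    return title if len(title) <= limit else page_phrase[:limit]

title = build_title("Professional SEO Services for Organic Website Optimization")
```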
Since most of the search engines can choose to use your meta description as your snippet in the search results, you should have a unique one written for every page. Include the main keywords and a call to action to encourage clicks. DON’T just make this tag a list of keywords.
If you’re using any kind of tracking codes or other things on your site that create duplicate URLs, you’re going to want to be able to include a canonical tag on those pages. Also, depending on how your site is built you may need to include other meta tags like a robots tag and others. Make sure your site’s back end allows for this when necessary.
In case you missed it, social media is a pretty big thing right now. I’m not a big fan of the generic ShareThis button, but you need to have some kind of social media sharing buttons on your products and other important pages. Do some research to decide which social networks are best for your site and then stick with those.
More general information about social media:
If you don’t have any kind of analytics tracking installed, you have no way to tell where your traffic is coming from, what’s working, and lots of other crucial information. Pick a solution and get it installed. Popular ones include:
Make sure that the software you go with will allow you to block your office IP address, track conversions, ecommerce revenues generated through different online sources, and anything else that will help you to understand what is actually affecting your bottom line.
Through Google Webmaster Tools you can learn a lot about how Google sees your site, give Google indications on how to handle certain parameters, submit your XML sitemap, and be notified of problems found with your site. Bing’s Webmaster Center is coming along, so it’s worth it to go ahead and verify that one as well.
The more you follow standards, the easier it will be for someone else to come along later and make changes or modify the site. It’s a real problem when a site’s backend code or database is so complex that it has to be rebuilt later in order for it to be changed.
It only takes a few minutes to do, but once the site is live make sure you create an XML sitemap and submit it to the major search engines through their webmaster tools accounts. It’s even better if you can set this up so that it automatically updates and pings the search engines whenever a change is made.
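A minimal sitemap is simple enough to generate from a list of your URLs. A sketch with Python's standard XML library; the URLs are placeholders:

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls: list[str]) -> str:
    """Build a minimal sitemap.xml document from a list of page URLs."""
    urlset = ET.Element("urlset",
                        xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for url in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url  # each page gets one <loc> entry
    return ET.tostring(urlset, encoding="unicode")

xml_out = build_sitemap(["http://www.example.com/",
                         "http://www.example.com/products"])
```

Real sitemaps often add `<lastmod>` and `<changefreq>` per URL, which fit the same pattern; hooking this into your publish process gives you the automatic updates mentioned above.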
When you create your robots.txt file make sure that you are disallowing any pages or directories that you don’t want the search engines crawling. Standard examples would be login pages, search results pages, and shopping cart pages. You should also include a link to your XML sitemap. Finally, test this file in your Google Webmaster Tools account to make sure it is working correctly.
Here’s a great site that talks more in detail about how to create a robots.txt file: About /robots.txt
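You can also sanity-check a robots.txt file locally before deploying it, using Python's standard-library parser. The file contents below are a hypothetical example for a store that blocks its cart and search-results pages:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt: blocks cart and search-result pages
# and advertises the XML sitemap location.
robots_txt = """\
User-agent: *
Disallow: /cart/
Disallow: /search/
Sitemap: http://www.example.com/sitemap.xml
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Crawlers may fetch product pages but not the cart:
allowed = parser.can_fetch("*", "http://www.example.com/products/widget")
blocked = parser.can_fetch("*", "http://www.example.com/cart/checkout")
```

This catches typos in Disallow rules before they accidentally block pages you want indexed, which is much cheaper to find locally than after a crawl.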
If you follow these 18 guidelines you’ll launch a site that is in great shape as far as SEO is concerned. If you’re an SEO, feel free to add anything else to this list in the comments.