Bing Ranking Up!


Ranking In Bing: What You Need To Know
Google is important, but it isn't the only player out there. According to recent numbers, Microsoft's Bing now powers 30% of all U.S. searches (half from Yahoo, which is Bing-powered, and half from the Bing.com website itself).
It's becoming increasingly important to be sure your site is optimized for Bing.
So what does Bing look for?  Here’s a cheat sheet you can use to optimize your site for Bing.

Links:

  • Bing likes “editorial links” which means links that come from within body content and not just a list of links on a resources page or in a directory.  They do place some value on all links, but the links that help you the most will be the “editorial” style ones.
  • They like to see some relationship between your site and the site linking to it (relevance).
  • They like links from authority sites. (Duh!)

Domain Names:

  • Bing likes domain names that use the keywords.   Ex: if your site is about classic cars, go for a domain name that uses that phrase.  Don’t stuff lots of keywords in your domain name.  Go with the primary keyword phrase for your site.
  • Bing likes older domain names so now is not the time to give up a domain name with good history and start over.

Content

  • Bing tends to favor lots of content.  Smaller sites don’t do as well.
  • They seem to place a higher value on pages that have at least 300 words.
  • Use keywords in your body text.  Don’t overdo it though!
  • Create strong themes on each page by only covering one topic per page.
  • Don’t plagiarize – always either use your own unique content or properly credit the source of the content you are posting (but even with proper credit, you don’t want any engines to view your site as having a lot of duplicate content).
Side note: Here’s Bing’s take on duplicate (redundant) content:  “Basically, redundant content is recognized by the search engine during indexing, and often redundancies are eliminated from the index, or at least they are removed from the search engine results pages (SERPs) for relevant queries.”
  • I’m going to state the obvious here: they don’t like hidden text.
  • Bing really likes to see phrases used on the page exactly as people would search for them (but c'mon, people, be sensible – this isn't license to create a bunch of gibberish pages with phrases that don't flow well and that are filled with typos).
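The 300-word guideline mentioned above is easy to check programmatically. Here's a minimal Python sketch using only the standard library to strip markup and count the remaining visible words (the sample page is illustrative):

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collects visible text, skipping script and style blocks."""
    def __init__(self):
        super().__init__()
        self.parts = []
        self.skip = 0

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self.skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self.skip:
            self.skip -= 1

    def handle_data(self, data):
        if not self.skip:
            self.parts.append(data)

def word_count(html):
    """Return the number of visible words on a page."""
    parser = TextExtractor()
    parser.feed(html)
    return len(" ".join(parser.parts).split())

page = "<html><body><h1>Classic cars</h1><p>Restoring classic cars takes patience.</p></body></html>"
print(word_count(page))  # 7 – well under the 300-word guideline
```

Run this against your real page source and flag anything that comes back under 300.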

Site Structure/Code

  • Make sure your pages are a reasonable size.  They suggest keeping pages with no images under 150 KB.
  • Don’t put important text in images, script or Flash.  They want the “meat” of the page served as plain text.
Bing’s comments on Flash: “If your site uses animation technology, such as Silverlight or Flash, there’s good news and bad news. First the good news: bots are getting better and better at extrapolating text content from these sophisticated presentation technologies. But now for the bad news: it’s still a hit-or-miss game (frankly more miss than hit), and the use of these technologies is ultimately not a good bet for SEOs.”
  • Be very organized in your site structure – they recommend keeping your page “fairly flat” – which they further explain “That is, each webpage should only be from one to three clicks away from the default webpage.”
  • Bing suggests using a "broad-to-specific" flow – start with a general overview on the home page and then funnel into topic-specific pages. In Bing's words: "introduce your content theme, provide basic overview information, and present the information navigation scheme for the site. Thinking in terms of building an organizational chart of content will help you 'bucketize' your content into logical groupings and landing pages."
They go on to say “…keep the content organization closer to the surface. You don’t need to plunge deep as you get into detail. Instead of going vertical, expand your content horizontally. Stay shallow, using many first and second level directories instead of burying your content in deep silos. This pattern of information flow will help users more easily find what they want to see and help the search engine bot crawl the information on your site.”
  • Each page should be accessible by at least one static text link.
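Bing's one-to-three-clicks rule above can be audited with a simple breadth-first search over your site's internal link graph. The sketch below assumes you already have a page-to-links map (the `site` dict here is a hypothetical example):

```python
from collections import deque

def click_depths(links, home="/"):
    """BFS from the home page; returns each reachable page's click depth."""
    depths = {home: 0}
    queue = deque([home])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depths:
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

# hypothetical site map: page -> pages it links to
site = {
    "/": ["/cars/", "/about/"],
    "/cars/": ["/cars/mustang/", "/cars/corvette/"],
    "/cars/mustang/": ["/cars/mustang/1967/"],
}
depths = click_depths(site)
deep = [p for p, d in depths.items() if d > 3]
print(deep)  # [] – every page here is within three clicks of home
```

Any page that never appears in `depths` at all is unreachable by links, which violates the static-text-link rule as well.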

On-Page Optimization

  • Add a Sitemap and register at Webmaster Tools.
  • Use Title tags, Description tags, and Header tags (in hierarchical order), and use the <strong> tag to emphasize keywords in the body content.
  • Rules for Header (<h1>, <h2>, etc.) tags: They suggest you use only one <h1> tag per page. You won't be punished for using more, but you diminish the value of all of them if you do. Use the one <h1> to establish the main topic of the page, with the primary keyword phrase. You can use <h2> and <h3> deeper in the page for your sub-heads and to further support the main topic. Don't exceed 150-200 characters in your Header tags (in other words, don't stuff large chunks of content into them).
  • Bing places a lot of emphasis on Title tags, so be sure to use your primary keyword phrase for each page in the Title tag.
  • Make sure your robots.txt file isn’t blocking the bots or creating any other issues.
  • DESCRIPTIONS IN THE SERPs: Bing and Yahoo sometimes use the text in the first H1 header on the page to supplement or replace what you have in your Meta Title and/or Description tag.
Here is a great little table from Bing that shows the 4 parts of the results on the SERPs and where they pull the info from:
Component | Primary Sources
Title     | <title> tag, Header <h> tags
Snippet   | Meta description tag, page content, description from DMOZ.org
URL       | Page URL
Preview   | Page content, extracted page data, commonly clicked links
Sample of a search result in the SERPs from Bing:
[image: sample Bing search result listing]
  1. Title
  2. Description/Caption/Snippet
  3. URL
  4. This is the arrow that leads to the preview
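The robots.txt advice above can be verified with Python's standard library before you go hunting for deeper crawl problems. A minimal sketch (the rules shown are hypothetical; in practice, check your live yoursite.com/robots.txt):

```python
from urllib.robotparser import RobotFileParser

# a hypothetical robots.txt – substitute your site's actual rules
robots_txt = """\
User-agent: *
Disallow: /private/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# confirm the bots can reach your content pages but not the blocked area
print(rp.can_fetch("bingbot", "/cars/mustang/"))        # True
print(rp.can_fetch("bingbot", "/private/report.html"))  # False
```

If a page you want indexed comes back `False` here, fix the robots.txt before worrying about any other ranking factor.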

Not Ranking Well In Bing?

Rick DeJarnette from Bing’s Webmaster blog has this to say:
“The reasons for poor ranking are nearly as myriad as the number of sites on the Web… And frankly, it could be that those competing sites have seriously invested in search engine optimization (SEO) efforts when you have not. If your page ranking is not where you want it to be, your site may be due for a serious examination, one that looks for problems to solve and uses optimization techniques that are meaningful to both human visitors and search engine bots alike. You may be ready to consider a site review.”

Troubleshooting

Worth checking if you suspect your site is in trouble.
So there ya have it. Review your pages and see how you measure up. Bing is increasing in importance, and getting ahead now is wise.
NOTE: All quotes from Bing come from their Blogs (linked from here: http://www.bing.com/community/b/default.aspx)


Three Easy Steps for Efficient SEO Research

Finding exactly what you’re looking for through Google can be a rather hectic experience. We tend to pick broad, random search terms hoping we’ll find something close to our objective. We then start clicking through the results, then click through the search suggestions, then the related searches, and on and on…until we forget where we started and are at a loss at how to proceed.
This post is aimed at offering a good, actionable system for finding what you need in three definitive steps:

Step 1. Scratch the Surface: Use the Most Obvious Search Term

In this three-step case study on how to best use Google for your research needs, I’ll be using my own real-life experience as an example. One day, I decided to write an article on how businesses use social media to change brand perception.
The first problem I encountered was finding the actual search term. I instinctively went with the most obvious one that first came to mind:
  • (change) brand perception
The problem with generic search terms is that they either yield too many non-content results (company pages and profiles), or they return results which are far removed from what you need.
In my case it was the latter – the returned results were either too general (on how to change brand perception) or contained too many bad examples. What I really needed was at least three strong case studies.
Obviously, you can also try to narrow your initial search query using some more specific words. In my case these were:
  • (change) brand perception case study
  • (change) brand perception example
  • (change) brand perception case study Twitter

What to pay attention to at this stage:
  • Possible terms that would pop up with your main term now and then – these would allow you to narrow down your search.
  • Possible sites that would be worth digging deeper into.
Tools to help:
  • Google Suggest
  • Search Cloudlet

Step 2. Narrow Your Search: Identify and Search within Huge Information Sources

Despite many improvements, Google still has difficulty sifting through a vast resource to identify the most relevant page for your needs. It also typically returns no more than two results per domain, potentially hiding some great results.
My usual way to handle this is to identify 3-5 huge resources relevant to my initial query and use the site: advanced search operator to dig through them more deeply.
In my case these were:
  • site:mashable.com (change) brand perception
  • site:http://www.womma.org/casestudy/ (change) brand perception
  • site:socialmediaexaminer.com (change) brand perception
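Building these site:-restricted queries for a handful of trusted resources is trivial to script. A small sketch (the domains and term are just the ones from this example):

```python
def site_queries(term, domains):
    """Build site:-restricted search queries for a list of trusted resources."""
    return ["site:%s %s" % (domain, term) for domain in domains]

queries = site_queries("brand perception case study",
                       ["mashable.com", "socialmediaexaminer.com"])
for q in queries:
    print(q)
# site:mashable.com brand perception case study
# site:socialmediaexaminer.com brand perception case study
```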
What to pay attention to at this stage:
  • Pay close attention to vocabulary – bloggers may use different words to describe what you have in mind.
Tools to help:
  • Several browser add-ons can make using the site: operator and searching within any site much easier.
  • Some clipping and note-taking tools can make going through content and collecting relevant alternate phrases for your search much easier.
The two steps above should have helped you identify other relevant search terms that fit your needs. You don’t need to discover dozens of them. As a rule, I manage to identify 2-3 additional terms that will be relevant to what I need.
In my case study these were the following search terms:
  • brand crisis management
  • redefine brand


SERPs for SEOs



We SEOs know that once the client signs the SEO contract, the expectations are:
  • Page 1 Position 1 ranking on the search  engines and
  • Loads of target traffic with conversions taking place regularly.
But we also know that when we start working on an SEO campaign, we follow a certain procedure for on-page and off-page factors, and after regular monitoring and measuring of the analytics, we try to achieve the following objectives step by step:
  • Maximize the visibility all over the web
  • Work on the website to make it richer in content and free of HTML errors
  • Solve canonical issues
  • Eliminate duplicate content issues
  • Add more meaning to the images of the  website
  • Work on many other technical aspects of the website like HTTP headers, 404 pages, 301 redirects etc.
  • Help establish an online brand
  • Help establish a good online reputation
The main goal is to achieve targeted traffic, and we all know a high CTR is possible when your website ranks on page 1, preferably in the first three positions. So the SERPs are important. This post on high CTR on Search Engine Watch proves the point with facts and figures.
But lately I have been reading on many SEO blogs that SERPs are not an important metric for ascertaining the success of an SEO campaign. Such a statement is misleading. Now don't get me wrong.
I agree that rankings cannot be guaranteed; every site is different from an SEO perspective, and the time taken to achieve high rankings varies from site to site. A holistic approach is needed when we work on an SEO campaign, and we should be talking about quality web presence if we advocate genuine SEO.
But rankings cannot be ruled out entirely as a factor for ascertaining the long-term success of an SEO campaign. At the end of the day, it is targeted traffic and conversions that generate online business.
So what should be the approach towards the SERPs as a metric?
I repeat: rankings cannot be guaranteed; SEOs can only guarantee their efforts toward achieving them, and results depend on any number of factors. But with experience I can say that if the SEO efforts are in the right direction and purely organic, the possibility of attaining high rankings gradually increases over time. Hence monitoring keyword rankings over the long run is important. It might take anywhere from one to two years, maybe more, for certain highly competitive head terms, but an eye has to be kept on the SERPs too.
Though the SERPs keep changing and should not be considered the primary performance measure of any SEO campaign, the higher you rank in the organic listings, the higher your CTR, and that can lead to increased business, which is the primary goal of any business website. (The article on SEW proves that.)
There has to be a strategy: focus on long-tail keywords and keywords with local intent first, and gradually the pages targeted at head terms also gain the potential to rank, as over time the site earns natural links and gets promoted on social media, improving both the link graph and the social graph.
So the SERPs, though not stable, should be mentioned in the monthly SEO reports, even if the site is on page 18 for a certain keyword.
What is important to note is :
  • Does it improve over a period of time, or
  • Does it remain stable, or
  • Is it pushed from page 18 to page 80 due to an algorithm update?
One of the goals of SEO is to keep improving the site so it adapts to all these "algoquakes," keeping its search engine presence alive and its rankings as high as possible.
So, SERPs are important. SEO does have a wider connotation, but the basic goal of ranking high cannot be ruled out entirely. The goal has not changed; the approach towards achieving it is evolving.
In fact, it would be a great help to all web marketers, especially SEOs, if SERP positions were also displayed in Google Analytics reports next to the keywords.
I know the SERPs keep changing, but ranking is an important reporting field for all SEO reports, as traffic does depend on how high you rank.
Having this in the GA reports would reduce our dependence on third-party software; we could concentrate on just Google Webmaster Tools and GA for analysis.
I suggest two fields for SERPs: one showing the position for the keyword when the click took place, and one showing the current position when the report is accessed.


50+ Tools to Automate Your Link Building

Are you a human link builder? If so, ask yourself this: "if a robot link builder existed, what would I still be able to do that it could not?"
Analyze a complex backlink profile and distinguish quality links from spammy ones? Check. Write a funny personal email that gets someone's attention in the right way? Check. Decide when a phone call might be the best outreach method? Check.
And what could the robot do faster and better than you?
Find every link to a site? Check. Automatically search through SERPs and connect each result to external data? Check. Automatically search for contact information on three different pages and score how closely it matched a person's name? Check. Automatically pre-populate data fields in a CRM? Check.
If you've ever heard the phrase "build on your strengths," the lesson for link building is this: we need to automate as much of the routine "robot work" as possible, and spend more time doing what we're best at: being sentient human link builders.
In this post, we'll look at tools that can help link builders shift their workload to computers as much as humanly possible.
Backlink Data
Let's start with the most basic automation. You need tools to research sites' backlink profiles. These tools crawl the web and build a database of raw data about backlinks.
Each tool provides, at minimum, the ability to look up all the pages linking to a URL or domain; some include detailed information about each link's anchor text, type (text or image), follow status, and linking-page authority, and in some cases the ability to group, sort, search, and filter the results.
  • Majestic SEO: A well-regarded index of link data with information about anchor text, authority, Class C IPs, and relevance, not to mention good sorting and filtering. My only complaint is that their pricing and user interface are a bit confusing.

  • Open Site Explorer: A very user-friendly tool with anchor text data, follow status, and authority. The only downside is the index may miss some links in the "deep web."

  • Yahoo Site Explorer: Known for being relatively fast at finding new links (other indexes are updated monthly), but very limited because it can only return 1,000 links per page or domain and offers no "extra" data such as follow status or filtering capabilities. But it's free! Yahoo also offers an API (Yahoo BOSS) which, according to many, is more current than the Site Explorer website.

  • Google: Yes, their "link:" operator leaves much to be desired, but just because it's incomplete doesn't mean it's useless.

  • Blekko: This new search engine offers tons of free backlink data available from a very deep index.
Site-Level Backlink Analysis
Many tools offer backlink reports at the site or URL level, but are limited to only the data points they have available. So then what do you do if you want to filter a site's backlinks down to only followed inbound links, with toolbar PageRank of at least 5, and no more than 50 outbound links?
Enter site-level backlink analysis tools. These tools augment traditional backlink data with additional metrics, often pulled from one or more backlink data providers.
  • Link Diagnosis: Powered by Yahoo BOSS, Link Diagnosis uses a Firefox extension to pull up to 1,000 links per page and lookup metrics such as the toolbar PageRank of each URL, whether the link actually was found on the page, follow status, anchor text of each link, and aggregate level reporting.

  • BacklinkWatch!: Also powered by Yahoo, BacklinkWatch! pulls the first 1,000 links for a page (the most Yahoo will give up), and appends the number of outbound links on the source page along with any flags they find (nofollow, image links, etc.).

  • AnalyzeBacklinks: A simple, free tool that analyzes backlinks to a page and appends anchor text, the total number of links, outbound links, and the title of the linking page. One feature I like is the ability to flag links that mention a keyword you've selected.

  • SEOBook Link Harvester: Shows backlinks grouped by linking domain, groups them by top level domain (TLD), and provides summary metrics about the number of incoming links and percentage of deep links to the page.

  • SEOBook Back Link Analyzer: A free downloadable tool that pulls backlink data from Google, MSN, and Yahoo, crawls the linking pages, and builds a table of information about each link including follow status, number of outbound links, page title, and more.

  • SearchStatus Plugin for Firefox: A free Firefox extension from iAcquire that pulls the backlinks from Google, Yahoo, and Bing.

  • SEOLink Analysis: Supplements lists of links produced by Google Webmaster Tools and Yahoo Site Explorer with information about each link's PageRank, anchor text, and follow status.

  • WhoLinksToMe: Produces various detailed backlink reports with views by link, anchor text, country, IP, and more. Many charts and graphs to aid in the analysis. Freemium.

  • Many enterprise SEO packages also offer data-rich site level backlink analysis, including BrightEdge, SecondStep, RankAbove's Drive, seoClarity, SEO Diver, SISTRIX Toolbox, and gShift Labs, and link building specific tools such as Advanced Link Manager, Linkdex and Cemper's LinkResearchTools offer powerful backlink analysis features.
SERP-Level Backlink Analysis
It seems like every link builder has a preferred set of data when it comes to competitive analysis. So don't expect any single tool to pull every conceivable piece of data and put them all into same columns you've always used.
You still may find yourself exporting data to Excel and merging with other data sources. But any time saved from manually copying and pasting data into spreadsheets (or hiring and managing people to do so) can be spent on more human, value-added activities.
Anyone without research tools for SERP analysis is at a competitive disadvantage.
Link Prospecting Tools
There are many ways to find link opportunities, and the tools listed below can only really scratch the surface when it comes to the universe of link opportunities that some creativity and insight can find.
  • Query generators are the original link automation tools. These tools take a keyword and automatically create dozens or hundreds of canned searches to find common link opportunity types (e.g. resource pages, guest posts, and directories). Here are a few of the most popular: SoloSEO, Ontolo's Link Query Generator, SEOBook's Link Suggest, BuzzStream's Link Building Query Generator (disclaimer: I co-founded BuzzStream), and Webconf's Backlink Builder. I would strongly caution anyone using a list of other people's queries to weed out queries that don't make sense for them -- don't just head down a link building path because a tool suggested you seek a link on every "inurl:links.html" page in your industry.

  • Link prospecting tools build upon the query generator idea, but automate the task of visiting each page and compiling additional metrics (and in some cases, contact info). This can save link builders time by enabling them to prioritize the prospects with the highest value. But you can't just take everything the tools give you. Plan to review each prospect to assess its appropriateness to your link building campaign. Here are a few of the most popular: Ontolo, Adgooroo's Link Insight, and Advanced Link Manager.

  • Cocitation or "hub finder" tools help you find sites that link to multiple competitors. Some look across all links to your competitors, and some analyze the top ranking sites for a given keyword. The best known offerings in this area are SEOBook's Hub Finder, Adgooroo's Link Insight, Raven's SiteFinder, LinkResearchTools, Linkdex, WordTracker Link Builder, SEODiver, and Shoemoney Tools.

  • Proprietary technique research tools use a combination of their own search queries and analysis rules to generate a list of screened, quality link prospect opportunities. Link Insight is known for integrating many of Eric Ward's (a.k.a., "Link Moses") research methods, though I wouldn't call it an Eric-SaaS just yet. Ontolo offers a number of proprietary searches, but also leaves a fair bit of detail and control in users' hands.

  • Checklist-driven link building tools give users bite-sized link building tasks, such as "Today you should request a link on DMOZ!" (except their suggestions tend to be more clever than that): LotusJump, Hubspot, DIYSEO, and SEOScheduler.
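The query-generator idea described above boils down to expanding one keyword through a list of search "footprints". A minimal sketch (the footprints shown are common generic examples, not any particular tool's list):

```python
# common link-prospecting footprints; extend or prune for your niche
FOOTPRINTS = [
    '"{kw}" intitle:resources',
    '"{kw}" inurl:links',
    '"{kw}" "guest post"',
    '"{kw}" intitle:directory "add url"',
]

def generate_queries(keyword):
    """Expand one keyword into a set of canned prospecting searches."""
    return [footprint.format(kw=keyword) for footprint in FOOTPRINTS]

for query in generate_queries("classic cars"):
    print(query)
# "classic cars" intitle:resources
# "classic cars" inurl:links
# "classic cars" "guest post"
# "classic cars" intitle:directory "add url"
```

As cautioned above, weed out any footprint that doesn't make sense for your industry before chasing its results.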
Next time, I'll cover tools that address contact research, link management (CRM), link outreach management, and link monitoring.
A note about some tools I won't cover: tools that scrape SERPs for sites and automatically extract email addresses and send blast mass emails, tools that automate directory submission, "article marketing," and blog commenting, services that automate blindly placing link-laden content on an unknown network of sites, or tools to automate reciprocal link exchanges. These tools exist and some people use them, but I have yet to find them to be beneficial.
The best strategy for using link building automation tools is to first develop a good process for tracking your link prospecting data and managing your outreach via a structured workflow. Once you have data and process in place, you can start automating some of your routine tasks.
The point of using great tools, whether it's an array of three 24" monitors on your desk, an Aeron chair, or fancy link building tools, is to eliminate energy wasted on low-value activities, work in new ways, and free you up to focus on what you, the human link builder, are uniquely suited to do.