☯☼☯ SEO and Non-SEO (Science-Education-Omnilogy) Forum ☯☼☯
SEO => SEO => Topic started by: SEO on January 19, 2011, 04:59:34 PM
-
About seo history, science and so on. :)
-
The SEO's birthday. When was it "born"?
-
Well, well, well... as far as I know, SEO's birth year is 1997. I'll post more about SEO's history soon.
Viva SEO ! ;D
-
As I promised you, I'm going to post about SEO stuff soon. Here we go-
Search engine optimization (SEO) is the process of improving the visibility of a website or a web page in search engines via the "natural" or un-paid ("organic" or "algorithmic") search results. Other forms of search engine marketing (SEM) target paid listings. In general, the earlier (or higher on the page), and more frequently a site appears in the search results list, the more visitors it will receive from the search engine. SEO may target different kinds of search, including image search, local search, video search and industry-specific vertical search engines. This gives a website web presence.
As an Internet marketing strategy, SEO considers how search engines work and what people search for. Optimizing a website may involve editing its content and HTML and associated coding to both increase its relevance to specific keywords and to remove barriers to the indexing activities of search engines. Promoting a site to increase the number of backlinks, or inbound links, is another SEO tactic.
The acronym "SEO" can refer to "search engine optimizers," a term adopted by an industry of consultants who carry out optimization projects on behalf of clients, and by employees who perform SEO services in-house. Search engine optimizers may offer SEO as a stand-alone service or as a part of a broader marketing campaign. Because effective SEO may require changes to the HTML source code of a site and site content, SEO tactics may be incorporated into website development and design. The term "search engine friendly" may be used to describe website designs, menus, content management systems, images, videos, shopping carts, and other elements that have been optimized for the purpose of search engine exposure.
Another class of techniques, known as black hat SEO or spamdexing, uses methods such as link farms, keyword stuffing and article spinning that degrade both the relevance of search results and the user-experience of search engines. Search engines look for sites that employ these techniques in order to remove them from their indices.
History
Webmasters and content providers began optimizing sites for search engines in the mid-1990s, as the first search engines were cataloging the early Web. Initially, all webmasters needed to do was submit the address of a page, or URL, to the various engines which would send a "spider" to "crawl" that page, extract links to other pages from it, and return information found on the page to be indexed. The process involves a search engine spider downloading a page and storing it on the search engine's own server, where a second program, known as an indexer, extracts various information about the page, such as the words it contains and where these are located, as well as any weight for specific words, and all links the page contains, which are then placed into a scheduler for crawling at a later date.
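The crawl-and-index pipeline described above (download a page, index its words and positions, schedule its links for a later crawl) can be sketched in a few lines of Python. This is a toy illustration only, not how any real engine is implemented; the regex-based tag stripping and the example URL are my own simplifications:

```python
import re
from collections import defaultdict

# In-memory stand-ins for the engine's storage, indexer and scheduler.
page_store = {}                     # url -> raw HTML kept on the engine's server
inverted_index = defaultdict(dict)  # word -> {url: [positions in the page]}
crawl_queue = []                    # links scheduled for crawling at a later date

def index_page(url, html):
    """Store the page, index its words with their positions, schedule its links."""
    page_store[url] = html
    text = re.sub(r"<[^>]+>", " ", html)           # crude tag stripping
    for pos, word in enumerate(text.lower().split()):
        inverted_index[word].setdefault(url, []).append(pos)
    for link in re.findall(r'href="([^"]+)"', html):
        crawl_queue.append(link)                    # the scheduler's to-do list

index_page("http://example.com/",
           '<a href="http://example.com/about">About</a> hello world')
```

A query for "hello" would then look the word up in `inverted_index` and find the page, along with where the word occurs in it.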
Site owners started to recognize the value of having their sites highly ranked and visible in search engine results, creating an opportunity for both white hat and black hat SEO practitioners. According to industry analyst Danny Sullivan, the phrase "search engine optimization" probably came into use in 1997. The first documented use of the term was by John Audette and his company Multimedia Marketing Group, as documented by a page from the MMG site captured in August 1997 by the Internet Wayback Machine (Document Number 19970801004204). The first registered US copyright of a website containing that phrase is by Bruce Clay, effective March 1997 (Document Registration Number TX0005001745, US Library of Congress Copyright Office).
Early versions of search algorithms relied on webmaster-provided information such as the keyword meta tag, or index files in engines like ALIWEB. Meta tags provide a guide to each page's content. Using meta data to index pages was found to be less than reliable, however, because the webmaster's choice of keywords in the meta tag could potentially be an inaccurate representation of the site's actual content. Inaccurate, incomplete, and inconsistent data in meta tags could and did cause pages to rank for irrelevant searches. Web content providers also manipulated a number of attributes within the HTML source of a page in an attempt to rank well in search engines.
By relying so much on factors such as keyword density which were exclusively within a webmaster's control, early search engines suffered from abuse and ranking manipulation. To provide better results to their users, search engines had to adapt to ensure their results pages showed the most relevant search results, rather than unrelated pages stuffed with numerous keywords by unscrupulous webmasters. Since the success and popularity of a search engine is determined by its ability to produce the most relevant results to any given search, allowing those results to be false would turn users to find other search sources. Search engines responded by developing more complex ranking algorithms, taking into account additional factors that were more difficult for webmasters to manipulate.
Graduate students at Stanford University, Larry Page and Sergey Brin, developed "BackRub," a search engine that relied on a mathematical algorithm to rate the prominence of web pages. The number calculated by the algorithm, PageRank, is a function of the quantity and strength of inbound links. PageRank estimates the likelihood that a given page will be reached by a web user who randomly surfs the web, and follows links from one page to another. In effect, this means that some links are stronger than others, as a higher PageRank page is more likely to be reached by the random surfer.
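The random-surfer idea above can be computed with a short power-iteration sketch in plain Python. This is a simplified illustration (no dangling-node handling, and the 0.85 damping factor is just the commonly cited default, not anything Google has confirmed using today):

```python
def pagerank(links, damping=0.85, iterations=50):
    """links: {page: [pages it links to]}. Returns an approximate PageRank per page."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}   # random surfer starts anywhere
    for _ in range(iterations):
        new = {p: (1 - damping) / len(pages) for p in pages}
        for p, outs in links.items():
            for q in outs:                         # each link passes on a share of rank
                new[q] += damping * rank[p] / len(outs)
        rank = new
    return rank

# A links to B and C; B and C both link back to A, so A accumulates the most rank.
r = pagerank({"A": ["B", "C"], "B": ["A"], "C": ["A"]})
```

Note how "A" ends up with the highest score: it receives a full "vote" from both other pages, while B and C each receive only half of A's.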
Page and Brin founded Google in 1998. Google attracted a loyal following among the growing number of Internet users, who liked its simple design. Off-page factors (such as PageRank and hyperlink analysis) were considered as well as on-page factors (such as keyword frequency, meta tags, headings, links and site structure) to enable Google to avoid the kind of manipulation seen in search engines that only considered on-page factors for their rankings. Although PageRank was more difficult to game, webmasters had already developed link building tools and schemes to influence the Inktomi search engine, and these methods proved similarly applicable to gaming PageRank. Many sites focused on exchanging, buying, and selling links, often on a massive scale. Some of these schemes, or link farms, involved the creation of thousands of sites for the sole purpose of link spamming.
By 2004, search engines had incorporated a wide range of undisclosed factors in their ranking algorithms to reduce the impact of link manipulation. Google says it ranks sites using more than 200 different signals. The leading search engines, Google and Yahoo, do not disclose the algorithms they use to rank pages. Notable SEO service providers, such as Rand Fishkin, Barry Schwartz, Aaron Wall and Jill Whalen, have studied different approaches to search engine optimization, and have published their opinions in online forums and blogs. SEO practitioners may also study patents held by various search engines to gain insight into the algorithms.
In 2005 Google began personalizing search results for each user. Depending on their history of previous searches, Google crafted results for logged in users. In 2008, Bruce Clay said that "ranking is dead" because of personalized search. It would become meaningless to discuss how a website ranked, because its rank would potentially be different for each user and each search.
In 2007 Google announced a campaign against paid links that transfer PageRank. On June 15, 2009, Google disclosed that they had taken measures to mitigate the effects of PageRank sculpting by use of the nofollow attribute on links. Matt Cutts, a well-known software engineer at Google, announced that Googlebot would no longer treat nofollowed links in the same way, in order to prevent SEO service providers from using nofollow for PageRank sculpting. As a result of this change, the use of nofollow leads to evaporation of PageRank. To avoid this, SEO engineers developed alternative techniques that replace nofollowed tags with obfuscated JavaScript and thus permit PageRank sculpting. Additionally, several solutions have been suggested that include the use of iframes, Flash and JavaScript.
In December 2009 Google announced it would be using the web search history of all its users in order to populate search results.
Real-time-search was introduced in late 2009 in an attempt to make search results more timely and relevant. Historically site administrators have spent months or even years optimizing a website to increase search rankings. With the growth in popularity of social media sites and blogs the leading engines made changes to their algorithms to allow fresh content to rank quickly within the search results.
http://en.wikipedia.org/wiki/Search_engine_optimization
-
By 1997 search engines recognized that webmasters were making efforts to rank well in their search engines, and that some webmasters were even manipulating their rankings in search results by stuffing pages with excessive or irrelevant keywords. Early search engines, such as Infoseek, adjusted their algorithms in an effort to prevent webmasters from manipulating rankings.
Due to the high marketing value of targeted search results, there is potential for an adversarial relationship between search engines and SEO service providers. In 2005, an annual conference, AIRWeb, Adversarial Information Retrieval on the Web, was created to discuss and minimize the damaging effects of aggressive web content providers.
Companies that employ overly aggressive techniques can get their client websites banned from the search results. In 2005, the Wall Street Journal reported on a company, Traffic Power, which allegedly used high-risk techniques and failed to disclose those risks to its clients. Wired magazine reported that the same company sued blogger and SEO Aaron Wall for writing about the ban. Google's Matt Cutts later confirmed that Google did in fact ban Traffic Power and some of its clients.
Some search engines have also reached out to the SEO industry, and are frequent sponsors and guests at SEO conferences, chats, and seminars. In fact, with the advent of paid inclusion, some search engines now have a vested interest in the health of the optimization community. Major search engines provide information and guidelines to help with site optimization. Google has a Sitemaps program to help webmasters learn if Google is having any problems indexing their website, and also provides data on Google traffic to the website. Google guidelines are a list of suggested practices Google has provided as guidance to webmasters. Yahoo! Site Explorer provides a way for webmasters to submit URLs, determine how many pages are in the Yahoo! index and view link information. Bing Toolbox provides a way for webmasters to submit a sitemap and web feeds, allowing users to determine the crawl rate and how many pages have been indexed by their search engine.
http://en.wikipedia.org/wiki/Search_engine_optimization
-
Thanks,
SEO! :) You posted some cool SEO info over here! :D I'll put some cool info about Alexa in the Alexa board now :D
-
You're welcome, Alexa! SEO is always ready to help. ;D
-
Maybe, if you're reading about SEO, you often ask yourself: "How do search engines select which websites to display first?"
That's the million-dollar question! Since a search engine's primary service is search itself, the quality of its results is perhaps its most valuable asset. Think about it - if they were to disclose their ranking algorithms, then unscrupulous webmasters would invariably use the information to secure top positions for all sorts of keywords, and in the end this would diminish the quality of the results, and Google, Baidu, Yandex, Goso, Bing, etc. would lose their pre-eminence in the field of search. As a result, the algorithms are tightly held secrets!
-
FFA website or FFA site means "Free For All website (site)". These are sites that allow people to submit a link to them. The links are free of cost and you don't need to link back to the page, as is usually required in link exchange programs.
-
I wish I knew all this, man.
-
Yes, I wish, too 8)
-
Hi, SEO masters! ;) Do you know about A/B testing, split testing or bucket testing (they're one and the same)? As SEO masters you should know it, so let me introduce this A/B testing thing to you-
"A/B testing, split testing or bucket testing is a method of marketing testing by which a baseline control sample is compared to a variety of single-variable test samples in order to improve response rates. A classic direct mail tactic, this method has been recently adopted within the interactive space to test tactics such as banner ads, emails and landing pages.
Significant improvements can be seen through testing elements like copy text, layouts, images and colors. However, not all elements produce the same improvements, and by looking at the results from different tests, it is possible to identify those elements that consistently tend to produce the greatest improvements.
Employers of this A/B testing method will distribute multiple samples of a test, including the control, to see which single variable is most effective in increasing a response rate or other desired outcome. The test, in order to be effective, must reach an audience of a sufficient size that there is a reasonable chance of detecting a meaningful difference between the control and other tactics: see Statistical power.
This method is different from multivariate testing, which applies statistical modeling allowing a tester to try multiple variables within the samples distributed.
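The "sufficient size" requirement above can be made concrete with a minimal significance check on conversion rates. This is a sketch of a standard two-proportion z-test using only the Python standard library (the sample numbers are invented for illustration, and the test assumes large samples):

```python
import math

def ab_z_score(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: how many standard errors apart are the two rates?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p = (conv_a + conv_b) / (n_a + n_b)                # pooled conversion rate
    se = math.sqrt(p * (1 - p) * (1 / n_a + 1 / n_b))  # standard error of the difference
    return (p_b - p_a) / se

# Control converted 200 of 10,000 visitors; the variant converted 260 of 10,000.
z = ab_z_score(200, 10_000, 260, 10_000)
# |z| > 1.96 corresponds to p < 0.05 (two-sided), i.e. a "meaningful difference".
```

With smaller samples (say 20 vs 26 conversions out of 1,000 each) the same difference in rates would not clear the 1.96 threshold, which is exactly the statistical-power point made above.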
Companies well-known for using A/B testing
Many companies use the designed experiment approach to making marketing decisions. It is an increasingly common practice as the tools and expertise grows in this area. There are many A/B testing case studies which show that the practice of testing is increasingly becoming popular with small and medium businesses as well.
While it is widely used behind the scenes to maximize profits, the practice occasionally makes it into the spotlight.
-Amazon.com pioneered its use within the web ecommerce space. It also stirred controversy by testing optimal price points.
-BBC.
-Google. One of their top designers, Douglas Bowman, left and spoke out against excessive use of the practice.
-Microsoft
-Playdom (Disney Interactive)
-Zynga
-ebay.com
A/B Testing Resources
There are a handful of free and paid tools that help make A/B testing available to everyone, though some are best suited for large corporations and organizations.
ABTests.com is a website where people upload and analyze A/B tests that they've run on their own sites.
Adobe Omniture - Test&Target allows clients to run A/B and multivariate tests. The main difference is that managers can target tests based on customers' details (web analytics metrics).
Google Website Optimizer (GWO) is a free tool by Google that allows webmasters to split traffic across two or more pages using JavaScript commands. It is a recommended option for users who are just getting started with A/B testing.
Lazzia is a simple tool for A/B testing images. It doesn't require JavaScript and can automatically show the winning image once a trial has finished.
LiveBall is a powerful testing tool created by ion interactive that allows marketers to A/B or multivariate test their web pages. It comes with the option to automatically redirect traffic to the winning page once it has reached statistical significance. With LiveBall there's no need to know code, and there's no need for an advanced degree in mathematics in order to test and optimize web pages.
Optimizely is designed for powerful yet fast and easy A/B testing. Marketing and sales users can create and run experiments without writing any code by using the WYSIWYG editor, while advanced technical users can customize and fine-tune experiments with JavaScript.
Performable, unlike GWO, allows you to create landing pages and do A/B testing without any code or IT help. They have a library of custom templates and an interface to allow you to make your own. They also create a "social profile" of your visitors using information from Facebook, LinkedIn, Twitter and several other networks.
Unbounce is a platform that lets you create new landing pages and perform A/B tests on them. The WYSIWYG editor includes a suite of marketing-focused templates, allowing you to publish pages without any HTML coding or help from IT personnel.
Visual Website Optimizer is a paid alternative to GWO with many advanced features (such as a WYSIWYG editor, heatmap reports and tagless integration) that make it suitable for businesses that want flexibility while creating A/B and multivariate tests.
VITES is a platform that allows companies to test visitor conversion rates across different profiles using server-side techniques.
SumoOptimize is an A/B testing tool that gives users an easy way to manage and monitor their tests through a visual editor.
Other Terms Used
-A/B/N Testing: A/B testing with more than two alternatives ("N" cells)
-A/B/..Z Testing: Same as above
-A/B/A: Only two alternatives, but one is repeated. This enables a quick visual of when the test reaches significance.
-Multivariate Testing: A designed experiment where effects from two or more potential causal factors can be isolated from one another.
-Multivariant Testing: same as above."
Link - http://en.wikipedia.org/wiki/A/B_testing
-
Search engine results page
or SERP
It's time to show you one of the "magic" ;D SEO words and its meaning. I mean SERP
A search engine results page (SERP) is the listing of web pages returned by a search engine in response to a keyword query. The results normally include a list of web pages with titles, a link to the page, and a short description showing where the keywords have matched content within the page. A SERP may refer to a single page of links returned, or to the set of all links returned for a search query.
Query caching
Actually some search engines cache SERPs for frequent searches and display the cached SERP instead of a live SERP to increase the performance of the search engine. The search engine updates the SERPs periodically to account for new pages, and possibly to modify the rankings of pages in the SERP.
SERP refreshing can take several days or weeks which can occasionally cause results to be inaccurate or out of date, and new sites and pages to be completely absent.
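The query-caching behaviour described above can be sketched as a small time-to-live (TTL) cache. This is a toy model of the idea, not any engine's actual design; `run_live_search` is a hypothetical stand-in for the real (expensive) query pipeline, and the one-hour TTL is an invented number:

```python
import time

CACHE_TTL = 3600.0          # refresh cached SERPs after an hour (made-up figure)
_cache = {}                 # query -> (timestamp, results)

def run_live_search(query):
    """Hypothetical stand-in for the real query engine."""
    return [f"result for {query}"]

def cached_serp(query, now=None):
    now = time.time() if now is None else now
    hit = _cache.get(query)
    if hit and now - hit[0] < CACHE_TTL:
        return hit[1]                      # serve the cached SERP (fast path)
    results = run_live_search(query)       # periodic refresh: rerun the query
    _cache[query] = (now, results)
    return results
```

Between refreshes, every repeat of a popular query is served from the cache, which is exactly why a new page can be absent from results for days: it only appears once the cached SERP expires and is rebuilt.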
Different types of results
SERPs of major search engines like Google, Yahoo! and Bing may include different types of listings: contextual, algorithmic or organic search listings, as well as sponsored listings, images, maps, definitions, videos or suggested search refinements.
The major search engines visually differentiate specific content types, such as images, news, and blogs. Many content types have specialized SERP templates and visual enhancements on the main search result page.
Advertising (Sponsored listings)
SERPs may contain advertisements. This is how commercial search engines fund their operations. Common examples of these advertisements are displayed on the right hand side of the page as small classified style ads or directly above the main organic search results on the left.
How SERP entries are generated
Major search engines like Google, Yahoo! and Bing primarily use content contained within the metadata tags of a web page to generate the content that makes up a search snippet. The title tag will be used as the title of the snippet, while the most relevant or useful content of the web page (description tag or page copy) will be used for the description. If the web page is not available, information about the page from DMOZ may be used instead.
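Pulling the title tag and meta description out of a page, the raw material for a snippet, can be sketched with Python's standard-library HTML parser. A toy illustration of the idea only; real snippet generation is far more involved:

```python
from html.parser import HTMLParser

class SnippetParser(HTMLParser):
    """Collect the <title> text and the meta description of a page."""
    def __init__(self):
        super().__init__()
        self.title, self.description, self._in_title = "", "", False

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta" and a.get("name") == "description":
            self.description = a.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

p = SnippetParser()
p.feed('<title>My Page</title>'
       '<meta name="description" content="A short summary.">')
snippet = (p.title, p.description)
```

A page with a clear title and a well-written description tag hands the engine its snippet on a plate, which is why both are basic SEO hygiene.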
SERP tracking
Webmasters use search engine optimization to increase their website's ranking on a specific keyword's SERP. As a result, webmasters often check SERPs to track their search engine optimization progress. To speed up the tracking process, programmers created automated software to track multiple keywords for multiple websites.
http://en.wikipedia.org/wiki/Search_engine_results_page
-
Something about the right spelling of this word. I've seen some websites spelling Search Engine Optimization as "Seo", but that's not right. The right one is "SEO". Remember:
1. SEO = Search Engine Optimization or Search Engine Optimizer.
2. Seo = a Korean name (like other Korean names such as: Li, Pak /Park/, Kim, Cha, etc.)
-
my wish too much all SEO.....
-
Well said, Marcrnandez ! "All SEO". I like it! 8) I'll do my best to show you ALL SEO :D
-
Content farm
In the context of the World Wide Web, a content farm is a company that employs large numbers of often freelance writers to generate large amounts of textual content which is specifically designed to satisfy algorithms for maximal retrieval by automated search engines. Their main goal is to generate advertising revenue.
The articles in content farms often poach from other media sources, leading to disputes over copyright infringement. They are written by human beings but may not be written by a specialist in the area. Proponents of the content farms claim that from a business perspective, traditional journalism is inefficient: stories are chosen by a small group of people that frequently have similar experiences and outlooks. Content farms often commission their writers' work based on analysis of search engine queries that proponents represent as "true market demand", a feature that traditional journalism lacks.
Content farms are criticized for providing relatively low quality content as they maximize profit by producing just "good enough" rather than best possible quality articles. Authors are aware that the quality is not that good. Search engines see content farms as a problem, as they tend to bring the user to the less relevant and lower quality results of the search. Because of the attempt to deliver as much as possible and as cheaply as possible, content farms are called "McDonalds online".
In one of Google's promotional videos for search, the majority of the links available were reported to be produced at content farms.
Content farms contain a huge number of articles. For instance, Demand Media will soon be publishing 1 million items a month, the equivalent of four English-language Wikipedias a year. Big content farms are expensive resources, sold for many millions.
A content farm writer usually gets only a few dollars per article yet produces many articles per day, and may earn enough for a living. A typical content writer is a woman with children, which contrasts with sites expecting voluntary unpaid contributions for the sake of an idea, where the typical writer is an unmarried (single) male.
http://en.wikipedia.org/wiki/Content_farm
-
I like your forum post. I got to know about the history of SEO science on this page
Seo Services (http://www.google.com/services.html)
-
Patelrocky, welcome here, in our SEO and Non-SEO forums! You may post your SEO Services also in our SEO-Social Network and our SEO directory. :D And now I'll go on to write about the SEO science over here. :)
-
White hat and Black hat SEO. Let's learn more about them here.
White or black hat
In recent years, the terms white hat and black hat have been applied to the search engine optimization (SEO) industry. Black hat SEO tactics, such as spamdexing, attempt to redirect search results to particular target pages in a fashion that is against the search engines' terms of service, whereas white hat methods are generally approved by the search engines. White hats tend to produce results that last a long time, whereas black hats anticipate that their sites may eventually be banned, either temporarily or permanently, once the search engines discover what they are doing.
Some of the black hat SEO tactics include: keyword stuffing, hidden text and links, doorway and cloaked pages, link farming and blog comment spam.
White Hat Marketing
White hat marketing applies white hat SEO techniques, also known as ethical SEO. White hat marketing implies that all SEO activities are carried out while conforming to the guidelines, rules and policies of search engines. It is an ethical approach, since site managers abide by the written as well as unwritten rules and guidelines for SEO. Some of these guidelines are:
-Providing relevant keywords that cater for short- and long-tail results
-Updating the content regularly
-Analysing search analytics reports and taking corrective actions as required
-Providing links to other websites as well as requesting other networks to link to this website
Black Hat Marketing
Black hat marketing involves SEO activities that go against the norms of search engines. Hence, black hat marketing is unethical. It is difficult for the search engine alone to detect when black hat SEO is applied. Competitors can play a role by reporting cases of black hat marketing to the search engines, who will in turn ban or penalise the website. Despite the risk of a ban, marketers may still go for black hat marketing because it helps boost a page's position in search results. Examples of black hat marketing are:
-keyword stuffing - keywords can be placed, among other locations, in meta tags, alt tags and comment tags, or as text invisible to human eyes. By overusing the same keywords throughout a web page, the aim is to get the search engine's keyword-reading algorithm to drive the page up in its results.
-doorway and cloaked pages - the website contains pages that are listed in search results, but when entering these pages, users are redirected to other pages. Hence, the search result contents do not match the page displayed to users.
-link farming
-hiding text - hiding text in the page or website is also considered a black hat SEO technique. It is treated as spam, and search engines can ban the website for it.
Gray Hat Marketing
Gray hat marketing lies between white hat marketing and black hat marketing. Here, the site owner takes greater risks than with white hat marketing, by disobeying some of the search engine guidelines, but at the same time limits the disobedience so as not to fall into black hat marketing. For example, gray hat marketing may involve a keyword density higher than required, but not too excessive.
Websites involved in gray hat marketing may face penalties but may be safe from bans.
http://en.wikipedia.org/wiki/White_or_black_hat
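Keyword density, the metric mentioned in the gray hat section above, is simply the share of a page's words taken up by the keyword. Here's a rough sketch of how you could measure it (real engines weigh far more than this, of course):

```python
import re

def keyword_density(text, keyword):
    """Fraction of a page's words made up by `keyword` (case-insensitive)."""
    words = re.findall(r"[a-z0-9]+", text.lower())   # split on non-alphanumerics
    return words.count(keyword.lower()) / len(words) if words else 0.0

# 3 of the 8 words are "shoes": a density of 37.5%, well into stuffing territory.
d = keyword_density("buy shoes online, best shoes, cheap shoes here", "shoes")
```

Stuffing a page until one keyword dominates like this is exactly the pattern engines learned to penalise; gray hat practitioners push the number up only slightly and hope to stay under the radar.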
-
I will write something about the squeeze page in "Internet". I mention it here, since squeeze pages have something in common with SEO (Search Engine Optimisation).
-
Backlink (-s)
Backlinks are incoming links to a website or web page. Inbound links were originally important (prior to the emergence of search engines) as a primary means of web navigation; today their significance lies in search engine optimization (SEO). The number of backlinks is one indication of the popularity or importance of that website or page (for example, this is used by Google to determine the PageRank of a webpage). Outside of SEO, the backlinks of a webpage may be of significant personal, cultural or semantic interest: they indicate who is paying attention to that page.
In basic link terminology, a backlink is any link received by a web node (web page, directory, website, or top level domain) from another web node. Backlinks are also known as incoming links, inbound links, inlinks, and inward links.
Search engine rankings
Search engines often use the number of backlinks that a website has as one of the most important factors for determining that website's search engine ranking, popularity and importance. Google's description of their PageRank system, for instance, notes that Google interprets a link from page A to page B as a vote, by page A, for page B. Knowledge of this form of search engine rankings has fueled a portion of the SEO industry commonly termed linkspam, where a company attempts to place as many inbound links as possible to their site regardless of the context of the originating site.
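The "a link is a vote" counting described above amounts to inverting the link graph. A minimal sketch of that inversion in Python (the domain names are invented examples):

```python
from collections import defaultdict

def backlink_counts(outlinks):
    """Invert an outgoing-link map into inbound-link (backlink) counts per page."""
    counts = defaultdict(int)
    for source, targets in outlinks.items():
        for target in targets:
            counts[target] += 1          # one 'vote' from source to target
    return dict(counts)

# a.com links to b.com and c.com; b.com links to c.com; c.com links to nobody.
graph = {"a.com": ["b.com", "c.com"], "b.com": ["c.com"], "c.com": []}
counts = backlink_counts(graph)
```

Here c.com collects two votes to b.com's one, so under raw backlink counting it would rank higher; the linkspam industry exists precisely because this number can be inflated.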
Websites often employ various techniques (called search engine optimization, usually shortened to SEO) to increase the number of backlinks pointing to their website. Some methods are free for anyone to use, whereas others, like linkbaiting, require quite a bit of planning and marketing to work. Some websites stumble upon "linkbaiting" naturally; the sites that are the first with a tidbit of 'breaking news' about a celebrity are good examples of that. When "linkbait" happens, many websites will link to the 'baiting' website because there is information there that is of extreme interest to a large number of people.
There are several factors that determine the value of a backlink. Backlinks from authoritative sites on a given topic are highly valuable.
If both sites have content geared toward the keyword topic, the backlink is considered relevant and believed to have strong influence on the search engine rankings of the webpage granted the backlink. A backlink represents a favorable 'editorial vote' for the receiving webpage from another granting webpage. Another important factor is the anchor text of the backlink. Anchor text is the descriptive labeling of the hyperlink as it appears on a webpage. Search engine bots (i.e., spiders, crawlers, etc.) examine the anchor text to evaluate how relevant it is to the content on a webpage. Anchor text and webpage content congruency are highly weighted in search engine results page (SERP) rankings of a webpage with respect to any given keyword query by a search engine user.
Increasingly, inbound links are being weighed against link popularity and originating context. This transition is reducing the notion of one link, one vote in SEO, a trend proponents hope will help curb linkspam as a whole.
It should also be noted that building too many backlinks over a short period of time can get a website's ranking penalized, and in extreme cases, the website is de-indexed altogether. Anything above a couple of hundred a day is considered "dangerous".
Technical
When HTML (Hyper Text Markup Language) was designed, there was no explicit mechanism in the design to keep track of backlinks in software, as this carried additional logistical and network overhead.
Some website software internally keeps track of backlinks. Examples of this include most wiki and CMS software.
Most commercial search engines provide a mechanism to determine the number of backlinks they have recorded to a particular web page. For example, Google can be searched using link:wikipedia.org to find the number of pages on the Web pointing to http://wikipedia.org/. To find link information on Yahoo type linkdomain:http://www.wikipedia.org. Google only shows a small fraction of the number of links pointing to a site. It credits many more backlinks than it shows for each website.
Other mechanisms have been developed to track backlinks between disparate webpages controlled by organizations that aren't associated with each other. The most notable example of this is TrackBacks between blogs.
http://en.wikipedia.org/wiki/Backlink
-
"Click here"
"Click here" is a verb phrase that may be used as the anchor text of a hyperlink on a web page. The World Wide Web Consortium, through its Quality Tips for Webmasters, advises web designers to avoid using "click here" for this purpose.
Jakob Nielsen, a leading web usability pundit, says, "Don't use 'click here' or other non-descriptive link text."
Search indexing
Search engines use anchor text to index the content of a linked-to site. For example, a site that is linked with the same anchor text phrase by many other sites may appear towards the top of searches for that phrase. Some bloggers have speculated that using "click here" in lieu of a descriptive name is a poor search engine optimization practice.
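The advice above can be turned into a trivial check. This is a minimal sketch; the list of non-descriptive phrases is an assumption made for the example, not any official standard:

```python
# Flag non-descriptive anchor texts, per the usability advice above.
# The phrase list below is illustrative, not exhaustive.
NON_DESCRIPTIVE = {"click here", "here", "read more", "more", "this link"}

def is_descriptive(anchor_text: str) -> bool:
    """Return False for anchor texts that make no sense out of context."""
    return anchor_text.strip().lower() not in NON_DESCRIPTIVE

print(is_descriptive("click here"))        # False
print(is_descriptive("W3C Quality Tips"))  # True
```

A linter like this could run over the (href, text) pairs a crawler collects and report links whose text carries no meaning on its own.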
Accessibility and device dependence
Screen readers, used by the visually impaired, can read out only the hyperlinks on the page as a quick method of navigation. Usability and accessibility firm Webcredible advises avoiding non-descriptive link text such as "click here" at all costs, as it makes no sense out of context.
Users may want to print web pages for reference. "Click here" is inapplicable on the printed page. For this reason, Tim Berners-Lee, the inventor of the Web, advises web designers to try to avoid references in the text to online aspects.
http://en.wikipedia.org/wiki/Click_here
-
Page speed is one of the ranking factors.
"Slow sites may be penalized by search engines." (http://www.alexa.com)
-
Page speed is one of the ranking factors.
"Slow sites may be penalized by search engines." (http://www.alexa.com)
FOR SURE!
-
Thanks for sharing good SEO information :)
-
Thanks for sharing good SEO information :)
Welcome SEO friend. And you're welcome :) !
-
The Duplicate Content Penalty Myth
Mar 15, 2007 at 10:00am ET by Jill Whalen
http://searchengineland.com/the-duplicate-content-penalty-myth-10741
One thing that has plagued the SEO industry for years has been a lack of consistency when it comes to SEO terms and definitions. One of the most prevalent misnomers being bandied about is the phrase "duplicate content penalty." I’m here to tell you that there is no such thing as a search engine penalty for duplicate content. At least not the way many people believe there is.
Don’t get me wrong; I’m not saying that the search engines like and appreciate duplicate content — they don’t. But they don’t specifically penalize websites that happen to have some duplicate content.
Duplicate content has been and always will be a natural part of the Web. It’s nothing to be afraid of. If your site has some dupe content for whatever reason, you don’t have to lose sleep every night worrying about the wrath of the Google gods. They’re not going to shoot lightning bolts at your site from the sky, nor are they going to banish your entire website from ever showing up when someone searches for what you offer. The duplicate content probably won’t show up in searches, but that’s not the same thing as a penalty.
Let me explain.
The search engines want to index and show to their users (the searchers) as much unique content as algorithmically possible. That’s their job, and they do it quite well considering what they have to work with: spammers using invisible or irrelevant content, technically challenged websites that crawlers can’t easily find, copycat scraper sites that exist only to obtain AdSense clicks, and a whole host of other such nonsense.
There’s no doubt that duplicate content is a problem for search engines. If a searcher is looking for a particular type of product or service and is presented with pages and pages of results that provide the same basic information, then the engine has failed to do its job properly. In order to supply its users with a variety of information on their search query, search engines have created duplicate content "filters" (not penalties) that attempt to weed out the information they already know about. Certainly, if your page is one of those that is filtered, it may very well feel like a penalty to you, but it’s not – it’s a filter.
Search engine penalties are reserved for pages and sites that are purposely attempting to trick the search engines in one form or another. Penalties can be meted out algorithmically when obvious deceptions exist on a page, or they can be personally handed out by a search engineer who discovers an infraction through spam reports and other means. To many people’s surprise, penalties rarely happen to the average website. Most that receive a penalty know exactly what they did to deserve it.
Honestly, the search engines are not out to get you. Matt Cutts isn’t plotting new ways to take food off your table. If you have a page on your site that sells red widgets and another very similar page selling blue widgets, you aren’t going to find your site banished off the face of Google because of this. The worst thing that will happen is that only the red widget page may show up in the search results instead of both pages showing up.
On the other hand, if you’ve created a Mad Libs spam site — i.e., one that uses a pre-written template where specific keyword phrases are substituted out for other ones — the pages in question might get filtered out completely. Not so much because of their dupe content (although that’s part of it), but because it’s search engine spam (low-quality pages with little value to people, created solely for search engine rankings).
The bottom line is that the engines are actively seeking out lousy content and removing it from their main results. If this sounds like your site, don’t be surprised to wake up one day and find you’ve lost some or all of your rankings. It’s time to bite the bullet and use them as PPC landing pages instead. There’s definitely some irony in the fact that those types of pages are welcome in Google if you’re willing to pay for each clickthrough you receive, but those are obvious moneymaker pages, and Google has a right to demand their cut.
Regionalized pages are another duplicate-content "spam" model that has been losing ground with the engines lately. Those consist of hundreds of pages/sites selling the same basic thing, but they are targeted to every city in the US. Unfortunately, there’s no easy answer to how to create high-quality pages that do the same thing.
Suffice it to say that just about any content that is easily created without much human intervention (i.e., automated) is not a great candidate for organic SEO purposes.
Another duplicate-content issue that many are concerned about is the republishing of online articles. Reprinting someone’s article on your site is not going to cause a penalty. At best, your page with the article will show up in a search related to it; at worst, it won’t. No big deal either way.
If your own bylined articles are getting published elsewhere, that’s a good thing. There’s no need for you to provide a different version to other sites or to not allow them to be republished at all. The more sites that host your article, the more chances you will have to build your credibility as well as to gain links back to your site through a short bio at the end of the article. If the site your article is hosted on shows up instead of yours, so be it. There’s nothing wrong with that, as your site can be easily clicked to from your bio; the pros far outweigh the cons. In many cases, Google still shows numerous instances of articles in searches, but even if they eventually show only one version, that’s still okay.
When it comes to duplicate content, the search engines are not penalizing you or thinking that you’re a spammer; they’re simply trying to show some variety in their search results pages.
Jill Whalen is owner of High Rankings, a search engine optimization firm founded in 1995. She speaks and writes regularly on SEO issues and also maintains the High Ranking Forums, where the community of over 10,000 members discusses SEO topics. The 100% Organic column appears Thursdays at Search Engine Land.
Opinions expressed in the article are those of the guest author and not necessarily Search Engine Land.
http://searchengineland.com/the-duplicate-content-penalty-myth-10741
-
Testing your website in multiple browsers
Edit: the link removed, because it shows now "Sorry, we couldn’t find that…
Please try again later or try searching for it." When it comes to testing your website in multiple browsers, there are a lot of websites to help you do so. Just take a look: http://cn.bing.com/search?q=Testing+your+website+in+multiple+browsers&go=Search&qs=n&form=QBRE&sp=-1&pq=testing+your+website+in+multiple+browsers&sc=1-41&sk=&cvid=3F506B0E190E45B18ADD89551C6FB055 8)
-
Authority Sites
What are they?
-
Authority Sites
What are they?
Authority sites are sites that are able to get an extra ranking boost from the search engines. Examples: wikipedia.org, mattcutts.com, seo-forum-seo-luntan.com, etc.
-
Do you know what is a mfa (MFA) site? MFA = "Made For AdSense" or "Made For Ads".
Now you know :)
-
&PWS=0
or PWS=0
Do you know these &PWS=0 (or PWS=0) parameters? They are for turning off Google's personalized results. I suggest a good article about it -
Turning Off Google's Personalized Results With PWS=0. You can read it there:
http://www.seroundtable.com/google-personalized-pws-13224.html
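As a small sketch of how that parameter is attached to a search URL (the helper function name is mine, and whether Google still honors pws=0 is for the linked article to say):

```python
from urllib.parse import urlencode

def google_search_url(query: str, personalized: bool = True) -> str:
    """Build a Google search URL; pws=0 requests non-personalized results."""
    params = {"q": query}
    if not personalized:
        params["pws"] = "0"  # the parameter discussed above
    return "https://www.google.com/search?" + urlencode(params)

print(google_search_url("seo history", personalized=False))
# https://www.google.com/search?q=seo+history&pws=0
```

Appending &pws=0 by hand to any Google results URL achieves the same thing.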
-
Google bowling
If you are learning SEO science you have to know this SEO phenomenon, called Google bowling.
"By studying what types of ranking manipulations a search engine is punishing, a company can provoke a search engine into lowering the ranking of a competitor's website. This practice, known as Google bowling, is often done by purchasing Google bombing services (or other SEO techniques) not for one's own website, but rather for the website of the competitor. The attacker provokes the search company into punishing the "offending" competitor by displaying their page further down in the search results. For victims of Google bowling, it may be difficult to appeal the ranking decrease because Google avoids explaining penalties, preferring not to "educate" real offenders. However if the situation is clear-cut, Google could lift the penalty after submitting a request for reconsideration."http://en.wikipedia.org/wiki/Google_bomb
-
Seo is the process of improving the visibility of a website or a web page in google via the natural or un-paid google search. Other forms of search engine marketing (SEM) target paid listings.
-
It's "SEO", not "Seo". "Seo" is a Korean family name.
-
This sort of industry came into existence in the nineties.
-
I want to write content for SEO firms and content firms. How can I find them?
-
I want to write content for SEO firms and content firms. How can I find them?
Your question is off-topic, because it is not related to this SEO science topic.
-
White hat advertising and marketing means that all SEO activities are performed while conforming to the rules, regulations, and policies of the search engines. It is an ethical guideline under which site managers follow the written, as well as unwritten, rules of SEO.
-
SEO or search engine optimisation is not rocket science, but to some companies this technology sounds like an overly complex process.
-
But your BAN isn't complex. See about it: http://www.seo-forum-seo-luntan.com/else-topics/forum-%28luntan%29-problems/msg13827/#msg13827
-
SEO or search engine optimization is the process of improving website popularity and ranking in the top search engines like Google, Yahoo, MSN, Bing etc. It can help your business website in improving online traffic, if done properly.
-
Search engine optimization (SEO) can help a website or business increase internet traffic when properly applied.
-
Site-wide links
What are site-wide links?
http://www.webopedia.com/TERM/S/sitewide.html
http://forums.seochat.com/link-popularity-43/what-site-wide-links-33741.html
And about how Google handles site-wide links both algorithmically and manually -- http://www.webpronews.com/matt-cutts-on-how-google-handles-site-wide-links-both-algorithmically-and-manually-2012-11
-
Never knew about such a thing as an SEO birthday; never thought of it either..
-
Edit: no spam, please. Write more informative and relevant posts. No "please visit this site; visit this site, please; etc.".
-
Edit: Welcome to our big BAN list, spammer!
Admin
-
SEO or search engine optimisation is not rocket science, but to some companies this technology sounds like an overly complex process.
-
russellester said the same! You repeated/pasted the same, i.e. you're duplicating the content. So, you will get the ban, odiswwyp! Welcome to our big ban list!
-
100 Lessons Learned from 10 Years of SEO
Hello, SEO learners and SEO experts! Something useful for all of you: 100 Lessons Learned from 10 Years of SEO:
http://www.quicksprout.com/2012/07/16/100-lessons-learned-from-10-years-of-seo/
Enjoy! :)
-
:) :)
-
nicolehong, start posting more meaningful content or you'll join our big BAN list.
-
I would like to learn it . ;D
-
You learned it already. You're banned. :D
-
And now SEO is helping us a lot, especially with our business in the online community.
-
hreflang
Try to post ontopic and useful comments, steve18!
Now about the hreflang. A link to Wikipedia: https://en.wikipedia.org/wiki/Hreflang (a good useful article, if you need to know about hreflang).
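For multilingual pages, hreflang annotations are link tags pointing to each language version of the same page. Here is a minimal generator sketch; the URL pattern (language code as a path segment) is an assumption for the example, and x-default marks the fallback page for unmatched languages:

```python
def hreflang_tags(base_url: str, languages: list[str]) -> list[str]:
    """Build <link rel="alternate" hreflang=...> tags for a hypothetical
    site whose language versions live at base_url/<lang>/."""
    tags = [f'<link rel="alternate" hreflang="{lang}" '
            f'href="{base_url}/{lang}/" />' for lang in languages]
    # x-default is the version shown when no language matches the user.
    tags.append(f'<link rel="alternate" hreflang="x-default" '
                f'href="{base_url}/" />')
    return tags

for tag in hreflang_tags("https://example.com", ["en", "zh", "bg"]):
    print(tag)
```

Each language version should list all the others (including itself), so in practice the same set of tags goes into the head of every variant.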
-
SEO is an acronym for "search engine optimization" or "search engine optimizer". In this forum it's also for "Science, Education, Omnilogy" (or/and "Study, Education, Omnilogy").
-
&start=990
Discover YOUR TRUE Competition &start=990
Discover YOUR TRUE Competition &start=990 - http://www.warriorforum.com/main-internet-marketing-discussion-forum/113424-discover-your-true-competition-start-990-a.html
-
What is Ethical Link Building?
1.
What is Ethical Link Building?
Postby punk58 » Wed Oct 10, 2012 5:05 pm
I hear mention of ethical link building and wonder what exactly is this supposed to mean? How do you define ethical link building? What is unethical link building?
Wordpress Plugin Development
2.
Re: What is Ethical Link Building?
Postby edword84 » Fri Oct 12, 2012 10:10 am
Link building done with white hat techniques is called ethical link building. It is beneficial for your site's future.
3.
Re: What is Ethical Link Building?
Postby tonyclains » Tue Oct 16, 2012 5:25 am
Link building is one of the most important parts of search engine optimization. Ethical link building is also known as natural link building. In ethical link building, the webmaster follows the search engines' rules and regulations when getting backlinks for his site.
Tanzania real estate
4.
Re: What is Ethical Link Building?
Postby huarong » Tue Oct 16, 2012 5:46 am
For me, forum posting gives good rankings for keywords and a good number of backlinks. A forum can act as an ideal marketing tool for your business. It helps you learn the avenues for your business through constant interaction with other members of the forum. A forum discussion finds its receptive readers through its content; the hotter the discussion, the more traffic you can create.
pvc plastic pipe
5.
Re: What is Ethical Link Building?
Postby huarong » Tue Oct 16, 2012 6:07 am
Link exchange means that two websites are favoring each other, clearly telling search engines that they do not believe in gaining authority naturally.
pvc plastic pipe
6.
Re: What is Ethical Link Building?
Postby Freddyaiken » Mon Dec 31, 2012 1:17 pm
Ethical link building is one of the best techniques in SEO for getting natural rankings. It is also known as a white hat, or natural, technique.
Planet Infographic
7.
What is Ethical Link Building?
Postby thindseo14 » Wed Jan 02, 2013 8:56 am
Ethical link building is natural link building. It is the best technique in SEO for rankings and traffic; backlinks come to the website quickly.
Foam Machinery
...
-
Best way of fast crawling
1.
Best way of fast crawling
Postby punk58 » Thu Sep 20, 2012 2:13 pm
Hi everybody!!!
Can you experts help me out with the best SEO activities for fast web crawling?
I need my websites to be crawled regularly, and in a very fast way.
Email Marketing India
2.
Re: Best way of fast crawling
Postby RichardSmith » Wed Sep 26, 2012 4:30 pm
Hello to Friends,
You should do following activities for fast crawling:
> Article Promotion
> Social bookmarking on high-PageRank sites
> Blog Commenting
Thanks and Regards
Richard Smith
Reputation Management Services
3.
Re: Best way of fast crawling
Postby hardware-rigging » Sat Sep 29, 2012 6:43 am
Backlinks are very helpful for getting PageRank and for achieving top rankings. But the search engines do not depend only on backlinks; they also look into content, design, and traffic. In my view, backlinks matter mainly for Google compared to other search engines like Bing and Yahoo, which work on different ranking algorithms. Backlinks are important, but they are not the only important thing in SEO; many other things also matter in getting top rankings.
Hardware Rigging
4.
Re: Best way of fast crawling
Postby LiamObraim » Tue Oct 02, 2012 9:10 am
As per my experience with various websites, I suggest you do social bookmarking, directory submission, and forum posting to get crawled quickly by the search engines. These techniques are commonly used for getting crawled.
Ecommerce Seo
5.
Re: Best way of fast crawling
Postby Charls051 » Sat Oct 06, 2012 11:20 am
Social bookmarking, directory submission, forums, and blogs are the best options for getting crawled quickly and seeing initial results.
PSD to HTML CSS
6.
Re: Best way of fast crawling
Postby mobapps » Mon Oct 08, 2012 4:44 am
I suggest making your product or website more popular through quality content marketing. You can boost your website through content marketing, because Google puts more emphasis on quality content.
iPhone Apps Development
7.
Best way of fast crawling
Postby thindseo12 » Fri Oct 12, 2012 5:42 am
Daily submissions to various article sites, directories, and forums are helpful for rankings and traffic. Google crawls fast after daily submissions.
8.
Best way of fast crawling
Postby thindseo14 » Tue Oct 23, 2012 11:21 am
Daily submissions to bookmarking sites, forums, and directories bring rankings fast on Google and make it easy to get indexed. All these activities are performed on the SEO website.
Foam Machinery
...
-
I wish i know all dis mehn.
It's kind of sad, ola. You were one of our earliest members, but you posted here only once. And I'm also sad to see that your forum site no longer exists. I remember some of your posts. I'll share them here with the others who never knew them.
From ola's in 2011: "FIRE GUTS PRESIDENTIAL AMNESTY OFFICE
Early Wednesday morning, the Presidential Amnesty Office, Abuja was burnt causing substantial damage to official documents and property.
The inferno which started at about 1.30 a.m. was suspected to have resulted from a surge in electricity.
The fire in the building which houses the Office of the Special Adviser to the President on Niger Delta, started in one of the offices in the top floor.
It took men of the Federal Fire Service and the Fire Service of the Federal Capital Territory about four and a half hours to put out the fire.
A press statement signed by Mr. Henry Ugholue, head, Media and Publicity of the office quoted the Special Adviser to the President on Niger Delta and the Chief Executive Officer of the Amnesty Programme, Honourable Kingsley Kuku, as assuring all stakeholders and the general public that the incident will not adversely affect service delivery at the office as all sensitive operational documents regarding the Amnesty Programme have been confirmed to be intact.
Kuku, according to the statement further assured that in a matter of days, the physical structures of the burnt offices would be restored."
"COURT ASKED TO STOP APRIL PRESIDENTIAL ELECTION
The Peoples Mandate Party (PMP) and its presidential candidate during the 2007 presidential election, Dr. Arthur Nwankwo, on Wednesday, asked the Court of Appeal, Abuja Division, sitting as the Presidential Election Tribunal to stop the forthcoming presidential elections.
The presidential candidate of PMP had, through his counsel, Nnabuike Edechime, gone to the Court of Appeal seeking to annul the election of Umaru Yar’Adua and Goodluck Jonathan, alleging that the process that brought them to power contravened the electoral laws.
He claimed that there were electoral inconsistencies and irregularities in the said polls and concluded that the best thing for the court to do was to pronounce the election a nullity.
In an application filed by the party on January 18, 2011, Nwankwo and PMP asked the court for an order of interlocutory injunction restraining INEC from conducting any fresh presidential election in Nigeria on April 2011 or any other date.
They further sought an order of interlocutory injunction restraining Jonathan from presenting himself or allowing himself to be presented to INEC as a candidate in any presidential election to be held in Nigeria on any date pending the determination of the petition."
If I have the chance, I'll update it with more.
-
SEOless vs SEO
In some cases SEOless is better than SEO.
-
RankBrain algorithm
Learn (if you still didn't) about the RankBrain algorithm.
RankBrain is a machine learning-based artificial intelligence system, the use of which by Google was confirmed on 26 October 2015. It helps Google process search results and provide more relevant results for users. In a 2015 interview, Google commented that RankBrain was the third most important factor in the ranking algorithm, along with links and content.
If RankBrain sees a word or phrase it isn’t familiar with, the machine can make a guess as to what words or phrases might have a similar meaning and filter the result accordingly, making it more effective at handling never-before-seen search queries.
There are over 200 different ranking factors that make up the ranking algorithm, whose exact functions in the Google algorithm are not fully disclosed. RankBrain appears to interpret user searches in order to find pages that may not contain the exact words used in the search query. When offline, RankBrain is given batches of past searches and learns by matching search results. Once RankBrain's results are verified by Google's team, the system is updated and goes live again.
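The "guessing" step for unfamiliar words is usually explained via word embeddings: words are mapped to vectors, and an unseen query term is matched to the nearest known term. Here is a toy sketch; the three-dimensional vectors are completely made up for the illustration, while real systems learn high-dimensional embeddings from huge corpora:

```python
import math

# Toy word vectors, invented for this example only.
VECTORS = {
    "car":        [0.90, 0.10, 0.00],
    "automobile": [0.85, 0.15, 0.05],
    "banana":     [0.05, 0.90, 0.30],
}

def cosine(a, b):
    """Cosine similarity: 1.0 means same direction, 0.0 means unrelated."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

# Treat "automobile" as the unfamiliar query term and find the closest
# known term by cosine similarity.
query = VECTORS["automobile"]
best = max((w for w in VECTORS if w != "automobile"),
           key=lambda w: cosine(query, VECTORS[w]))
print(best)  # car
```

The query containing "automobile" can then be answered with pages about "car", which is the intuition behind handling never-before-seen queries.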
https://en.wikipedia.org/wiki/RankBrain https://en.wikipedia.org/wiki/Wikipedia:Text_of_Creative_Commons_Attribution-ShareAlike_3.0_Unported_License
-
30 SEO Terms Every Non-SEO Should Know - from Website Magazine
30 SEO Terms Every Non-SEO Should Know - from Website Magazine:
https://www.linkedin.com/pulse/30-seo-terms-every-non-seo-should-know-from-website-el-chammas
-
Almost 95% of newly published pages don’t get to the Top10 within a year. This and more: https://ahrefs.com/blog/how-long-does-it-take-to-rank/ /How long does it take to rank in Google? (A study by Ahrefs)
Tim Soulo February 6, 2017/
-
Hi SEO, What are the new link building methods?
EDIT: The best method is not to spam. I deleted now your spamlink here. Spam us once more and get ban!
-
The best method is "no spam". Next spam = 100% ban! 8)
-
So much informative content about SEO. Great job, experts. Keep us updated on the latest trends and news.
-
Thanks! :):)
-
SEO evergreen content
What is evergreen content in SEO (or SEO evergreen content)? It is content that is always relevant (continually relevant) and does not become dated.
-
What are the SEO footprints (footprints in SEO)
SEO footprints (footprints in SEO) -- An SEO footprint is the imprint you leave on the Internet that can be used to trace your SEO activity across various sites. (It can be used to locate multiple accounts and multiple sites that you or your SEO company own, i.e., those of an SEOer /search engine optimizer/ or his/her SEO company.)
There are good SEO footprints and bad SEO footprints.
-
Information about the Google rich answer box (http://www.seo-forum-seo-luntan.com/seo/what-is-googles-rich-answer-box/).
-
Information about the Google penalty (http://www.seo-forum-seo-luntan.com/seo/what-is-google-penalty/).
-
"Thin content"
What is thin content in SEO? It's low-quality webpages/websites that add little to no value for the reader.
-
Splog = spam blog
Splog means spam blog. More about this term, in Wikipedia: https://en.wikipedia.org/wiki/Spam_blog
-
Google Fred Update
I read -- one of the newest Google updates is the Google 'Fred' update.
Some links about it: https://www.quora.com/What-is-the-latest-Google-Fred-Update-2017, https://searchengineland.com/googles-fred-update-hit-low-value-content-sites-aimed-revenue-helping-users-271165, https://www.seroundtable.com/google-algorithm-ranking-update-23523.html.
-
From an old e-mail (Oct 15, 2010 at 6:34 AM):
"5 Mistakes That Spell Disaster for Your Website
To give your site’s visitors the best possible experience, avoid these five common mistakes:
Stale Content
Content Overload
No Photos
Looking Illegitimate
Being Bland"
:)
-
Technical optimization, the activity of constructing websites and webpages so that search engines can successfully crawl, index, and rank them, is referred to as technical SEO. Website organization, structured data, and web-page performance are all examples of this.
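Structured data usually means a JSON-LD block embedded in the page head. Here is a minimal sketch of generating one with the standard json module; the schema.org Article type is real, but every field value below is a placeholder invented for the example:

```python
import json

# A minimal JSON-LD structured-data block (schema.org Article type).
# All values are placeholders for illustration.
article = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "What is technical SEO?",
    "datePublished": "2023-01-01",
    "author": {"@type": "Person", "name": "Jane Doe"},
}

# Wrap the JSON in the script tag that crawlers look for.
snippet = ('<script type="application/ld+json">\n'
           + json.dumps(article, indent=2)
           + '\n</script>')
print(snippet)
```

Dropping a block like this into a page's <head> is one of the structured-data techniques the post above refers to.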
-
Top 20 search engine crawlers
The top 20 search engine crawlers -- https://softwarequickguide.com/top-20-search-engine-crawler-bots 8)
-
nice content
-
Content is nice but your spam wasn't. Deleted.
Now I will delete your next spam post. Once more spam = ban.
-
Google AdSense First-Party Cookie Controls Are Changing
More: https://www.seroundtable.com/google-adsense-first-party-cookie-controls-38401.html