Search engine optimization
"SEO" redirects here. For other uses, see SEO (disambiguation).
Search engine optimization (SEO) is the process of improving the visibility of a website or a web page in a search engine's "natural," or unpaid ("organic" or "algorithmic"), search results. In general, the earlier (or higher ranked on the search results page), and the more frequently a site appears in the results list, the more visitors it will receive from the search engine's users. SEO may target different kinds of search, including image search, local search, video search, academic search, [1] news search, and industry-specific vertical search engines.
As a marketing strategy, SEO considers how search engines work, what people search for, the actual search terms or keywords typed into search engines, and which search engines are preferred by the targeted audience. Optimizing a website may involve editing its content, HTML, and associated coding to both increase its relevance to specific keywords and remove barriers to the indexing activities of search engines. Promoting a site to increase the number of backlinks, or inbound links, is another SEO tactic.
The acronym "SEO" can also refer to "search engine optimizers," a term adopted by an industry of consultants who carry out optimization projects on behalf of clients, and by employees who perform SEO services in-house. Search engine optimizers may offer SEO as a stand-alone service or as part of a broader marketing campaign. Because effective SEO may require changes to a site's HTML source code and content, SEO tactics may be incorporated into website development and design. The term "search engine friendly" may be used to describe website designs, menus, content management systems, images, videos, shopping carts, and other elements that have been optimized for the purpose of search engine exposure.
History
Webmasters and content providers began optimizing sites for search engines in the mid-1990s, as the first search engines were cataloging the early Web. Initially, all webmasters needed to do was submit the address of a page, or URL, to the various engines, which would send a "spider" to "crawl" that page, extract links to other pages from it, and return information found on the page to be indexed. [2] The process involves a search engine spider downloading a page and storing it on the search engine's own server, where a second program, known as an indexer, extracts various information about the page, such as the words it contains and where they are located, as well as any weight for specific words and all links the page contains, which are then placed into a scheduler for crawling at a later date.
Site owners began to recognize the value of having their sites highly ranked and visible in search engine results, creating an opportunity for both white hat and black hat SEO practitioners. According to industry analyst Danny Sullivan, the phrase "search engine optimization" probably came into use in 1997. [3] The first documented use of the term search engine optimization was by John Audette and his company Multimedia Marketing Group, as documented by a web page from the MMG site from August 1997. [4]
Early versions of search algorithms relied on webmaster-provided information such as the keyword meta tag, or index files in engines like ALIWEB. Meta tags provide a guide to each page's content. Using meta data to index pages was found to be less than reliable, however, because the webmaster's choice of keywords in the meta tag could misrepresent the site's actual content. Inaccurate, incomplete, and inconsistent data in meta tags could and did cause pages to rank for irrelevant searches. [5] [unreliable source?] Web content providers also manipulated a number of attributes within the HTML source code of a page in an attempt to rank well in search engines. [6]
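The weakness described above can be illustrated with a minimal sketch of how an early indexer might have read webmaster-supplied keywords. The HTML snippet and the parser class are invented for illustration; real engines of the era did far more than this:

```python
# A sketch of reading the keyword meta tag, the webmaster-controlled
# signal that early search algorithms relied on. Names here are illustrative.
from html.parser import HTMLParser

class MetaKeywordParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.keywords = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        # Only the <meta name="keywords"> tag carries the webmaster's hints.
        if tag == "meta" and attrs.get("name") == "keywords":
            self.keywords = [k.strip() for k in attrs.get("content", "").split(",")]

page = '<html><head><meta name="keywords" content="seo, search, ranking"></head></html>'
parser = MetaKeywordParser()
parser.feed(page)
# parser.keywords now holds the declared keywords, whether or not they
# reflect the page's actual content -- the unreliability noted above.
```

Nothing ties the declared keywords to the page body, which is exactly why engines moved away from trusting them.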
By relying so heavily on factors such as keyword density, which were exclusively within a webmaster's control, early search engines suffered from abuse and ranking manipulation. To provide better results to their users, search engines had to adapt to ensure their results pages showed the most relevant results, rather than unrelated pages stuffed with numerous keywords by unscrupulous webmasters. Since the success and popularity of a search engine is determined by its ability to produce the most relevant results for any given query, allowing those results to be false would turn users toward other search sources. Search engines responded by developing more complex ranking algorithms, taking into account additional factors that were more difficult for webmasters to manipulate. Graduate students at Stanford University, Larry Page and Sergey Brin, developed "Backrub," a search engine that relied on a mathematical algorithm to rate the prominence of web pages. The number calculated by the algorithm, PageRank, is a function of the quantity and strength of inbound links. [7] PageRank estimates the likelihood that a given page will be reached by a web user who randomly surfs the web and follows links from one page to another. In effect, this means that some links are stronger than others, as a page with a higher PageRank is more likely to be reached by the random surfer.
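The random-surfer idea above can be sketched as a simple power iteration: each page repeatedly shares its current score among the pages it links to, with a damping factor modeling the surfer occasionally jumping to a random page. The graph, damping factor, and iteration count below are illustrative assumptions, not Google's actual implementation:

```python
# A minimal PageRank sketch: a page's score is a function of the quantity
# and strength of its inbound links, as described above.
def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {page: 1.0 / n for page in pages}  # start with a uniform score
    for _ in range(iterations):
        # Base score models the random surfer jumping to an arbitrary page.
        new_rank = {page: (1.0 - damping) / n for page in pages}
        for page, outlinks in links.items():
            if not outlinks:
                continue
            # Each page splits its current rank evenly among its outlinks.
            share = damping * rank[page] / len(outlinks)
            for target in outlinks:
                new_rank[target] += share
        rank = new_rank
    return rank

# "a" has the most (and strongest) inbound links, so it scores highest.
example = {"a": ["b"], "b": ["a"], "c": ["a"], "d": ["a", "b"]}
scores = pagerank(example)
```

Note how "c" links out but receives no links, so its score stays near the random-jump floor; this is the sense in which some links are "stronger" than others.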
Page and Brin founded Google in 1998. Google attracted a loyal following among the growing number of Internet users, who liked its simple design. [8] Off-page factors (such as PageRank and hyperlink analysis) were considered alongside on-page factors (such as keyword frequency, meta tags, headings, links, and site structure), enabling Google to avoid the kind of manipulation seen in search engines that considered only on-page factors for their rankings. Although PageRank was more difficult to game, webmasters had already developed link-building tools and schemes to influence the Inktomi search engine, and these methods proved similarly applicable to gaming PageRank. Many sites focused on exchanging, buying, and selling links, often on a massive scale. Some of these schemes, or link farms, involved the creation of thousands of sites for the sole purpose of link spamming. [9]
By 2004, search engines had incorporated a wide range of undisclosed factors into their ranking algorithms to reduce the impact of link manipulation. Google says it ranks sites using more than 200 different signals. [10] The leading search engines, Google, Bing, and Yahoo, do not disclose the algorithms they use to rank pages. SEO service providers, such as Rand Fishkin, Barry Schwartz, Aaron Wall, and Jill Whalen, have studied different approaches to search engine optimization and have published their opinions in online forums and blogs. [11] [12] SEO practitioners may also study patents held by various search engines to gain insight into the algorithms. [13]
In 2005, Google began personalizing search results for each user. Depending on their history of previous searches, Google crafted results for logged-in users. [14] In 2008, Bruce Clay said that "ranking is dead" because of personalized search. It would become meaningless to discuss how a website ranked, because its rank would potentially be different for each user and each search. [15]
In 2007, Google announced a campaign against paid links that transfer PageRank. [16] On June 15, 2009, Google disclosed that it had taken measures to mitigate the effects of PageRank sculpting by use of the nofollow attribute on links. Matt Cutts, a well-known software engineer at Google, announced that Googlebot would no longer treat nofollowed links in the same way, in order to prevent SEO service providers from using nofollow for PageRank sculpting. [17] As a result of this change, the use of nofollow leads to evaporation of PageRank. To avoid this, SEO engineers developed alternative techniques that replace nofollowed tags with obfuscated Javascript and thus permit PageRank sculpting. Additionally, several solutions have been suggested that include the use of frames, Flash, and Javascript. [18]
In December 2009, Google announced it would be using the web search history of all its users in order to populate search results. [19]
Google Instant, real-time search, was introduced in late 2010 in an attempt to make search results more timely and relevant. Historically, site administrators have spent months or even years optimizing a website to increase search rankings. With the growing popularity of social media sites and blogs, the leading engines made changes to their algorithms to allow fresh content to rank quickly within the search results. [20]
In February 2011, Google announced the Panda update, which penalizes websites containing content duplicated from other websites and sources. Historically, websites have copied content from one another and benefited in search engine rankings by engaging in this practice; Google, however, implemented a new system that punishes sites whose content is not unique. [21]
In April 2012, Google launched the Google Penguin update, the goal of which was to penalize websites that used manipulative techniques to improve their rankings on the search engine.