2DaMax Marketing | (800) 564-4898
Contact Us Today!
(800) 564-4898
Search engine optimization (SEO) is the process of growing the quality and quantity of website traffic by increasing the visibility of a website or a web page to users of a web search engine.[1] SEO refers to the improvement of unpaid results (known as "natural" or "organic" results) and excludes direct traffic and the purchase of paid placement.
Additionally, it may target different kinds of searches, including image search, video search, academic search,[2] news search, and industry-specific vertical search engines.
Promoting a site to increase the number of backlinks, or inbound links, is another SEO tactic.
By May 2015, mobile search had surpassed desktop search.[3] As an Internet marketing strategy, SEO considers how search engines work, the computer-programmed algorithms that dictate search engine behavior, what people search for, the actual search terms or keywords typed into search engines, and which search engines are preferred by their targeted audience.
SEO is performed because a website receives more visitors from a search engine when its pages rank higher on the search engine results page (SERP).
These visitors can then be converted into customers.[4] SEO differs from local search engine optimization in that the latter is focused on optimizing a business's online presence so that its web pages will be displayed by search engines when a user enters a local search for its products or services.
The former instead is more focused on national or international searches.
Webmasters and content providers began optimizing websites for search engines in the mid-1990s, as the first search engines were cataloging the early Web.
Initially, all webmasters only needed to submit the address of a page, or URL, to the various engines which would send a web crawler to crawl that page, extract links to other pages from it, and return information found on the page to be indexed.[5] The process involves a search engine spider downloading a page and storing it on the search engine's own server.
A second program, known as an indexer, extracts information about the page, such as the words it contains, where they are located, and any weight for specific words, as well as all links the page contains.
All of this information is then placed into a scheduler for crawling at a later date.
Website owners recognized the value of a high ranking and visibility in search engine results,[6] creating an opportunity for both white hat and black hat SEO practitioners.
According to industry analyst Danny Sullivan, the phrase "search engine optimization" probably came into use in 1997.
Sullivan credits Bruce Clay as one of the first people to popularize the term.[7] On May 2, 2007,[8] Jason Gambert attempted to trademark the term SEO by convincing the Trademark Office in Arizona[9] that SEO is a "process" involving manipulation of keywords and not a "marketing service." Early versions of search algorithms relied on webmaster-provided information such as the keyword meta tag or index files in engines like ALIWEB.
Meta tags provide a guide to each page's content.
Using metadata to index pages was found to be less than reliable, however, because the webmaster's choice of keywords in the meta tag could potentially be an inaccurate representation of the site's actual content.
Inaccurate, incomplete, and inconsistent data in meta tags could and did cause pages to rank for irrelevant searches.[10] Web content providers also manipulated some attributes within the HTML source of a page in an attempt to rank well in search engines.[11] By 1997, search engine designers recognized that webmasters were making efforts to rank well in their search engine, and that some webmasters were even manipulating their rankings in search results by stuffing pages with excessive or irrelevant keywords.
Early search engines, such as AltaVista and Infoseek, adjusted their algorithms to prevent webmasters from manipulating rankings.[12] By relying so heavily on factors such as keyword density, which were exclusively within a webmaster's control, early search engines suffered from abuse and ranking manipulation.
To provide better results to their users, search engines had to adapt to ensure their results pages showed the most relevant search results, rather than unrelated pages stuffed with numerous keywords by unscrupulous webmasters.
This meant moving away from heavy reliance on term density to a more holistic process for scoring semantic signals.[13] Since the success and popularity of a search engine is determined by its ability to produce the most relevant results to any given search, poor quality or irrelevant search results could lead users to find other search sources.
Search engines responded by developing more complex ranking algorithms, taking into account additional factors that were more difficult for webmasters to manipulate.
In 2005, an annual conference, AIRWeb (Adversarial Information Retrieval on the Web), was created to bring together practitioners and researchers concerned with search engine optimization and related topics.[14] Companies that employ overly aggressive techniques can get their client websites banned from the search results.
In 2005, the Wall Street Journal reported on a company, Traffic Power, which allegedly used high-risk techniques and failed to disclose those risks to its clients.[15] Wired magazine reported that the same company sued blogger and SEO Aaron Wall for writing about the ban.[16] Google's Matt Cutts later confirmed that Google did in fact ban Traffic Power and some of its clients.[17] Some search engines have also reached out to the SEO industry, and are frequent sponsors and guests at SEO conferences, webchats, and seminars.
Major search engines provide information and guidelines to help with website optimization.[18][19] Google has a Sitemaps program to help webmasters learn if Google is having any problems indexing their website and also provides data on Google traffic to the website.[20] Bing Webmaster Tools provides a way for webmasters to submit a sitemap and web feeds, allows users to determine the "crawl rate", and lets them track the web pages' index status.
In 2015, it was reported that Google was developing and promoting mobile search as a key feature within future products.
In response, many brands began to take a different approach to their Internet marketing strategies.[21] In 1998, two graduate students at Stanford University, Larry Page and Sergey Brin, developed "Backrub", a search engine that relied on a mathematical algorithm to rate the prominence of web pages.
The number calculated by the algorithm, PageRank, is a function of the quantity and strength of inbound links.[22] PageRank estimates the likelihood that a given page will be reached by a web user who randomly surfs the web, and follows links from one page to another.
In effect, this means that some links are stronger than others, as a higher PageRank page is more likely to be reached by the random web surfer.
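The random-surfer idea can be made concrete with a short power-iteration sketch. The following Python snippet is purely illustrative: the three-page graph, damping factor, and iteration count are assumptions for the example, not Google's actual implementation.

# Minimal sketch of the random-surfer model behind PageRank (illustrative).
# Each page's score approximates the probability that a surfer who follows
# links at random (continuing with probability d) lands on that page.
def pagerank(links, d=0.85, iterations=50):
    """links maps every page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - d) / n for p in pages}
        for page, outlinks in links.items():
            share = rank[page] / len(outlinks) if outlinks else 0.0
            for target in outlinks:
                new_rank[target] += d * share
        rank = new_rank
    return rank

# Hypothetical three-page web: two inbound links make "c" the strongest page.
print(pagerank({"a": ["c"], "b": ["c"], "c": ["a"]}))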
Page and Brin founded Google in 1998.[23] Google attracted a loyal following among the growing number of Internet users, who liked its simple design.[24] Off-page factors (such as PageRank and hyperlink analysis) were considered as well as on-page factors (such as keyword frequency, meta tags, headings, links and site structure) to enable Google to avoid the kind of manipulation seen in search engines that only considered on-page factors for their rankings.
Although PageRank was more difficult to game, webmasters had already developed link building tools and schemes to influence the Inktomi search engine, and these methods proved similarly applicable to gaming PageRank.
Many sites focused on exchanging, buying, and selling links, often on a massive scale.
Some of these schemes, or link farms, involved the creation of thousands of sites for the sole purpose of link spamming.[25] By 2004, search engines had incorporated a wide range of undisclosed factors in their ranking algorithms to reduce the impact of link manipulation.
In June 2007, The New York Times' Saul Hansell stated Google ranks sites using more than 200 different signals.[26] The leading search engines, Google, Bing, and Yahoo, do not disclose the algorithms they use to rank pages.
Some SEO practitioners have studied different approaches to search engine optimization and have shared their personal opinions.[27] Patents related to search engines can provide information to better understand search engines.[28] In 2005, Google began personalizing search results for each user.
Depending on their history of previous searches, Google crafted results for logged-in users.[29] In 2007, Google announced a campaign against paid links that transfer PageRank.[30] On June 15, 2009, Google disclosed that it had taken measures to mitigate the effects of PageRank sculpting by use of the nofollow attribute on links.
Matt Cutts, a well-known software engineer at Google, announced that Googlebot would no longer treat nofollow links in the same way, in order to prevent SEO service providers from using nofollow for PageRank sculpting.[31] As a result of this change, the use of nofollow caused the evaporation of the corresponding PageRank.
To avoid this, SEO engineers developed alternative techniques that replace nofollowed tags with obfuscated JavaScript and thus permit PageRank sculpting, as sketched below.
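As a rough illustration of that workaround, the Python sketch below rewrites a nofollowed anchor into a scripted span that a crawler which does not execute JavaScript would no longer see as a link. The HTML fragments and function name are hypothetical, and the snippet explains the technique rather than endorsing it.

import re

# Illustrative only: turn <a rel="nofollow" href="URL">text</a> into a span
# whose click handler navigates via JavaScript, hiding the link from
# crawlers that do not execute scripts. The pattern is deliberately simple
# and assumes the rel attribute appears before href.
def obfuscate_nofollow(html):
    pattern = r'<a rel="nofollow" href="([^"]+)">([^<]+)</a>'
    def as_scripted_span(match):
        url, text = match.group(1), match.group(2)
        return '<span onclick="location.href=\'%s\'">%s</span>' % (url, text)
    return re.sub(pattern, as_scripted_span, html)

print(obfuscate_nofollow('<a rel="nofollow" href="https://example.com/">ad</a>'))
# -> <span onclick="location.href='https://example.com/'">ad</span>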
Additionally, several solutions have been suggested that include the use of iframes, Flash, and JavaScript.[32] In December 2009, Google announced it would be using the web search history of all its users in order to populate search results.[33] On June 8, 2010, a new web indexing system called Google Caffeine was announced.
Designed to allow users to find news results, forum posts, and other content much sooner after publishing, Google Caffeine was a change to the way Google updated its index, intended to make new content show up in search results more quickly.
According to Carrie Grimes, the software engineer who announced Caffeine for Google, "Caffeine provides 50 percent fresher results for web searches than our last index..."[34] Google Instant, real-time search, was introduced in late 2010 in an attempt to make search results more timely and relevant.
Historically, site administrators have spent months or even years optimizing a website to increase search rankings.
With the growth in popularity of social media sites and blogs, the leading engines made changes to their algorithms to allow fresh content to rank quickly within the search results.[35] In February 2011, Google announced the Panda update, which penalizes websites containing content duplicated from other websites and sources.
Historically websites have copied content from one another and benefited in search engine rankings by engaging in this practice.
However, Google implemented a new system that punishes sites whose content is not unique.[36] The 2012 Google Penguin update attempted to penalize websites that used manipulative techniques to improve their rankings on the search engine.[37] Although Google Penguin has been presented as an algorithm aimed at fighting web spam, it really focuses on spammy links[38] by gauging the quality of the sites the links come from.
The 2013 Google Hummingbird update featured an algorithm change designed to improve Google's natural language processing and semantic understanding of web pages.
Hummingbird's language processing system falls under the newly recognized term of "conversational search", where the system pays more attention to each word in the query in order to match pages to the meaning of the whole query rather than to a few words.[39] With regard to search engine optimization, Hummingbird is intended to resolve issues for content publishers and writers by getting rid of irrelevant content and spam, allowing Google to identify high-quality content and treat those who produce it as "trusted" authors.
In October 2019, Google announced they would start applying BERT models for English language search queries in the US.
Bidirectional Encoder Representations from Transformers (BERT) was another attempt by Google to improve its natural language processing, this time in order to better understand the search queries of its users.[40] In terms of search engine optimization, BERT was intended to connect users more easily to relevant content and increase the quality of traffic coming to websites that rank in the search engine results page.
The leading search engines, such as Google, Bing and Yahoo!, use crawlers to find pages for their algorithmic search results.
Pages that are linked from other search engine indexed pages do not need to be submitted because they are found automatically.
The Yahoo! Directory and DMOZ, two major directories which closed in 2014 and 2017 respectively, both required manual submission and human editorial review.[41] Google offers Google Search Console, for which an XML Sitemap feed can be created and submitted for free to ensure that all pages are found, especially pages that are not discoverable by automatically following links,[42] in addition to its URL submission console.[43] Yahoo! formerly operated a paid submission service that guaranteed crawling for a cost per click;[44] however, this practice was discontinued in 2009.
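For illustration, a minimal XML Sitemap of the kind submitted through Google Search Console can be generated in a few lines of Python; the URLs and dates below are placeholders.

from xml.sax.saxutils import escape

# Sketch: build a minimal sitemap.xml following the sitemaps.org protocol.
def build_sitemap(urls):
    entries = "\n".join(
        "  <url><loc>%s</loc><lastmod>%s</lastmod></url>" % (escape(loc), lastmod)
        for loc, lastmod in urls
    )
    return ('<?xml version="1.0" encoding="UTF-8"?>\n'
            '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
            '%s\n'
            '</urlset>' % entries)

print(build_sitemap([("https://www.example.com/", "2020-01-01"),
                     ("https://www.example.com/contact", "2020-01-15")]))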
Search engine crawlers may look at a number of different factors when crawling a site.
Not every page is indexed by the search engines.
The distance of pages from the root directory of a site may also be a factor in whether or not pages get crawled.[45] Today, most people are searching on Google using a mobile device.[46] In November 2016, Google announced a major change to the way it crawls websites and started to make its index mobile-first, which means the mobile version of a given website becomes the starting point for what Google includes in its index.[47] In May 2019, Google updated the rendering engine of its crawler to be the latest version of Chromium (74 at the time of the announcement).
Google indicated that they would regularly update the Chromium rendering engine to the latest version.[48] In December 2019, Google began updating the User-Agent string of their crawler to reflect the latest Chrome version used by their rendering service.
The delay was to give webmasters time to update any code that responded to particular bot User-Agent strings, as in the sketch below.
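As a hedged illustration of such code, the Python sketch below matches on the stable "Googlebot" token rather than on a full User-Agent string, so it keeps working when the embedded Chrome version changes. The sample string is an assumption modeled on Google's documented evergreen format, with W.X.Y.Z standing in for the Chrome version.

import re

# Sketch: detect Googlebot by its stable token instead of pinning an exact
# User-Agent string, which would break whenever the Chrome version changes.
def is_googlebot(user_agent):
    return re.search(r"Googlebot/\d+\.\d+", user_agent) is not None

ua = ("Mozilla/5.0 AppleWebKit/537.36 (KHTML, like Gecko; compatible; "
      "Googlebot/2.1; +http://www.google.com/bot.html) Chrome/W.X.Y.Z Safari/537.36")
print(is_googlebot(ua))  # True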
Google ran evaluations and felt confident the impact would be minor.[49] To avoid undesirable content in the search indexes, webmasters can instruct spiders not to crawl certain files or directories through the standard robots.txt file in the root directory of the domain.
Additionally, a page can be explicitly excluded from a search engine's database by using a meta tag specific to robots (usually <meta name="robots" content="noindex"> ).
When a search engine visits a site, the robots.txt located in the root directory is the first file crawled.
The robots.txt file is then parsed and will instruct the robot as to which pages are not to be crawled.
As a search engine crawler may keep a cached copy of this file, it may on occasion crawl pages a webmaster does not wish crawled.
Pages typically prevented from being crawled include login specific pages such as shopping carts and user-specific content such as search results from internal searches.
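These crawl rules can be sanity-checked with Python's standard-library robots.txt parser; the disallowed paths below are illustrative examples of cart and internal-search pages.

from urllib.robotparser import RobotFileParser

# Sketch: a robots.txt that keeps crawlers out of cart and internal-search
# pages, checked with the standard-library parser.
rules = """\
User-agent: *
Disallow: /cart/
Disallow: /search
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())
print(parser.can_fetch("*", "https://www.example.com/cart/checkout"))  # False
print(parser.can_fetch("*", "https://www.example.com/about"))          # True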
In March 2007, Google warned webmasters that they should prevent indexing of internal search results because those pages are considered search spam.[50] A variety of methods can increase the prominence of a webpage within the search results.
Cross linking between pages of the same website to provide more links to important pages may improve its visibility.[51] Writing content that includes frequently searched keyword phrases, so as to be relevant to a wide variety of search queries, will tend to increase traffic.[51] Updating content so as to keep search engines crawling back frequently can give additional weight to a site.
Adding relevant keywords to a web page's metadata, including the title tag and meta description, will tend to improve the relevancy of a site's search listings, thus increasing traffic.
URL canonicalization of web pages accessible via multiple URLs, using the canonical link element[52] or via 301 redirects can help make sure links to different versions of the URL all count towards the page's link popularity score.
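A minimal sketch of the 301 approach, assuming a hypothetical canonical host of www.example.com: the Python function below maps common URL variants (http scheme, bare domain, trailing slash) to one canonical form that a server would answer with a 301 redirect.

from urllib.parse import urlsplit, urlunsplit

CANONICAL_HOST = "www.example.com"  # assumed canonical host for the example

# Sketch: decide whether a requested URL should be 301-redirected so that
# links to every variant consolidate on one canonical address.
def canonical_redirect(url):
    parts = urlsplit(url)
    path = parts.path.rstrip("/") or "/"
    canonical = urlunsplit(("https", CANONICAL_HOST, path, parts.query, ""))
    return None if canonical == url else (301, canonical)

print(canonical_redirect("http://example.com/about/"))
# -> (301, 'https://www.example.com/about')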
SEO techniques can be classified into two broad categories: techniques that search engine companies recommend as part of good design ("white hat"), and those techniques of which search engines do not approve ("black hat").
The search engines attempt to minimize the effect of the latter techniques, among them spamdexing.
Industry commentators have classified these methods, and the practitioners who employ them, as either white hat SEO, or black hat SEO.[53] White hats tend to produce results that last a long time, whereas black hats anticipate that their sites may eventually be banned either temporarily or permanently once the search engines discover what they are doing.[54] An SEO technique is considered white hat if it conforms to the search engines' guidelines and involves no deception.
As the search engine guidelines[18][19][55] are not written as a series of rules or commandments, this is an important distinction to note.
White hat SEO is not just about following guidelines but is about ensuring that the content a search engine indexes and subsequently ranks is the same content a user will see.
White hat advice is generally summed up as creating content for users, not for search engines, and then making that content easily accessible to the online "spider" algorithms, rather than attempting to trick the algorithm away from its intended purpose.
White hat SEO is in many ways similar to web development that promotes accessibility,[56] although the two are not identical.
Black hat SEO attempts to improve rankings in ways that are disapproved of by the search engines, or involve deception.
One black hat technique uses hidden text, either as text colored similar to the background, in an invisible div, or positioned off screen.
Another method gives a different page depending on whether the page is being requested by a human visitor or a search engine, a technique known as cloaking.
Another category sometimes used is grey hat SEO.
This is in between the black hat and white hat approaches, where the methods employed avoid the site being penalized but do not go as far as producing the best content for users.
Grey hat SEO is entirely focused on improving search engine rankings.
Search engines may penalize sites they discover using black or grey hat methods, either by reducing their rankings or eliminating their listings from their databases altogether.
Such penalties can be applied either automatically by the search engines' algorithms, or by a manual site review.
One example was the February 2006 Google removal of both BMW Germany and Ricoh Germany for use of deceptive practices.[57] Both companies, however, quickly apologized, fixed the offending pages, and were restored to Google's search engine results page.[58] SEO is not an appropriate strategy for every website, and other Internet marketing strategies can be more effective, such as paid advertising through pay per click (PPC) campaigns, depending on the site operator's goals.
Search engine marketing (SEM) is the practice of designing, running and optimizing search engine ad campaigns.[59] Its difference from SEO is most simply depicted as the difference between paid and unpaid priority ranking in search results.
Its purpose concerns prominence more than relevance; website developers should regard SEM as critically important to visibility, because most users navigate to the primary listings returned by their search.[60] A successful Internet marketing campaign may also depend upon building high-quality web pages to engage and persuade, setting up analytics programs to enable site owners to measure results, and improving a site's conversion rate.[61] In November 2015, Google released a full 160-page version of its Search Quality Rating Guidelines to the public,[62] which revealed a shift in its focus towards "usefulness" and mobile search.
In recent years the mobile market has exploded, overtaking the use of desktops, as shown by StatCounter in October 2016, when it analyzed 2.5 million websites and found that 51.3% of pages were loaded on a mobile device.[63] Google has been one of the companies taking advantage of the popularity of mobile usage by encouraging websites to use its Google Search Console and Mobile-Friendly Test, which let companies check how their website appears in search results and how user-friendly it is on mobile devices.
SEO may generate an adequate return on investment.
However, search engines are not paid for organic search traffic, their algorithms change, and there are no guarantees of continued referrals.
Due to this lack of guarantees and certainty, a business that relies heavily on search engine traffic can suffer major losses if the search engines stop sending visitors.[64] Search engines can change their algorithms, impacting a website's placement, possibly resulting in a serious loss of traffic.
According to Google's CEO, Eric Schmidt, in 2010, Google made over 500 algorithm changes – almost 1.5 per day.[65] It is considered a wise business practice for website operators to liberate themselves from dependence on search engine traffic.[66] In addition to accessibility in terms of web crawlers (addressed above), user web accessibility has become increasingly important for SEO.
Optimization techniques are highly tuned to the dominant search engines in the target market.
The search engines' market shares vary from market to market, as does competition.
In 2003, Danny Sullivan stated that Google represented about 75% of all searches.[67] In markets outside the United States, Google's share is often larger, and Google remains the dominant search engine worldwide as of 2007.[68] As of 2006, Google had an 85–90% market share in Germany.[69] While there were hundreds of SEO firms in the US at that time, there were only about five in Germany.[69] As of June 2008, the market share of Google in the UK was close to 90% according to Hitwise.[70] That market share is achieved in a number of countries.
As of 2009, there are only a few large markets where Google is not the leading search engine.
In most cases, when Google is not leading in a given market, it is lagging behind a local player.
The most notable example markets are China, Japan, South Korea, Russia and the Czech Republic where respectively Baidu, Yahoo! Japan, Naver, Yandex and Seznam are market leaders.
Successful search optimization for international markets may require professional translation of web pages, registration of a domain name with a top level domain in the target market, and web hosting that provides a local IP address.
Otherwise, the fundamental elements of search optimization are essentially the same, regardless of language.[69] On October 17, 2002, SearchKing filed suit in the United States District Court, Western District of Oklahoma, against the search engine Google.
SearchKing's claim was that Google's tactics to prevent spamdexing constituted a tortious interference with contractual relations.
On May 27, 2003, the court granted Google's motion to dismiss the complaint because SearchKing "failed to state a claim upon which relief may be granted."[71][72] In March 2006, KinderStart filed a lawsuit against Google over search engine rankings.
KinderStart's website was removed from Google's index prior to the lawsuit, and the amount of traffic to the site dropped by 70%.
On March 16, 2007, the United States District Court for the Northern District of California (San Jose Division) dismissed KinderStart's complaint without leave to amend, and partially granted Google's motion for Rule 11 sanctions against KinderStart's attorney, requiring him to pay part of Google's legal expenses.[73][74]
Search engine marketing (SEM) is a form of Internet marketing that involves the promotion of websites by increasing their visibility in search engine results pages (SERPs), primarily through paid advertising.[1] SEM may incorporate search engine optimization (SEO), which adjusts or rewrites website content and site architecture to achieve a higher ranking in search engine results pages and to enhance pay per click (PPC) listings.[2] In 2007, U.S. advertisers spent US$24.6 billion on search engine marketing.[3] In Q2 2015, Google (73.7%) and the Yahoo/Bing partnership (26.3%) accounted for almost 100% of U.S. search engine spend.[4] As of 2006, SEM was growing much faster than traditional advertising and even other channels of online marketing.[5] Managing search campaigns is done either directly with the SEM vendor or through an SEM tool provider.
It may also be self-serve or through an advertising agency.
As of October 2016, Google leads the global search engine market with a market share of 89.3%.
Bing comes second with a market share of 4.36%, Yahoo comes third with a market share of 3.3%, and Chinese search engine Baidu is fourth globally with a share of about 0.68%.[6] As the number of sites on the Web increased in the mid-to-late 1990s, search engines started appearing to help people find information quickly.
Search engines developed business models to finance their services, such as pay per click programs offered by Open Text[7] in 1996 and then Goto.com[8] in 1998.
Goto.com later changed its name[9] to Overture in 2001, was purchased by Yahoo! in 2003, and now offers paid search opportunities for advertisers through Yahoo! Search Marketing.
Google also began to offer advertisements on search results pages in 2000 through the Google AdWords program.
By 2007, pay-per-click programs proved to be primary moneymakers[10] for search engines.
In a market dominated by Google, in 2009 Yahoo! and Microsoft announced the intention to forge an alliance.
The Yahoo! & Microsoft Search Alliance eventually received approval from regulators in the US and Europe in February 2010.[11] Search engine optimization consultants expanded their offerings to help businesses learn about and use the advertising opportunities offered by search engines, and new agencies focusing primarily upon marketing and advertising through search engines emerged.
The term "Search engine marketing" was popularized by Danny Sullivan in 2001[12] to cover the spectrum of activities involved in performing SEO, managing paid listings at the search engines, submitting sites to directories, and developing online marketing strategies for businesses, organizations, and individuals.
Search engine marketing uses at least five methods and metrics to optimize websites. Search engine marketing is a way to create and edit a website so that search engines rank it higher than other pages.
It is also focused on keyword marketing and pay-per-click (PPC) advertising.
The technology enables advertisers to bid on specific keywords or phrases and ensures ads appear with the results of search engines.
As this system has developed, prices have risen under heavy competition.
Many advertisers respond by expanding their activities, advertising on more search engines and adding more keywords.
The more advertisers are willing to pay for clicks, the higher the ranking for advertising, which leads to higher traffic.[15] PPC comes at a cost.
For a given keyword, the top position might cost $5 per click while the third position costs $4.50; the third advertiser then earns 10% less than the top advertiser while receiving roughly 50% less traffic.[15] Investors must consider their return on investment when engaging in PPC campaigns.
Buying traffic via PPC will deliver a positive ROI when the total cost-per-click for a single conversion remains below the profit margin.
That way the amount of money spent to generate revenue is below the actual revenue generated.
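That break-even rule can be made concrete with a small worked example; all figures below are hypothetical.

# Worked sketch of the break-even rule above: PPC is profitable while the
# cost per conversion (CPC / conversion rate) stays below the profit earned
# per conversion. All figures are hypothetical.
def ppc_roi(cpc, conversion_rate, profit_per_sale, clicks):
    cost = cpc * clicks
    revenue = profit_per_sale * conversion_rate * clicks
    return (revenue - cost) / cost  # e.g. 0.25 means a 25% return

# $0.50 per click with 2% of clicks converting gives a cost per conversion
# of $0.50 / 0.02 = $25; at $40 profit per sale the campaign is profitable.
print(ppc_roi(cpc=0.50, conversion_rate=0.02, profit_per_sale=40.0, clicks=1000))
# -> approximately 0.6, i.e. a 60% return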
There are many reasons explaining why advertisers choose the SEM strategy.
First, creating an SEM account is easy, and traffic can be built quickly, depending on the degree of competition.
Shoppers who use a search engine to find information tend to trust and focus on the links shown in the results pages.
However, a large number of online sellers do not buy search engine optimization to obtain higher ranking lists of search results but prefer paid links.
A growing number of online publishers are allowing search engines such as Google to crawl content on their pages and place relevant ads on it.[16] From an online seller's point of view, this is an extension of the payment settlement and an additional incentive to invest in paid advertising projects.
As a result, it is virtually impossible for advertisers with limited budgets to maintain the highest rankings in the increasingly competitive search market.
Google's search advertising is one of the Western world's marketing leaders, and search advertising is Google's biggest source of profit.[17] Google's search network is clearly ahead of the Yahoo and Bing networks.
The display of unpaid, organic search results is free, while advertisers pay for each click on an ad in the sponsored search results.
Paid inclusion involves a search engine company charging fees for the inclusion of a website in their results pages.
Also known as sponsored listings, paid inclusion products are provided by most search engine companies either in the main results area or as a separately identified advertising area.
The fee structure is both a filter against superfluous submissions and a revenue generator.
Typically, the fee covers an annual subscription for one webpage, which will automatically be catalogued on a regular basis.
However, some companies are experimenting with non-subscription based fee structures where purchased listings are displayed permanently.
A per-click fee may also apply.
Each search engine is different.
Some sites allow only paid inclusion, although these have had little success.
More frequently, many search engines, like Yahoo!,[18] mix paid inclusion (per-page and per-click fee) with results from web crawling.
Others, like Google (and as of 2006, Ask.com[19][20]), do not let webmasters pay to be in their search engine listing (advertisements are shown separately and labeled as such).
Some detractors of paid inclusion allege that it causes searches to return results based more on the economic standing of the interests of a web site, and less on the relevancy of that site to end-users.
Often the line between pay per click advertising and paid inclusion is debatable.
Some have lobbied for any paid listings to be labeled as an advertisement, while defenders insist they are not actually ads since the webmasters do not control the content of the listing, its ranking, or even whether it is shown to any users.
Another advantage of paid inclusion is that it allows site owners to specify particular schedules for crawling pages.
In the general case, one has no control as to when their page will be crawled or added to a search engine index.
Paid inclusion proves to be particularly useful for cases where pages are dynamically generated and frequently modified.
Paid inclusion is a search engine marketing method in itself, but also a tool of search engine optimization, since experts and firms can test out different approaches to improving ranking and see the results often within a couple of days, instead of waiting weeks or months.
Knowledge gained this way can be used to optimize other web pages, without paying the search engine company.
SEM is the wider discipline that incorporates SEO.
SEM includes both paid search results (using tools like Google AdWords or Bing Ads, formerly known as Microsoft adCenter) and organic search results (SEO).
SEM uses paid advertising with AdWords or Bing Ads, pay-per-click advertising (particularly beneficial for local providers, as it enables potential consumers to contact a company directly with one click), article submissions, advertising, and making sure SEO has been done.
A keyword analysis is performed for both SEO and SEM, but not necessarily at the same time.
SEM and SEO both need to be monitored and updated frequently to reflect evolving best practices.
In some contexts, the term SEM is used exclusively to mean pay per click advertising,[2] particularly in the commercial advertising and marketing communities which have a vested interest in this narrow definition.
Such usage excludes the wider search marketing community that is engaged in other forms of SEM such as search engine optimization and search retargeting.
Creating the link between SEO and PPC represents an integral part of the SEM concept.
Sometimes, especially when separate teams work on SEO and PPC and the efforts are not synced, positive results of aligning their strategies can be lost.
The aim of both SEO and PPC is to maximize visibility in search, and thus their actions to achieve it should be centrally coordinated.
Both teams can benefit from setting shared goals and combined metrics, evaluating data together to determine future strategy or discuss which of the tools works better to get the traffic for selected keywords in the national and local search results.
In this way, search visibility can be increased along with optimizing both conversions and costs.[21] Another part of SEM is social media marketing (SMM).
SMM is a type of marketing that involves exploiting social media to persuade consumers that one company's products and/or services are valuable.[22] Some of the latest theoretical advances include search engine marketing management (SEMM).
SEMM relates to activities including SEO but focuses on return on investment (ROI) management instead of relevant traffic building (as is the case of mainstream SEO).
SEMM also integrates organic SEO, trying to achieve top ranking without using paid means to achieve it, and pay per click SEO.
For example, some of the attention is placed on the web page layout design and how content and information is displayed to the website visitor.
SEO & SEM are two pillars of one marketing job and they both run side by side to produce much better results than focusing on only one pillar.
Paid search advertising has not been without controversy and the issue of how search engines present advertising on their search result pages has been the target of a series of studies and reports[23][24][25] by Consumer Reports WebWatch.
The Federal Trade Commission (FTC) also issued a letter[26] in 2002 about the importance of disclosure of paid advertising on search engines, in response to a complaint from Commercial Alert, a consumer advocacy group with ties to Ralph Nader.
Another ethical controversy associated with search marketing has been the issue of trademark infringement.
The debate as to whether third parties should have the right to bid on their competitors' brand names has been underway for years.
In 2009, Google changed its policy, which formerly prohibited these tactics, allowing third parties to bid on branded terms as long as their landing page in fact provides information on the trademarked term.[27] Though the policy has been changed, this continues to be a source of heated debate.[28] On April 24, 2012, many observers noticed that Google had started to penalize companies that buy links for the purpose of passing PageRank.
That Google update was called Penguin.
Since then, there have been several different Penguin/Panda updates rolled out by Google.
SEM has, however, nothing to do with link buying and focuses on organic SEO and PPC management.
As of October 20, 2014, Google had released three official revisions of their Penguin Update.
In 2013, the Tenth Circuit Court of Appeals held in Lens.com, Inc. v. 1-800 Contacts, Inc. that online contact lens seller Lens.com did not commit trademark infringement when it purchased search advertisements using competitor 1-800 Contacts' federally registered 1800 CONTACTS trademark as a keyword.
In August 2016, the Federal Trade Commission filed an administrative complaint against 1-800 Contacts alleging, among other things, that its trademark enforcement practices in the search engine marketing space have unreasonably restrained competition in violation of the FTC Act.
1-800 Contacts has denied all wrongdoing and appeared before an FTC administrative law judge in April 2017.[29] AdWords is recognized as a web-based advertising tool, since it adopts keywords that can deliver adverts explicitly to web users looking for information about a certain product or service.
It is flexible and provides customizable options like Ad Extensions and access to non-search sites, leveraging the display network to help increase brand awareness.
The service hinges on cost-per-click (CPC) pricing, in which a maximum cost per day for the campaign can be chosen; payment thus applies only when the advert is clicked, as in the arithmetic sketched below.
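A hedged sketch of that budgeting arithmetic, with hypothetical figures: a daily budget cap and a maximum CPC bid together bound how many clicks a campaign can buy per day.

# Sketch: with a fixed daily budget and a maximum cost-per-click bid, the
# campaign can serve at most budget // bid paid clicks per day. Figures
# are hypothetical.
def clicks_per_day(daily_budget, max_cpc):
    return int(daily_budget // max_cpc)

# A $50/day campaign bidding at most $1.25 per click:
print(clicks_per_day(50.00, 1.25))  # at most 40 paid clicks before the cap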
SEM companies have embarked on AdWords projects as a way to publicize their SEM and SEO services.
One of the most successful approaches to the strategy of this project was to focus on making sure that PPC advertising funds were prudently invested.
Moreover, SEM companies have described AdWords as a practical tool for increasing a customer's return on Internet advertising.
The use of conversion tracking and Google Analytics was deemed practical for presenting to clients the performance of their campaigns from click to conversion.
The AdWords program has enabled SEM companies to train their clients on the tool and to deliver better campaign performance.
AdWords campaigns have been credited with growing web traffic for a number of customers' websites by as much as 250% in only nine months.[30] Another way search engine marketing is managed is by contextual advertising.
Here marketers place ads on other sites or portals that carry information relevant to their products, so that the ads enter the field of vision of users who are seeking information from those sites.
A successful SEM plan is the approach to capture the relationships amongst information searchers, businesses, and search engines.
Search engines were not important to some industries in the past, but over recent years the use of search engines for accessing information has become vital to increasing business opportunities.[31] The use of SEM strategic tools for businesses such as tourism can attract potential consumers to view their products, but it can also pose various challenges.[32] These challenges include the competition that companies face within their industry and other sources of information that could draw the attention of online consumers.[31] To meet these challenges, the main objective for businesses applying SEM is to improve and maintain their ranking as high as possible on SERPs so that they can gain visibility.
Therefore, search engines continually adjust and develop their algorithms and the shifting criteria by which web pages are ranked, both to combat search engine misuse and spamming and to supply the most relevant information to searchers.[31] This can strengthen the relationship between information searchers, businesses, and search engines as each comes to understand the marketing strategies used to attract business.