1. Poor Navigation
If visitors come to your site and can’t get around easily because the navigation is poorly set up, they won’t engage with it.
“Poor engagement statistics paired with ‘crawlability’ issues and other technical issues are all indications of low authority to Google,” says Ashley M. Orndorff, director of marketing, ParadoxLabs. “If your site is considered irrelevant and is not deemed useful to visitors, it’s not going to rank in the search engines. Search engines are businesses too, and their business is showing the most relevant resources to their users.”
2. Images That Are Not Search Friendly
Many website designs today emphasize stunning visuals without taking into account how those visuals can adversely impact search rankings, according to Salman Aslam, CMO, Omnicore.
“A lot of people are using images with beautiful fonts and strong colors to make the page appealing, but to Google it’s just an image,” Aslam says. By using a combination of Web fonts, HTML and CSS, it’s possible “to retain the beauty and achieve good SEO by creating all of the text elements within a banner as ‘live text.’”
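One way to sketch Aslam’s approach: layer live, crawlable text over a background image and let a Web font supply the typography. The class name, font and copy below are placeholders, not taken from the article:

    <div class="banner" style="background-image: url('banner.jpg');">
      <!-- Live text: indexable by Google, styled to match the design -->
      <h1 style="font-family: 'Open Sans', sans-serif; color: #fff;">
        Handcrafted Furniture, Built to Last
      </h1>
    </div>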
3. Duplicate Content
Almost all of the SEO professionals we queried cited duplicate content as a top technical concern. Simply put, duplicate content is any content that is “appreciably similar” or exactly the same as content that resides on your site, according to Google Webmaster Tools.
“Google’s crawlers must cover a lot of ground,” says Michael Stricker, U.S. marketing director for SEMrush. “Google can’t consume all of the data, especially when one considers that Google must revisit each page again and again to find changes or new material. Anything that slows Google’s discovery or crawling of the Web is unwelcome. Dynamically created websites that create Web pages on the fly from databases are frequently misconfigured from an SEO point of view. These sites may create lots of pages, or URLs, that contain basically the same content, over and over.”
Other sources of duplicate content include the use of both “plain” and secure protocol URLs (HTTP and HTTPS); no expressed preference for a www.domain.com versus domain.com (without the www); blog tags; and syndicated RSS feeds.
Duplicate content can also result from common content management system (CMS) functionalities, including sorting parameters, according to Johnny Ewton, Web analyst for Delegator.com.
The remedy is to crawl your site looking for duplications and apply “crawl directives” to inform Google of the relative value of multiple URLs, Stricker says. You can use “robots.txt” (a file that allows you to control how Google’s bots crawl and index your public Web pages) to tell Google the specific folders and directories that are not worth crawling.
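A minimal robots.txt along those lines might look like the following sketch; the folder names are illustrative only, and anything you want ranked must of course remain crawlable:

    User-agent: *
    # Checkout and internal-search pages add no search value
    Disallow: /cart/
    Disallow: /search/
    # Printer-friendly pages duplicate the originals
    Disallow: /print/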
It’s also a good idea to tell Google which of several URLs to prefer for its index by applying the rel="canonical" link element to point to the preferred URL. Canonical tags can help with duplicate content issues because they tell search engines that one page is a duplicate of another, as well as which of the duplicate pages to consider the primary one for indexing by Google’s bots, says Scott Benson, founder and president, Benson SEO.
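For example, if the same product page is reachable both with and without a sorting parameter, the parameterized version can point at the preferred URL from its <head> (the URLs here are hypothetical):

    <!-- On https://www.example.com/widgets?sort=price -->
    <link rel="canonical" href="https://www.example.com/widgets" />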
International sites that target multiple countries with content in a variety of languages can also end up with a lot of duplicate content, according to Matt Naeger, executive vice president, digital strategy for Merkle. In this scenario, Naeger recommends using rel="alternate" hreflang annotations within the <head> of every page to identify the language and regional targeting of each alternate version of the content. Using IP detection to serve the correct language and default currency for a page is another solution.
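A sketch of those annotations for a page with U.S. English and German alternates, using placeholder URLs, would appear in the <head> of every version of the page:

    <link rel="alternate" hreflang="en-us" href="https://www.example.com/en-us/" />
    <link rel="alternate" hreflang="de-de" href="https://www.example.com/de-de/" />
    <link rel="alternate" hreflang="x-default" href="https://www.example.com/" />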
A common duplicate content issue occurs when one site has either a URL beginning with “www” or a URL that doesn’t contain “www,” according to Ramon Khan, online marketing director, National Air Warehouse. Thankfully, there’s an easy fix.
“Try to type in your URL with the non-www URL and see if it goes to the www version, then try the opposite,” Khan says. “If both work without either one redirecting, you are not properly set up. If so, go to your Google Webmaster Tools. Go to Settings and then Site Settings. See if you have specified a version you prefer. If you’re not sure, get a professional to assist you in determining which version to set up, and keep using that going forward.”
Similarly, by default most websites end up with multiple versions of the homepage, reached through various URLs. Having multiple versions of the homepage can cause a lot of duplicate content issues, and it means any link equity the site receives is spread across the different URLs, says Colin Cheng, marketing manager of MintTwist.
You can fix this issue “by choosing one URL that you want as your main URL,” according to Steven Weldler, VP, online marketing for CardCash. “This is entirely a matter of preference, but once you choose one, stick with it. All the other URLs should automatically point to the main URL using a 301 redirect.”
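On an Apache server, one common way to enforce the chosen main URL is a site-wide 301 in the .htaccess file. This sketch assumes the www version was picked and that mod_rewrite is available:

    RewriteEngine On
    # Permanently redirect non-www requests to the www version
    RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
    RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]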
4. Shady Link Building
Google’s Penguin updates put sites that engage in questionable link building practices in its crosshairs. Link building can help a business “see significant gains in Web traffic, but it also introduces a level of risk,” according to Justin Anderson, CEO of HKSEO.us.
Unnatural, spam-like, irrelevant and “black-hat,” or “just plain bad,” backlinks can cause your site to take a hit in the rankings, adds Mike Waller, owner of SEO Zones, Inc. Backlinks today “should be done at a natural pace, should be diverse and varied and look and appear natural,” he says. “The links you place on your site that point out to authority sites are important, and cross-linking between pages is also important to help the Google crawlers get deep into your site.”
If the Penguin updates hurt your site, you may need to disavow questionable backlinks and, where possible, ask the linking sites’ webmasters to remove them.
5. Poor Mobile Experience
If your website offers a poor user experience on smartphones and tablets and is slow to load on mobile devices, visitors will likely click away, upping your site’s bounce rate. “It’s important to make sure your site is lean and loads fast, as that’s important on mobile,” noted Matt Cutts, Google’s head of Web spam, at a search conference in 2013.
Some companies divert users to separate mobile sites, but doing so can cause problems, according to SEMrush’s Stricker. For example, using a mobile subdomain (such as http://m.domain.com) can split your link equity, raise concerns about diverting traffic from the original URL without informing the user and offering options, and increase resource consumption and maintenance, he notes.
“Google can become suspicious of sites that present different content to different devices, depending on how it is accomplished,” Stricker says.
Responsive design (when a website displays automatically and appropriately for both mobile and desktop devices) “provides an experience customized to the device, yet the content is the same for all users,” Stricker says. Thus, it can improve secondary signals that Google takes into account for search rankings, including page visits, time spent on a page, visit duration and bounce rates.
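At its simplest, a responsive page pairs a viewport meta tag with CSS media queries so that a single URL serves every device; the 600px breakpoint below is an arbitrary example:

    <meta name="viewport" content="width=device-width, initial-scale=1" />
    <style>
      .sidebar { width: 30%; float: left; }
      /* On narrow screens, stack the sidebar instead of floating it */
      @media (max-width: 600px) {
        .sidebar { width: 100%; float: none; }
      }
    </style>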
6. Improper Redirects
Sometimes a Web page, or an entire site, needs to be moved to a different URL. A 301 redirect is “the best way to ensure that users and search engines are directed to the correct page,” according to Google.
“If you’ve ever rebuilt your website without enlisting a reputable SEO, then there is a good chance that links and URLs from your old site aren’t properly set up to connect to your new website,” says Ricky Shockley, search marketing specialist for Web Success Agency. “Your Webmaster should set up 301 redirects from old pages to the new ones,” to help users find the old site pages and ensure those pages pass “link juice” to your current site.
You should locate any 404 “not found” errors on your site and use 301s to direct users to the correct pages, adds Greenlane Search Marketing’s Sebald. “Google Webmaster Tools makes it easy to find your 404s, so you can 301-redirect them to more relevant pages. On big sites this is a daunting task but a very useful one.”
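Individual moved pages can then be mapped one to one. In Apache’s .htaccess, a pair of such rules might look like this (both paths are hypothetical):

    # Map old URLs from the 404 report to their new equivalents
    Redirect 301 /old-services.html /services
    Redirect 301 /2013/widgets-guide /widgets-guide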
7. Local Search and Structured Data Markup
If you aren’t taking advantage of local search data or Structured Data Markup, you’re missing big opportunities, Ewton says.
“Google recognizes local search intent better than ever, and sites that ensure they have a presence on all the local search data providers, such as Yelp, Foursquare, Facebook, Bing and Yellow Pages, can see boosts in local searches within their immediate city scope,” he says.
“For local SEO, the most common issue I find with clients is with their location-specific Web pages,” says Jason Squardo, executive vice president of optimization and founding member of ZOG Digital. “Many times, businesses don’t have them at all. By creating a separate Web page for each location, businesses can more effectively leverage local SEO, which is geared toward providing local results to consumers looking for information about local businesses specifically. This is also becoming more important as the number of mobile-device searches continues to increase.”
Taking advantage of Structured Data Markup can help your site’s search results through “rich snippets.” An example of Structured Data Markup at work could be a list of upcoming events at a nightclub, positioned directly underneath the main search result listing for that nightclub. Though rich snippets and Structured Data Markup may not affect search result rankings, they can help improve clickthrough rates, according to Oleg Korneitchouk, director of digital marketing and development for SmartSites.
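Structured Data Markup can take several forms; one format Google accepts is JSON-LD using the schema.org vocabulary. A sketch for a single nightclub event, with every name, date and address invented for illustration, might look like this in the page’s HTML:

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Event",
      "name": "Friday Night Live",
      "startDate": "2015-06-12T21:00",
      "location": {
        "@type": "Place",
        "name": "Example Nightclub",
        "address": "123 Main St, Springfield"
      }
    }
    </script>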
8. Messy URLs
CMS and blog platforms sometimes create awkward URLs for new content. For example, you may end up with a page that has “index.php?p=283581” at the end of the URL. Such “messy URLs can hurt your trust and credibility with search engines and users, leading to decreased clickthrough rates,” says Shockley. “Clean up those messy URLs to include a keyword that explains what the page is about, such as ‘dentist.com/dentures.’ Make sure to set up proper 301 redirects on the old URLs.”
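Because the page ID lives in the query string, a plain Redirect directive won’t match it. On Apache, a mod_rewrite rule keyed to the query string can handle it; the clean path below reuses the article’s “dentures” example:

    RewriteEngine On
    # Send the old parameterized URL to its clean, keyword-rich replacement
    RewriteCond %{QUERY_STRING} ^p=283581$
    # The trailing "?" drops the old query string from the new URL
    RewriteRule ^index\.php$ /dentures? [R=301,L]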
“SEO-friendly URLs contain keywords and are easy to read and understand for both search engines and users,” says Mike Mothner, founder and CEO of search marketing agency Wpromote. “Since it’s important to make your site as easy to index and rank as possible, it’s crucial to have SEO-friendly URLs.”
Depending on your site’s CMS, you may be able to make “link templates” that can be applied across your entire site, he adds.
9. Too Much Flash
Though Flash is less common on today’s websites, it’s still around — and it can still hinder search engine robots that are trying to index your site’s content.
“If the search engine bots can’t read or understand your website, they’ll have a hard time trying to figure out what to rank it for,” says Sameep Shah, founder of SimpleWebDesign. “The best advice for solving this is to not use Flash or to use it sparingly. There are technologies such as HTML5, CSS3 and JavaScript that can be used instead, and if you really need animation, consider using a video.”
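As a small illustration of plugin-free animation, a CSS3 keyframe rule can fade a banner in with no Flash at all; the selector and timing are arbitrary:

    <style>
      @keyframes fade-in {
        from { opacity: 0; }
        to   { opacity: 1; }
      }
      .banner { animation: fade-in 1.5s ease-in; }
    </style>
    <div class="banner">Live, indexable banner text</div>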
10. Slow Page Loads
Page speed is important not only for a good user experience but also for achieving good search rankings. “If your website has many elements on it, such as images, videos, CSS style sheets, JavaScript code and the like, be sure it’s as optimized for speed as possible,” says SmartSites’ Korneitchouk.
“I recommend running several of your pages through a page-speed test such as GTmetrix and implementing the suggestions they provide,” Korneitchouk says. “Applying even just a couple of their suggestions can shave seconds off your site load time.”
Google also offers Google PageSpeed Tools to help you test and increase your site’s overall speed, according to Delegator.com’s Ewton.
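Many of the suggestions those tools produce, such as enabling compression and browser caching, can be applied on an Apache server in .htaccess. A minimal sketch, assuming mod_deflate and mod_expires are enabled:

    # Compress text-based responses before sending them
    AddOutputFilterByType DEFLATE text/html text/css application/javascript
    # Let browsers cache images for a month instead of re-downloading them
    ExpiresActive On
    ExpiresByType image/png "access plus 1 month"
    ExpiresByType image/jpeg "access plus 1 month"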
Your Web hosting service could be part of your speed problem. “I’m a firm believer in paying for reliable, secure and fast hosting services,” says SEO Zones Inc.’s Waller. “If you’re doing the shared hosting thing, that is OK starting out. But if you grow or are a large organization, make the transition to a dedicated server. It gives you more control over all aspects of your website, which is what you want anyway.”