
A Comprehensive Analysis of the New Domain Authority

Moz’s Domain Authority is requested over 1,000,000,000 times per year, it’s referenced millions of times on the web, and it has become a veritable household name among search engine optimizers for a variety of use cases, from determining the success of a link building campaign to qualifying domains for purchase. With the launch of Moz’s entirely new, improved, and much larger link index, we recognized the opportunity to revisit Domain Authority with the same rigor as we did keyword volume years ago (which ushered in the era of clickstream-modeled keyword data).

What follows is a rigorous treatment of the new Domain Authority metric. What I will not do in this piece is rehash the debate over whether Domain Authority matters or what its proper use cases are. I have and will address those at length in a later post. Rather, I intend to spend the following paragraphs addressing the new Domain Authority metric from multiple directions.

Correlations between DA and SERP rankings

The most important component of Domain Authority is how well it correlates with search results. But first, let’s get the correlation-versus-causation objection out of the way: Domain Authority does not cause search rankings. It is not a ranking factor. Domain Authority predicts the likelihood that one domain will outrank another. That being said, its usefulness as a metric is tied in large part to this value. The stronger the correlation, the more valuable Domain Authority is for predicting rankings.

Methodology

Determining the “correlation” between a metric and SERP rankings has been accomplished in many different ways over the years. Should we compare against the “true first page,” top 10, top 20, top 50 or top 100? How many SERPs do we need to collect in order for our results to be statistically significant? It’s important that I outline the methodology for reproducibility and for any comments or concerns on the techniques used. For the purposes of this study, I chose to use the “true first page.” This means that the SERPs were collected using only the keyword with no additional parameters. I chose to use this particular data set for a number of reasons:

  • The true first page is what most users experience, thus the predictive power of Domain Authority will be focused on what users see.
  • By not using any special parameters, we’re likely to get Google’s typical results.
  • By not extending beyond the true first page, we’re likely to avoid manually penalized sites (which can impact the correlations with links.)
  • We did NOT use the same training set or training set size for this correlation study as we did when building the model. That is to say, we trained on the top 10 but are reporting correlations on the true first page. This protects against results that are overly biased toward our model.

I randomly selected 16,000 keywords from the United States keyword corpus for Keyword Explorer. I then collected the true first page for all of these keywords (completely different from those used in the training set). I extracted the URLs, but I also chose to remove duplicate domains (i.e., if the same domain occurred one after another). For a time, Google clustered domains together in the SERPs under certain circumstances. It was easy to spot these clusters, as the second and later listings were indented. No such indentations are present any longer, but we can’t be certain that Google never groups domains. If they do group domains, it would throw off the correlation, because it’s the grouping and not the traditional link-based algorithm doing the work.
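That adjacent-duplicate removal can be sketched in a few lines (a minimal illustration; the helper name and sample URLs are my own, not Moz's code):

```python
from urllib.parse import urlparse

def dedupe_adjacent_domains(urls):
    """Drop results whose domain matches the immediately preceding result,
    keeping only the first listing in each run of the same domain."""
    deduped, prev = [], None
    for url in urls:
        domain = urlparse(url).netloc
        if domain != prev:
            deduped.append(url)
        prev = domain
    return deduped

serp = [
    "https://a.com/page1",
    "https://a.com/page2",   # same domain, back to back: dropped
    "https://b.com/post",
    "https://a.com/page3",   # same domain, but not adjacent: kept
]
print(dedupe_adjacent_domains(serp))
```

Note that only consecutive repeats are collapsed, matching the old indented-cluster behavior; a domain reappearing lower on the page is still counted.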

I collected the Domain Authority (Moz), Citation Flow and Trust Flow (Majestic), and Domain Rank (Ahrefs) for each domain and calculated the mean Spearman correlation coefficient for each SERP. I then averaged the coefficients for each metric.
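The per-SERP correlation step can be sketched as follows (a stdlib-only illustration with invented toy data; the real study used 16,000 SERPs and the four metrics named above):

```python
def rank(values):
    """1-based average ranks, with ties sharing the mean of their positions."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        for k in range(i, j + 1):
            ranks[order[k]] = (i + j) / 2 + 1
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman's rho: the Pearson correlation of the two rank vectors."""
    rx, ry = rank(x), rank(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

def mean_serp_correlation(serps):
    """Average the per-SERP coefficients, as described above."""
    coeffs = [spearman([p for p, _ in s], [m for _, m in s]) for s in serps]
    return sum(coeffs) / len(coeffs)

# Two toy SERPs of (position, domain metric). A metric that is higher at
# better (lower) positions yields a negative coefficient.
serps = [
    [(1, 70), (2, 65), (3, 50), (4, 40)],   # perfectly monotone: rho = -1.0
    [(1, 55), (2, 60), (3, 30), (4, 20)],   # one inversion: rho = -0.8
]
print(round(mean_serp_correlation(serps), 3))  # -0.9
```

This is why the raw coefficients are negative: position 1 is the *smallest* rank number, so a good metric anti-correlates with position.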

Outcome

Moz’s new Domain Authority has the strongest correlations with SERPs of the competing strength-of-domain link-based metrics in the industry. The sign (-/+) has been inverted in the graph for readability, although the actual coefficients are negative (and should be).


Moz’s Domain Authority scored ~.12, or roughly 6% stronger than the next best competitor (Domain Rank by Ahrefs). Domain Authority performed 35% better than Citation Flow and 18% better than Trust Flow. This isn’t surprising, in that Domain Authority is trained to predict rankings while our competitors’ strength-of-domain metrics are not. It shouldn’t be taken as a negative that our competitors’ strength-of-domain metrics don’t correlate as strongly as Moz’s Domain Authority; rather, it simply reflects the intrinsic differences between the metrics. That being said, if you want a metric that best predicts rankings at the domain level, Domain Authority is that metric.

Note: At first blush, Domain Authority’s improvements over the competition are, frankly, underwhelming. The truth is that we could quite easily increase the correlation further, but doing so would risk over-fitting and compromising a secondary goal of Domain Authority…

Handling link manipulation

Historically, Domain Authority has focused on a single objective: maximizing the predictive capacity of the metric. All we wanted were the highest correlations. However, Domain Authority has become, for better or worse, synonymous with “domain value” in many sectors, such as among link buyers and domainers. Consequently, as bizarre as it may sound, Domain Authority has itself been targeted with spam in order to bolster the score and sell domains at a higher price. While these crude link manipulation techniques didn’t work so well in Google, they were sufficient to inflate Domain Authority. We decided to rein that in.

Data sets

The first thing we did was compile a series of data sets that corresponded with industries we wished to impact, knowing that Domain Authority was regularly manipulated in these circles.

  • Random domains
  • Moz customers
  • Blog comment spam
  • Low-quality auction domains
  • Mid-quality auction domains
  • High-quality auction domains
  • Known link sellers
  • Known link buyers
  • Domainer network
  • Link network

While it would be my preference to release all the data sets, I’ve chosen not to, so as to avoid “outing” any website in particular. Instead, I opted to provide these data sets to a number of search engine marketers for validation. The only data set not offered for outside validation was Moz customers, for obvious reasons.

Methodology

For each of the above data sets, I collected both the old and new Domain Authority scores. All scores were collected on February 28th in order to have parity across tests. I then calculated the relative difference between the old DA and new DA within each group. Finally, I compared the various data set results against one another to confirm that the model addresses the various methods of inflating Domain Authority.
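A minimal sketch of that calculation (the aggregation here, a mean of per-domain relative changes, is one plausible reading of "relative difference within each group"; the scores are invented):

```python
def relative_change(old_scores, new_scores):
    """Mean relative change from old DA to new DA for one data set,
    expressed as a percentage (negative means a drop)."""
    changes = [(new - old) / old for old, new in zip(old_scores, new_scores)]
    return 100 * sum(changes) / len(changes)

# Hypothetical old/new DA scores for three domains in one data set.
old = [40.0, 55.0, 30.0]
new = [36.0, 49.5, 27.0]
print(round(relative_change(old, new), 1))  # -10.0: every domain dropped 10%
```

Running the same function over each of the ten data sets gives the comparable per-group percentages reported below.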

Results


In the above graph, blue represents the Old Average Domain Authority for that data set and orange represents the New Average Domain Authority for the same data set. One immediately noticeable feature is that every category drops. Even random domains drop. This is a re-centering of the Domain Authority score and should cause no alarm to webmasters. There is, on average, a 6% reduction in Domain Authority for randomly selected domains from the web. Thus, if your Domain Authority drops a few points, you are well within the range of normal. Now, let’s look at the various data sets individually.


Random domains: -6.1%

Using the same methodology for finding random domains that we use for collecting comparative link statistics, I selected 1,000 domains and determined that there is, on average, a 6.1% drop in Domain Authority. It’s important that webmasters recognize this, as the shift is likely to affect most sites and is nothing to worry about.

Moz customers: -7.4%

Of immediate interest to Moz is how our own customers perform in relation to the random set of domains. On average, the Domain Authority of Moz customers lowered by 7.4%. This is very close to the random set of URLs and indicates that most Moz customers are likely not using techniques to manipulate DA to any large degree. 

Link buyers: -15.9%

Surprisingly, link buyers only lost 15.9% of their Domain Authority. In retrospect, this seems reasonable. First, we looked specifically at link buyers from blog networks, which aren’t as spammy as many other techniques. Second, most of the sites paying for links are also optimizing their site’s content, which means the sites do rank, sometimes quite well, in Google. Because Domain Authority trains against actual rankings, it’s reasonable to expect that the link buyers data set would not be impacted as highly as other techniques because the neural network learns that some link buying patterns actually work.

Comment spammers: -34%

Here’s where the fun starts. The neural network behind Domain Authority was able to drop comment spammers’ average DA by 34%. I was particularly pleased with this one because of all the types of link manipulation addressed by Domain Authority, comment spam is, in my honest opinion, no better than vandalism. Hopefully this will have a positive impact on decreasing comment spam — every little bit counts.

Link sellers: -56%

I was actually quite surprised, at first, that link sellers on average dropped 56% in Domain Authority. I knew that link sellers often participated in link schemes (normally interlinking their own blog networks to build up DA) so that they can charge higher prices. However, it didn’t occur to me that link sellers would be easier to pick out because they explicitly do not optimize their own sites beyond links. Subsequently, link sellers tend to have inflated, bogus link profiles and flimsy content, which means they tend to not rank in Google. If they don’t rank, then the neural network behind Domain Authority is likely to pick up on the trend. It will be interesting to see how the market responds to such a dramatic change in Domain Authority.

High-quality auction domains: -61%

One of the features that I’m most proud of with regard to Domain Authority is that it effectively addressed link manipulation in order of our intuition regarding quality. I created three different data sets out of one larger data set (auction domains), where I used certain qualifiers like price, TLD, and archive.org status to label each domain as high-quality, mid-quality, or low-quality. In theory, if the neural network does its job correctly, we should see the high-quality domains impacted the least and the low-quality domains impacted the most. This is exactly the pattern the new model produced. High-quality auction domains dropped an average of 61% in Domain Authority. That seems really high for “high-quality” auction domains, but even a cursory glance at the backlink profiles of domains that are up for sale in the $10K+ range shows clear link manipulation. The domainer industry, especially the domainer-for-SEO industry, is rife with spam.

Link network: -79%

There is one network on the web that troubles me more than any other. I won’t name it, but it’s particularly pernicious because the sites in this network all link to the top 1,000,000 sites on the web. If your site is in the top 1,000,000 on the web, you’ll likely see hundreds of root linking domains from this network no matter which link index you look at (Moz, Majestic, or Ahrefs). You can imagine my delight to see that it drops roughly 79% in Domain Authority, and rightfully so, as the vast majority of these sites have been banned by Google.

Mid-quality auction domains: -95%

Continuing with the pattern regarding the quality of auction domains, you can see that “mid-quality” auction domains dropped nearly 95% in Domain Authority. This is huge. Bear in mind that these drastic drops are not combined with losses in correlation with SERPs; rather, the neural network is learning to distinguish between backlink profiles far more effectively, separating the wheat from the chaff.

Domainer networks: -97%

If you spend any time looking at dropped domains, you have probably come upon a domainer network where a series of sites are enumerated and all link to one another. For example, the first site might be sbt001.com, then sbt002.com, and so on and so forth for thousands of domains. While it’s easy for a human to look at this and see the pattern, Domain Authority needed to learn that these techniques do not correlate with rankings. The new Domain Authority does just that, dropping the domainer networks we analyzed by 97% on average.

Low-quality auction domains: -98%

Finally, the worst offenders — low-quality auction domains — dropped 98% on average. Domain Authority just can’t be fooled in the way it has in the past. You have to acquire good links in the right proportions (in accordance with a natural model and sites that already rank) if you wish to have a strong Domain Authority score.

What does this mean?

For most webmasters, this means very little. Your Domain Authority might drop a little bit, but so will your competitors’. For search engine optimizers, especially consultants and agencies, it means quite a bit. The inventories of known link sellers will probably diminish dramatically overnight. High DA links will become far more rare. The same is true of those trying to construct private blog networks (PBNs). Of course, Domain Authority doesn’t cause rankings so it won’t impact your current rank, but it should give consultants and agencies a much smarter metric for assessing quality.

What are the best use cases for DA?

  • Compare changes in your Domain Authority with your competitors. If you drop significantly more, or increase significantly more, it could indicate that there are important differences in your link profile.
  • Compare changes in your Domain Authority over time. The new Domain Authority will update historically as well, so you can track your DA. If your DA is decreasing over time, especially relative to your competitors, you probably need to get started on outreach.
  • Assess link quality when looking to acquire dropped or auction domains. Those looking to acquire dropped or auction domains now have a much more powerful tool in their hands for assessing quality. Of course, DA should not be the primary metric for assessing the quality of a link or a domain, but it certainly should be in every webmaster’s toolkit.

What should we expect going forward?

We aren’t going to rest. An important philosophical shift has taken place at Moz with regards to Domain Authority. In the past, we believed it was best to keep Domain Authority static, rarely updating the model, in order to give users an apples-to-apples comparison. Over time, though, this meant that Domain Authority would become less relevant. Given the rapidity with which Google updates its results and algorithms, the new Domain Authority will be far more agile as we give it new features, retrain it more frequently, and respond to algorithmic changes from Google. We hope you like it.


Be sure to join us on Thursday, March 14th at 10am PT for our upcoming webinar discussing strategies & use cases for the new Domain Authority.



Backlinks 101 – SEO’s Off-page Often Ignored Power Ranker

First off, a little disclosure: this article overlaps the Backlinks category of the FAQ.

What Are Backlinks?

Backlinks are links from other sites. Think of them as votes of affirmation. Only one vote can come from a domain, so for SEO purposes it doesn’t matter whether there are 100 links or 1 link from the same domain: one vote is what you gain. Subdomains are viewed separately… that’s why a link from yourblog.tumblr.com isn’t a tumblr.com backlink. Extra links from the same domain may increase traffic to your site, but in terms of SEO value it’s one vote. Some call this metric “Domain Pop”: how many different domains link to a site. It’s also gotten more complicated, as people host multiple sites on a shared IP. How many backlinks come from different IPs is “IP Pop.” It’s common to see a slightly higher domain pop than IP pop, but a huge gap is suspect.
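The two counts are easy to compute if you have a raw backlink export (a minimal sketch; the function name, sample URLs, and IPs are my own illustration):

```python
from urllib.parse import urlparse

def domain_and_ip_pop(backlinks):
    """backlinks: list of (source_url, source_ip) pairs.
    Domain Pop counts unique referring domains; IP Pop counts unique IPs."""
    domains = {urlparse(url).netloc for url, _ in backlinks}
    ips = {ip for _, ip in backlinks}
    return len(domains), len(ips)

links = [
    ("https://blog-a.com/post-1", "203.0.113.5"),
    ("https://blog-a.com/post-2", "203.0.113.5"),  # same domain: still one vote
    ("https://blog-b.com/review", "203.0.113.5"),  # shared host: same IP
    ("https://news-c.com/story", "198.51.100.7"),
]
print(domain_and_ip_pop(links))  # (3, 2): domain pop 3, IP pop 2
```

Here the two blogs on the shared host are why IP pop comes out lower than domain pop; a very wide gap would mean many "different" domains actually sit on the same servers.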

The more domains that link to you, the more authoritative you must be, right? Well, kinda. If 1,000 domains link to your site, you are likely more authoritative than a site that only 3 sites link to. But not all domains, or votes, or backlinks are the same. A link to your site from UltimateSEO.org carries with it the weight attributed to that site by its own backlinks. People refer to this as “link juice”: basically, the backlinks coming into a site fuel the backlinks leaving that site.

Link juice prevents someone from registering 10 new domains and making 10 backlinks to their original site, because those ten new sites probably lack link juice from their own backlinks. Generally speaking, though, backlinks increase a site’s domain authority or citation flow. Different companies refer to the authority of a site differently. Beyond DA and CF there is also LIS, but I have found DA to be the best singular indicator of a site’s worth.
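Link juice is loosely analogous to PageRank. A toy power-iteration sketch (the graph, damping value, and names are illustrative only, not Moz's or Google's actual model) shows why those ten fresh domains pass along so little:

```python
def pagerank(links, damping=0.85, iterations=50):
    """links: page -> list of pages it links to. Plain power iteration;
    a dangling page (no outlinks) spreads its rank evenly over all pages."""
    nodes = list(links)
    n = len(nodes)
    rank = {node: 1.0 / n for node in nodes}
    for _ in range(iterations):
        nxt = {node: (1 - damping) / n for node in nodes}
        for page, outlinks in links.items():
            if not outlinks:
                for node in nodes:
                    nxt[node] += damping * rank[page] / n
            else:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    nxt[target] += share
        rank = nxt
    return rank

# Ten freshly registered domains, with nothing linking to them, all point
# at one "original" site. Nothing links to the new domains, so they have
# little rank of their own to pass along.
graph = {"original": []}
for i in range(10):
    graph[f"new-{i}"] = ["original"]
scores = pagerank(graph)
print(scores["new-0"] < scores["original"])  # True
```

Each new domain only ever holds rank it was handed for free, so ten of them together still cannot substitute for links from sites that are themselves well linked.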

You can see a site’s backlinks in many indexes; most are paid. Ultimate SEO recommends Monitor Backlinks if you want a tool that is really good at backlinks. Ultimate SEO received nothing for that endorsement. The endorsement, or vote… as you see, it’s a backlink.

What Kinds Of Backlinks Are There?

No-Follow vs Do-Follow Backlinks


Beyond saying Good and Bad, there are actually a couple of types to be aware of: “follow” (or “do-follow”) and “nofollow.” They get their names from the instruction they give search engine crawlers: a nofollow link says “don’t follow this link to that site.” A regular link is, in effect, a follow link, and it serves as the backlink you ultimately want.

For some reason folks went a little crazy and no-followed everything… even internal links. Nofollows were meant to combat link building schemes such as blog comment spam. While it’s fine to have nofollow links to your site, there should be a limited mixture of them in relation to your actual do-follow links. Nofollow links are still indexed, and I feel strongly that they still have some SEO value, even if it’s just driving traffic to your site. In the end you want do-follow links, because they come as full-fledged votes for your site, whereas a nofollow more or less says “here is a link to a site I don’t necessarily want to be associated with.”

It’s that distance that makes nofollows a poor target for SEO efforts, and it’s why you should use them sparingly in internal links. Why would you send a signal to Google that you don’t stand behind an internal link to yourself? Some try to hold onto all the link juice coming in and nofollow every external link; this is a poor practice, and it’s been shown that linking your content to relevant, good external content helps you.

No-Follow Internal Links…Just Don’t

I rarely use nofollow links; I kinda think the system there is broken, so I just follow them all. Some time ago people started no-following everything and channeled their link juice to the specific pages they wanted to rank. Since this was a misuse of nofollow, Google changed how it handles nofollows: the juice doesn’t stay in your site or on a page, it just disappears. A nofollow takes the same amount of link juice as a regular link, but no one gets it. Pointless then, right? We’ll debate that more another day.

What Is Anchor Text?

Anchor text is the “keyword” of a backlink. Ultimate SEO, for example, is anchor text for the link https://ultimateseo.org, whereas a bare URL link has no anchor text of its own. Anchor text defines the backlink vote. If enough people make a link to your site like “Miserable Failure,” it will teach Google that the target of that link is a miserable failure. This happened to George W. Bush’s White House biography page long ago and is called a “Google Bomb.” It’s that old saying: if you say something often enough, it can become true.
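To see what a crawler sees, here is a small stdlib sketch that pulls each link's target, its rel attribute (which is where nofollow lives), and its anchor text. The HTML snippet, class name, and example.com URL are my own illustration:

```python
from html.parser import HTMLParser

class AnchorExtractor(HTMLParser):
    """Collects (href, rel, anchor_text) for each <a> tag."""
    def __init__(self):
        super().__init__()
        self.links = []
        self._current = None   # (href, rel) while inside an <a> tag
        self._text = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            attrs = dict(attrs)
            self._current = (attrs.get("href", ""), attrs.get("rel", ""))
            self._text = []

    def handle_data(self, data):
        if self._current is not None:
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._current is not None:
            href, rel = self._current
            self.links.append((href, rel, "".join(self._text).strip()))
            self._current = None

html = ('<a href="https://ultimateseo.org">Ultimate SEO</a> '
        '<a href="https://example.com" rel="nofollow">spammy link</a>')
parser = AnchorExtractor()
parser.feed(html)
print(parser.links)
```

The first tuple carries the anchor text "Ultimate SEO" (the vote's keyword) with an empty rel, i.e. a normal do-follow link; the second carries rel="nofollow".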

How Many Do I Need?

A lot. You need as many as you can get from as many places as you can get them. Just keep in mind that a backlink from my personal site isn’t as powerful as a backlink from the CDC.gov website… they have the authority. That’s also why .edu and .gov backlinks are especially coveted.

Backlinks


A quick rule of thumb to determine how many you need is to simply Google the keyword you are attempting to rank for… let’s say “cool music from the 60s.” I get pastemagazine.com leading the pack. According to SEMrush.com, that site has 2.7 million backlinks coming from 36,700 domains on 44,300 IP addresses. So, roughly, keep that as your target if you want to rank #1 for “cool music from the 60s.”

How Do I Get Backlinks?

Many ways… the God’s honest truth is to do it the obvious way… by having content worth linking to. If you want a page to rank at the top you need a site that’s fast, optimized on-page, and full of relevant, awesome content… and people will backlink to you. Over years and years you’ll need to keep that content better than everyone else’s… that’s not super realistic, though. Sometimes the best content is on page 2 and it’ll stay there. I often Google something and skip the first results, just because they are often the most SEOed things. BUT most people by far pick the first result, then the second, and so on.

What Are Some Popular Link Building Techniques?

So you need to prime the pump and simulate organic growth and popularity, and now you’re in a link building scheme. Some are looked down upon more than others, but make no mistake: any attempt to gain backlinks is a link building scheme. Press releases, guest blogging, commenting in forums, making profiles on other sites, link swapping, selling or buying links, and finally PBNs. PBNs are private blog networks, where you make zombie sites that link to your important site… but considering that the example above had 37,000 domains linking to it, how effective is a network of, say, 200 sites? Well, surprisingly effective… and that’s why Google hunts PBNs like Buffy the vampire slayer.

That’s our Backlinks 101… I’ll talk more about some of these concepts in future posts.


Adwords Template With Search Console, Google Analytics In Data Studio


SEO & PPC Data Studio Report Using Adwords, Google Analytics and Google Search Console All-In-One Template

Google Data Studio reports are fun. Here at Ultimate SEO we love visualizations, and that’s partially why we like Data Studio. Beyond the looks, it also integrates easily with Google Sheets, Google Analytics, and Search Console, to name a few. Together, these create a powerful free SEO and PPC tool.

You can check out the report directly by clicking the link above; here is an embedded look at the nine pages of live data that’s basically always current. It’s nice to be able to pull in data from two very different Google tools. Lots of people know of Google Analytics and think it covers Google Search Console, but it doesn’t (I’ll discuss that more in another post); the unique data from these sources can all mix to form one handy live report.

You can check out all the information pulled in this report and change the dates as needed using the dropdown. To personalize the report for your own site, simply copy it and set the data sources to your own Google Analytics and Search Console sources. A word of caution on the Search Console side: there are two connections, one for the site and the other, I believe, for the page URLs. So make sure to connect those correctly. Just like in electrical work, it’s like to like.

Across these nine pages you’ll find insights into any site with an Adwords campaign including keywords, search terms, CTR and CPC.


MAJOR GOOGLE SEO CHANGE FOR SOME: Website Traffic CREDITED To Where Google Chooses

Wednesday, February 06, 2019

In Search Console, the Performance report currently credits all page metrics to the exact URL that the user is referred to by Google Search. Although this provides very specific data, it makes property management more difficult; for example: if your site has mobile and desktop versions on different properties, you must open multiple properties to see all your Search data for the same piece of content.

To help unify your data, Search Console will soon begin assigning search metrics to the (Google-selected) canonical URL, rather than the URL referred to by Google Search. This change has several benefits:

  • It unifies all search metrics for a single piece of content into a single URL: the canonical URL. This shows you the full picture about a specific piece of content in one property.
  • For users with separate mobile or AMP pages, it unifies all (or most, since some mobile URLs may end up as canonical) of your data to a single property (the “canonical” property).
  • It improves the usability of the AMP and Mobile-Friendly reports. These reports currently show issues in the canonical page property, but show the impression in the property that owns the actual URL referred to by Google Search. After this change, the impressions and issues will be shown in the same property.

Google Search Console


When will this happen?

We plan to transition all performance data on April 10, 2019. In order to provide continuity to your data, we will pre-populate your unified data beginning from January 2018. We will also enable you to view both old and new versions for a few weeks during the transition to see the impact and understand the differences.

API and Data Studio users: The Search Console API will change to canonical data on April 10, 2019.

How will this affect my data?

  • At an individual URL level, you will see traffic shift from any non-canonical (duplicate) URLs to the canonical URL.
  • At the property level, you will see data from your alternate property (for example, your mobile site) shifted to your “canonical property”. Your alternate property traffic probably won’t drop to zero in Search Console because canonicalization is at the page, not the property level, and your mobile property might have some canonical pages. However, for most users, most property-level data will shift to one property. AMP property traffic will drop to zero in most cases (except for self-canonical pages).
  • You will still be able to filter data by device, search appearance (such as AMP), country, and other dimensions without losing important information about your traffic.

You can see some examples of these traffic changes below.

Preparing for the change

  • Consider whether you need to change user access to your various properties; for example: do you need to add new users to your canonical property, or do existing users continue to need access to the non-canonical properties?
  • Modify any custom traffic reports you might have created in order to adapt for this traffic shift.
  • If you need to learn the canonical URL for a given URL, you can use the URL Inspection tool.
  • If you want to save your traffic data calculated using the current system, you should download your data using either the Performance report’s Export Data button, or using the Search Console API.

Examples

Here are a few examples showing how data might change on your site. In these examples, you can see how your traffic numbers would change between a canonical site (called example.com) and alternate site (called m.example.com).

Important: In these examples, the desktop site contains all the canonical pages and the mobile contains all the alternate pages. In the real world, your desktop site might contain some alternate pages and your mobile site might contain some canonical pages. You can determine the canonical for a given URL using the URL Inspection tool.

Total traffic

In the current version, some of your traffic is attributed to the canonical property and some to the alternate property. The new version should attribute all of your traffic to the canonical property.
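In code terms, the change is a re-keying of per-URL metrics by canonical URL. A minimal sketch (the function, URLs, and click counts are mine, not Google's implementation):

```python
def attribute_clicks(rows, canonical_map):
    """rows: (url, clicks) as currently reported, per exact URL.
    canonical_map: duplicate/alternate URL -> canonical URL.
    Returns clicks re-attributed to the canonical URL, as the new
    Performance report will do."""
    totals = {}
    for url, clicks in rows:
        canonical = canonical_map.get(url, url)
        totals[canonical] = totals.get(canonical, 0) + clicks
    return totals

rows = [
    ("https://example.com/article", 120),    # canonical (desktop)
    ("https://m.example.com/article", 80),   # alternate (mobile)
]
canonical_map = {"https://m.example.com/article": "https://example.com/article"}
print(attribute_clicks(rows, canonical_map))
# {'https://example.com/article': 200}
```

The 80 mobile clicks don't disappear; they simply move under the canonical property, which is exactly the property-level shift described above.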


Individual page traffic

You can see traffic changes between the duplicate and canonical URLs for individual pages in the Pages view. The next example shows how traffic that used to be split between the canonical and alternate pages is now all attributed to the canonical URL:

Mobile traffic

In the current version, all of your mobile traffic is attributed to your m. property. The new version attributes all traffic to your canonical property when you apply the “Device: Mobile” filter, as shown here:

In conclusion

We know that this change might seem a little confusing at first, but we’re confident that it will simplify your job of tracking traffic data for your site. If you have any questions or concerns, please reach out on the Webmaster Help Forum.


Bad Backlinks: 100 Sites You Don’t Want A Backlink From.


UltimateSEO.org has backlinks from about a thousand domains. In a recent review of these I found an odd recurring link from multiple domains, all with the same content and titles. This introduced me to “The Globe,” which charges sites to NOT list them; it makes money from SEOs paying it to not backlink to them. At $36 a link they’re likely insane, and I bet it’s bringing in some money. But before we go all crazy and start paying over Ransomlinks (if it’s not a word, I claim it… Ransomlinks are backlinks from bad sites meant to lower your SEO score unless you pay to not be linked to), let’s consider the options.

In reviewing the situation I ran across a list of the most disavowed sites. I figured I’d share that with you below, but before I do: what outcome did I choose for these bad links pointed at my site?

  1. Option 1 Pay: Heck No! Then the terrorists win.
  2. Disavow: No! Don’t use disavow unless Google has placed a manual action against your site. I’m skeptical anyhow of the tool’s purpose, and Google itself says there is no need to use it unless you’ve been penalized and told by them that you are being penalized.
  3. Do Nothing: Yes! Don’t do anything. Google likely knows about the Ransomlinks scheme and has already penalized the site by deindexing it. There are so many random domains that it’s going to be a mess to address, so let it be unless you have seen a negative effect. In other words… before you saw your leg off wondering if that spot is cancer… stop and find out.
  4. An idea: 301 redirect them… seriously… all of these links point to a subdomain that until now hasn’t existed. Most others who are talking about this site note a similar targeted subdomain. I could create the targeted subdomain and redirect all links to it from my site back to theirs. 🙂

I’m opting for the third, as I don’t have any indication that Google cares about these Ransomlinks. They may actually bring some random traffic of use, so redirecting them would take that from my site.

What would you do with “Ransomlinks”?

And now the most disavowed sites…

Most popular websites disavowed by webmasters

1 blogspot.com
2 blogspot.ca
3 blogspot.co.uk
4 ning.com
5 wordpress.com
6 blog.pl
7 linkarena.com
8 yuku.com
9 blogspot.de
10 webs.com
11 blogspot.nl
12 blogspot.fr
13 lemondir.com
14 blog.com
15 alonv.com
16 tistory.com
17 searchatlarge.com
18 dvpdvp1.com
19 typepad.com
20 nju-jp.com
21 bluehost.com
22 wldirectory.com
23 tumblr.com
24 hyperboards.com
25 directoryfuse.com
26 prlog.ru
27 informe.com
28 ligginit.com
29 theglobe.org
30 pulsitemeter.com
31 articlerich.com
32 weebly.com
33 the-globe.com
34 blogspot.no
35 theglobe.net
36 articledashboard.com
37 dig.do
38 seodigger.com
39 cybo.com
40 fat64.net
41 bravenet.com
42 cxteaw.com
43 askives.com
44 mrwhatis.net
45 insanejournal.com
46 xurt.com
47 freedirectorysubmit.com
48 commandresults.com
49 sagauto.com
50 internetwebgallery.com
51 freewebsitedirectory.com
52 ewbnewyork.com
53 000webhost.com
54 tblog.com
55 directorylist.me
56 analogrhythm.com
57 snapcc.org
58 bravejournal.com
59 weblinkstoday.com
60 m-pacthouston.com
61 linkcruncher.com
62 tripod.com
63 cogizz.com
64 niresource.com
65 over-blog.com
66 ogdenscore.com
67 free-link-directory.info
68 alikewebsites.com
69 folkd.com
70 djsonuts.com
71 uia.biz
72 bangkokprep.com
73 forumsland.com
74 punbb-hosting.com
75 hostmonster.com
76 blogspot.in
77 siteslikesearch.com
78 bookmark4you.com
79 siliconvalleynotary.com
80 listablog.com
81 poetic-dictionary.com
82 linkspurt.com
83 cultuurtechnologie.net
84 azjournos.com
85 exteen.com
86 articletrader.com
87 blogspot.com.au
88 delphistaff.com
89 altervista.org
90 media-tourism.com
91 woodwardatelier.com
92 holdtiteadhesives.com
93 lorinbrownonline.com
94 tech4on.com
95 popyourmovie.com
96 trilogygroveland.com
97 foqe.net
98 directorybin.com
99 eatrightkc.com

52 Tools And Counting: Mostly Free SEO Tools I Actually Use

There are actually a couple of lists of tools on this page.  Check them out and come back for more in the future.  What you won't find in this list is a tool that has asked, paid, or communicated with me about being on it.  This is an honest collection of the links I want to keep.  If you want to be on the list, feel free to comment.  If I find your tool useful and use it, I'll add a link.  Otherwise maybe someone else will like it from your comment link and can make their own list. 🙂


Over 50 SEO Tools, Mostly Free

As I go through my SEO day I type the same addresses over and over to get to a collection of useful tools.  So I decided to post them as links once and for all, for my own benefit but also for others who may come here looking for a backlink indexer and find there isn't one… but there is a list that includes one of the best.

A quick note on some of the tools and how I plan to build on this simple list.  First, to address the second thing I mentioned: I plan to grow the list from its initial 52 to hopefully a hundred, with the promise that I'm not just adding crap tools or duplicates over and over.  As for highlights, I'll add details on some of the best tools in a section below the list in the coming weeks.  There is also another mini list of text and HTML tools below the main one.

More Tools – Text and HTML Free Tools

I’ve turned 40 today so I figured I’d make a top 40 list of tools and guides. The first one I’ll note I didn’t use for Web Design, I just used it to make a choice for me.

Generate Random Choice – When you’re on the fence and need some decision help.

Random Word Generator – Brainstorming

Adwords And Titles

Capitalize the First letter of Each Word In A Sentence

Free Photoshop Like Web Tool

Pixlr

Site Speed

Compress HTML Code

Uncompress HTML Code

CSS Minifier

Merge CSS OR JS Files

Testing

CSS HTML JavaScript With Output Test Environment

Fast Web Design

Hexadecimal Color Picker

JavaScript Popup Maker

HTML List Generator

Convert URLs to HTML Links

CSS

CSS Beautifier

HTML Modifiers

HTML Formatter

Convert line breaks to paragraphs

Remove Duplicate Lines

CSV

CSV to XML Converter

Text Format And Order Modifiers

Alphabetical Order Tool

Remove Line Breaks from Text

Date And Time

Epoch Timestamp To Date

Time zone list – HTML select snippet

Trace Route Network Tool, Only On A Map

Tracert is a command that's elementary to networking and computers.  Trace Route, or tracert, does exactly what it sounds like, and it's useful because it tells you every IP address a packet passes through between the server and the catcher (not technical terms there).  It explains where speed issues are, whether in a global perspective or in your home.

It's usually just text, but https://www.monitis.com/traceroute/ made it more fun… and from this map I can see why my fiber connection isn't seemingly very fast tonight: I'm being routed through London, England to do a domestic "hop" (hops are each leg of a journey in a tracert).
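To make the "hop" idea concrete, here's a small Python sketch that counts the legs of a journey the way a trace reports them. The sample output below is made up for illustration, not a real trace:

```python
# Each numbered line of traceroute/tracert output is one "hop": a router the
# packet passed through on its way from your machine to the destination.
sample_trace = """\
 1  192.168.1.1      1.2 ms
 2  10.20.30.1       8.9 ms
 3  203.0.113.7     24.5 ms
 4  198.51.100.3    31.0 ms
"""

def count_hops(trace_text):
    """Count hops by counting the non-empty lines in a trace."""
    return sum(1 for line in trace_text.splitlines() if line.strip())

print(count_hops(sample_trace))  # 4 hops between this machine and the target
```

More hops, or one slow hop (like a detour through London), means a slower connection overall.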


Citation Flow, Trust Flow And How They Relate To Domain Authority

They actually don't relate to Domain Authority.  Domain Authority is a metric designed by Moz to help replace Google's PageRank, which used to be made available to the public but has since been discontinued, or at least the public portion is no longer available.  Moz isn't the only company offering a replacement for PageRank: Majestic came up with Citation Flow and Trust Flow.  You'd expect these two to still somewhat track each other, as in theory they are meant to portray the same thing, but they rarely do.  This leaves it up to you to decide which metric to use to judge a site's ability to rank a page on Google.

This is a second part to the Domain Authority post I recently made.

Moz And Majestic Disagree A lot

This is worth a separate post, and it will get one, but for now just note some random stats on domains in this spreadsheet: do you see DA, CF or TF rising and falling together?  By the way, this spreadsheet is so unSEO-friendly it will likely never rank for mobile, so I may have to drop it from the mobile version, which is a shame.

For instance, notice two domains are a DA 29, yet their CF and TF are totally different.  Or take DA 46: we have six all right next to each other, where CF ranges from 31 to 0 and TF ranges from 0 to 10, so no real correlation there.  So now we know they are supposed to express the same thing but wildly disagree… what are Citation Flow and Trust Flow then?

Citation Flow


Citation Flow is Majestic's attempt at forecasting the influence of a site: what it's capable of making folks do or believe.  This has nothing to do with how valid the argument or evidence is, just the ability to be noticed.  I like to think of the Kardashians; I'd argue they have influence and they can get noticed.  That being said, I'm not saying they should, or that what they believe or think is valid or worthy of listening to.

I honestly don't use this metric; I largely feel it's irrelevant.  It's okay to watch… ever seen the movie Airplane!?  Citation Flow is the guy who says nothing useful but is funny.

Trust Flow

How authoritative and trustworthy is the content they produce?  That's the question Trust Flow tries to answer.  Back to the Kardashians: I would assume that they have influence, but trust in what they say isn't all that great.  They are not experts at much.  Looking up one of their "personal" sites, we have a TF of 21.  21 of 100 isn't huge; it's kind of high when we think of all the sites out there, but I tossed in Neil Patel and Moz (a TF competitor) and they did well: Patel at 51 and Moz at 62.  I can believe that Kim's site is a 21 while those other sites are more than double that in trustworthy content.

Some sites are used as trusted seed sites.  So let's say harvard.edu is trustworthy, so we give it a 100 (not saying it actually has a 100); then whoever they link to gets a 90.  Whoever the 90s link to gets an 80.  TF is basically a measure of how far a site is, in links, from a known trusted source.
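That decay-by-distance idea can be sketched in a few lines of Python. This is my own toy illustration, not Majestic's actual algorithm; the seed score of 100 and the penalty of 10 per hop are just the numbers from the example above:

```python
def trust_flow_sketch(hops_from_seed, seed_score=100, decay=10):
    """Toy model: trust drops a fixed amount for each link-hop away from a seed site."""
    return max(seed_score - decay * hops_from_seed, 0)

# A seed like harvard.edu sits at the top; sites it links to score lower,
# and sites linked by those score lower still.
print(trust_flow_sketch(0))  # 100 (the seed itself)
print(trust_flow_sketch(1))  # 90  (linked directly from the seed)
print(trust_flow_sketch(2))  # 80  (two hops out)
```

The real metric flows trust through the whole link graph rather than a single chain, but the intuition is the same: distance from trusted seeds is what matters.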

I do personally use this metric, but I give a lot of credit to very low numbers.  So if something is an 8 or 10 in TF, I consider it at the start of trustworthiness; a 15 is about average, and seeing Moz at 61 is likely near God's level of trustworthiness.

So that’s a high level look at Citation Flow ( Basically Useless ) and Trust Flow ( Generally Good ) and how they relate to Domain Authority ( no relation at all)

They’re competing products by two companies attempting to guess what Google thinks of your site…and they aren’t 100% accurate…

How I Use Metrics To Determine SEO Value

I use them as a mixture of fact and fiction; I look at DA first and foremost.  I've never seen a DA 0 with a CF 100 and TF 100, and if I did I wouldn't have thought "that's a good site."  So DA is the first thing that gets my attention.  I've seen plenty of DA 50 sites with a CF 14 and TF 12 and thought, this one is worth more review.

A secondary indicator of SEO value to me is TF, but I lower the bar on it.  I'll make another post later on how I use these numbers to determine SEO value.

Related posts:

What Would Google Cloud Need To Do To Catch Up To AWS And Azure?

Matt Leffler (I host another complete web stack on GCP)

Ease of use is one thing. I have clients that can benefit from cloud-based solutions. They may have a typical hosting account with some name-brand host, but for various reasons I find myself getting annoyed and deciding I'm moving them to the cloud.

I move them to Google Cloud Platform, but only temporarily, because I know that when I leave them (I'm just a freelancer) they'll be lost. If they decide they want to stay in the cloud, every single time I then move them to AWS. Why did I pick Google to begin with? Only one reason: the credit. That credit allows me to move them without them feeling pressured, but when they make the decision to stick with the cloud they leave Google.

Why Is Google Cloud Not Where I Leave Them?

  1. Ever try to have a server on GCP with more than one IP address? It is next to impossible compared with the ease of associating multiple IPs with an AWS server. Beyond the technical limitations of GCP, they also don't allow more than one IP address per zone, so I would have to use a load balancer if I wanted more than one server in the same zone. That second layer of steps exceeds the client's interest.
  2. Often they have WHM / cPanel servers, which require two IP addresses for the two required name servers. On Google Cloud I have to build the WHM server with one IP, then run off to another zone and make a second server that's DNS-only but still meets the minimum requirements of cPanel… then I have to make a DNS cluster tying the two together, and I have found these clusters to be very tricky, often somehow falling apart and needing to be put back together. The client in the end just wanted one server. I can do that in about 2 hours total on AWS without additional servers.
  3. Restrictions – So let's say I make a GCP WordPress micro server. I let people sign up and require them to respond to an email from the server. NO ONE ever responds to the email… well, it's because GCP by default blocks all SMTP ports. So if I want something as simple as email, I have to go set up a mail relay through another service, and guess how many clients' eyes glaze over at this point? All.
  4. Too much reliance on the Cloud SDK over simply putting things in the web interface. Again, back to clients being able to use this after I leave them: if they can't do it in the web interface, they can't do it. Too often the instructions in the help reference the Cloud SDK and that's the end. If a client requires a full-time cloud specialist because Google relies on the Cloud SDK too much, then I can't leave an everyday client with them.
  5. Ever tried to SFTP into a server? On AWS I pick an SSH key that I've already added from previous servers, then in my SFTP client I add my connection settings, select the .pem file, and I'm in. On Google Cloud there are just way more steps and issues. Here is a fun example: go to Google and type "Google Cloud SSH" and see what Google suggests you are searching for… Google Cloud SSH Key, Google Cloud SSH Not Working, Google Cloud SSH Permissions, Google Cloud SSH Timeout… that speaks volumes. "Permission denied" is basically the expected search, from Google, for a simple thing you will have to be able to do.

So those are my thoughts… it's overly complex and drops the ball in an area that only a few people in most companies' IT departments can understand.


I added this to UltimateSEO.org because it IS an SEO thing.  The speed and loading of your site have a big effect on your ranking.  I have improved a site's average keyword ranking from the 40s to the 20s simply by putting it on its own server.  Who wouldn't want their keywords jumping like that, and for about $30 a month?

UltimateSEO: Make An Impact

What Is Domain Authority And Why Should I Care?

Domain Authority is, in my opinion, the single greatest indicator of a site's ability to compete organically for search traffic.  It is a great question, and it's simply complex.  Keep in mind that Domain Authority, or DA as we shall refer to it from here on, is the single best number to predict ranking ability, and we know that the top 3 positions consume over half of all searches, so it can be directly tied to traffic.

A few disclaimers: Domain Authority is not a Google creation; it is from Moz.com, and it is their educated guess at how Google sees your authority.  But it is not Google's, so remember that.  Secondly, I am going to go over this at a high level and use some half-truths for simplicity.  If you want the whole truth, just read the articles or Google the topics discussed for greater detail.  Finally, most of what I will show you are other people's guesses as to what goes into Moz's Domain Authority, which again is a guess about Google's metric on a site.  Google doesn't tell us exactly, and they shouldn't; Moz doesn't tell us either, because they want you to pay for it.  Moz isn't the only one guessing: a lot of folks guess, including Majestic with its Citation Flow and Trust Flow, and SEOProfiler with its LIS.  For simplicity, and since I'm the one running this show, we'll focus for now on Domain Authority by Moz.

Now it’s a simple number from 1 to 100 but it isn’t like a ladder with even steps, as you climb the steps get bigger and it takes more to gain a level.  So from DA 1 lets say you gain 5 backlinks and two are from sites that have a decent DA 40 and Moz bumps your score up to a DA 3.  If you gain 5 more links to your site and they also have similar metrics as the last bunch you might assume you’re set to have a DA 6, but you get surprised when it is reported as a 4.  Another set of 5 and it stays a 4.  DA levels are harder to improve the higher you get your site.

What Determines Domain Authority?

Links?

Domain Authority is the best metric because it provides us with a single number, but it's based on some rather key and diverse factors.  Yes, it is backlinks, but it's so much more than that… though it is largely backlinks, or incoming links from other domains.  These serve as citations: references from other entities publicly acknowledging that the site they link to, yours, is authoritative.  Coupled with "anchor text," the words that make up a link rather than the address itself, they can define a site.  More on that towards the end.  This is anchor text: it has a link, but the anchor text serves as the vanity description of the content at the other end.  The link comes from another site to yours, and depending upon that site's authority, the link is more or less powerful than another.

We'll talk in greater depth about SEO backlinks, but for now let's just understand that backlinks are votes from other sites.  Not all backlinks are created equal.  The more backlinks, the more authority, and it's one vote per site.  So 800 links from another site you own are not worth 25 links from 25 sites.
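The "one vote per site" rule is easy to sketch: count referring domains, not raw links. A short Python illustration (the URLs here are made up for the example):

```python
from urllib.parse import urlparse

# 800 links from one site you own count as a single "vote"; what matters
# is the number of distinct referring domains.
backlinks = [
    "https://example-blog.com/post-1",
    "https://example-blog.com/post-2",    # same domain, still one vote
    "https://another-site.org/resources",
    "https://third-site.net/links",
]

referring_domains = {urlparse(link).netloc for link in backlinks}
print(len(referring_domains))  # 3 votes, despite 4 raw links
```

This is why link builders chase new domains rather than piling more links onto ones that already link to them.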

In the graphic to the left, links are the blue AND red slices of the pie, and together they make up 49%.  The next biggest element is SOCIAL MEDIA MENTIONS.  I put that in all caps because I've always included social media as an element of the reports I offer in client proposals, and I often get told they don't care about all that.  I just sit back and scribble down that they don't know what they don't know yet.

Social Media Mentions

Sure, you need an account and you need followers, but to get those followers you need to post regularly.  Five or six times randomly during the year isn't regular.  That's why so many people feel social media doesn't matter: they were unable to tap into the great interest on Facebook in their widgets.  News flash: no one is really going to be interested in your widget but you and a handful of individuals.  That's why your messages need to have more than just "buy our widget."  If you are a church, for instance, a weekly devotional or prayer list is better than telling people to come on Sunday week after week.  The biggest mistake is not including a link to your site in your post.  While there may be a link from your profile page to your site, your message can be repeated by others, and each repeat or share comes from their profile.  It may mention you and likely link back to your profile, but that is still only one link to your site.  Include a link.


Domain Age Matters And It Should

So you’re going out there and you found a domain name thats available and you want to outrank your competition in a month.  SEOs laugh out loud around the globe…Highly unlikely, it’s partly you lack street cred.  I’d love to rank number one in the keyword Louisville SEO but I don’t.  I’m the 89.8th result.  MatthewLeffler.com is 352 days old as I am writing this.  The top result for Louisville SEO is 9 years and 207 days old.  Indirectly they’ve got 9 years worth of content, returning traffic and backlinks than I have and so they are given a bonus for that longevity.  Partly to prevent searchers from getting a completely different result every time a new domain picks a keyword.   Its easier though as you age, the difference between 5 yrs and 9 years is trivial and the distance between 1 day and 1 year is a stretch.  Sometimes buying an existing domain thats relevant to your business will give you a way around this, but you’re at the mercy of what is for sale and like a foster kid you don’t know about it’s past.

Brand Search Volume

Nobody said SEO was fair

Sometimes in SEO it feels like you're swimming with Joan Crawford.  This one I feel is a bit much… brand search volume is how often people are looking for you.  Well, isn't that impossible until they know about me, and isn't the reason they don't know about me that you have me on page 4?  Couldn't brand search volume arguably be covered by social media mentions and backlinks?  Well, now that we got those out of the way: nobody ever said life was fair, Tina.

I’ve decided that I’ll figure out a way to sneak a reference to a favorite movie of mine into every post from here on.

But I digress.  Your brand search volume does make sense to include, because people searching for Coca-Cola do not want to see Pepsi.  I'm from deep Coke territory, and even Taco Bell has stopped asking if I want a Pepsi.  So has Google, it appears.

Incidentally, I would suspect that locality plays into this heavily.  Google knows where the searches are coming from, so they understand the footprint of your business.  That is why Googling the best landscaper in town is different in each town.  Food for thought if you build a regional powerhouse and expand into a new city.

Domain Authority Case Study – Project No Wire Hangers

Sorry, I won't make this all about that movie, but we all know… NO wire hangers.  For our purposes, hangers are domains, and they can be wire or wooden.  Depending on the hanger, or domain, your attempts to rank for a keyword will be affected, like the clothes you hang on them.  I'll show you stats for several domains that I have built completely alone over several months.  It hasn't been a dedicated effort, so these numbers could have been better, but the effort has been the same.  Some are older than others; some are the same age.  We'll dissect these domains and metrics extensively.

Matt’s Project Hangers

URL Moz DA Moz PA MozRank External Links
seandelahanty.com 29 / 100 27 / 100 2.70 112
cloud502.com 28 / 100 24 / 100 2.40 134
votelouisville.com 23 / 100 19 / 100 1.90 4
matthewleffler.com 23 / 100 23 / 100 2.30 112
data502.com 22 / 100 21 / 100 2.10 12
matt2.info 20 / 100 16 / 100 1.60 51
seandelahanty.org 18 / 100 19 / 100 1.90 36
forbrent.com 16 / 100 11 / 100 1.10 0
countyattorney.info 14 / 100 10 / 100 1.00 0
seodata.cloud 12 / 100 16 / 100 1.60 11
chriscoffman.rocks 8 / 100 6 / 100 0.60 4

Feel free to look into them and make some assumptions.  You may find the tools I have discussed previously useful in "auditing" the domains.  Much like a Scientologist audits a person with toys and wild claims, so does your SEO, and look, we have just about as many acronyms.  Now go and let your Thetan figure out where the keywords were in the graph.  Spoiler: anchor text that you had no control over… officially.  For further reading on the power of anchor text: in 2004, "miserable failure" ranked George W. Bush's White House biography page, and no one optimized it for that.


Related posts:

Web Site SEO Visualizations

I’m a visual person as I believe most of us are which is why I am fascinated by these images of sites that I have visualized.  Green dots are working pages, blue dots are redirects and orange dots are pages that don’t exist or are forbidden.  The connecting lines represent link structure of a site.  The larger a dot the more traffic it receives.

[…]