So, first of all: How to find good domains?
A good domain is defined by its backlinks. Only backlinks define whether a domain is good or bad.
So we check backlinks and the site history. Not any metrics.
Rule #1 of checking domains:
Metrics Don’t Matter. No Metric Matters. Only Backlinks Do.
And also the Site History: whether the site has some spam history, PBN history, redirect history, doorway history, and so on.
But the most important – backlinks.
So, how do we check backlinks? We take a domain and check its backlinks. It’s manual work. There is no automatic way to do it. We can automate and robotize different stages of the process, but not the actual manual checking of a domain’s backlinks. This is the most important part. And if you delegate this task to robots, scripts, and automatic software, you will fail. Because only a human eye is able to evaluate whether the backlinks are good or not.
But all the preparation work you can delegate to bots. And here we use SEO metrics. SEO metrics are merely a sign of whether it’s worth spending time checking the domain’s backlinks. Nothing more than that!
The more we rely on SEO metrics, the less accurate we are: the fewer good domains we find, and the more good domains we miss.
And if we don’t trust SEO metrics at all, we miss zero. But this would be very time-consuming. Checking all 100K daily drops one by one would take probably a week, and your sanity.
So we take some SEO filters, some metric filters, and using those filters we decide whether it’s worth taking a certain domain and checking it. And the tighter the filter, the higher the chance we miss some good domain, or even a great one. Because often good domains just don’t have good SEO metrics.
Bottom line: we need to find a balance – how much time can we dedicate to domain checking daily?
So I choose, for example, to spend two hours daily and to miss a certain number of good domains daily. But I don’t regret it, because I still need time for other tasks and jobs, for my life, for my family.
This is basically the first and most important rule we should all remember. Often we see these revelations of SEO Gurus – long reads, long blueprints, long PDFs, which they may even charge huge money for – and they say, for example: “You need to take domains with DA 22+, or 15+, or whatever, and the number of .edu backlinks should be a minimum of 3. Or 10. And then your site will be at the 1st position in the SERP. In Google.” And I can tell exactly when one of these SEO Gurus releases, or sells, his new revelations, because we start getting many requests of the same kind: “We need domains with DA 25+ and .edu backlinks 10+.” Why?? What do people want to get with domains chosen by metrics? Nothing. They don’t know. They just read some SEO Guru stuff.
Don’t be like these people! Remember: Only Backlinks Matter.
So what we should do is take our domains from whatever list and check their backlinks in Ahrefs.
And how do we choose them? By names? To go domain by domain through 100K names? As I said, it could take a week.
So we use some SEO metrics. And here, SEO metrics are our tool, and nothing more. Our tool for filtering domains, to free our time spent on domain research.
There are a lot of SEO metrics, from Ahrefs, Majestic, Moz, SimilarWeb, etc. And there are never-ending arguments about which SEO metrics are better, more relevant for defining whether a domain is good or not.
We find that DA, Domain Authority by Moz, is quite OK for filtering the results; maybe it’s not very relevant to the real domain power, but it’s actually quite good for cutting off bad domains.
I personally love DR, Domain Rating by Ahrefs. It’s very good. But there are a lot of good domains that have low DR, so we cannot fully rely on this metric.
We can take an endless number of metrics – all metrics by Moz, all by Ahrefs, all by Majestic, then SimilarWeb, Alexa, and all the other stuff. But if we take like 50 different metrics and put them into an Excel sheet, we get a mess, a huge mess that we just cannot operate.
So we need to limit these metrics somehow, because with tens of metrics our eyes can’t really pick anything out. On the other hand, one metric would clearly not be enough; we need a broader picture. Just imagine if we had only DA for all of these. You would have to open every domain and just guess.
So: the more data you have, the more time you save. But within limits. Not too little data, and not too much.
So we have our own set of reliable metrics. We take our domain list and get those metrics for these domains. And then we compile an Excel sheet with this data.
I won’t tell you how to get these metrics, because everybody decides that for themselves. The obvious way is to purchase subscriptions to all these services – Ahrefs, Majestic, Moz, etc. – but it might be very costly, more than a thousand dollars per month, and not everybody’s business can afford that. So there are different sites, tools, and services that gather this data and give it to you for some subscription price.
Which metrics do we have in our sheet? First of all, Moz DA. Moz metrics are not as relevant, not as powerful as they used to be a few years ago, but still, as I said earlier, DA is a very good indicator, and it’s very good for cutting off bad results.
Then we sort the results, and in this dataset I use a cut-off of 20. DA 20 is my floor, my lowest point at which I’m willing to spend time on a domain. I know I may miss some domains here, because there are still a lot of domains in the DA 15-20 range which might be good, but the time I would spend investigating those domains would be huge. So I’ve decided for myself that I don’t take domains with DA lower than 20. Your personal floor may be different.
Then, in order to clear some clutter, we delete insignificant values. So another cut-off for me is 24. It’s my personal number of choice; you can choose your own, of course – like a floor of 15 and a secondary cut-off of 20. I use 20 and 24. DA values of 24 and below I just delete from the sheet (the rows stay, only the cells are blanked). What this gives me is less clutter; the sheet becomes easier to work with. And for every domain that doesn’t show a DA, I know it’s still in the 20-24 range. I just don’t keep those values visible, so they don’t distract my eye.
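As a rough illustration, the two DA cut-offs can be sketched in a few lines of Python. The thresholds (20 and 24) come from the text; everything else (the row format, the sample values) is hypothetical.

```python
# A sketch of the two DA cut-offs described above: rows below the
# floor (20) are deleted entirely, and DA values of 24 and below are
# blanked so they don't clutter the sheet. The sample rows are made up.

DA_FLOOR = 20        # rows below this DA are deleted
DA_BLANK_UP_TO = 24  # DA values up to this are blanked, the row stays

domains = [
    {"name": "example-one.com", "da": 17},
    {"name": "example-two.com", "da": 22},
    {"name": "example-three.com", "da": 31},
]

# Step 1: delete everything below the floor.
kept = [d for d in domains if d["da"] >= DA_FLOOR]

# Step 2: blank insignificant values to reduce visual clutter;
# a blank cell still means "somewhere in the 20-24 range".
for d in kept:
    if d["da"] <= DA_BLANK_UP_TO:
        d["da"] = None

print(kept)
```

In a real sheet you would do the same with a filter and a conditional find-and-replace, but the logic is identical.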
Then we have Majestic. And here is our secret, which I can now reveal to you. Everybody who has ever bought domains from us, Domains R Forever, knows it: we use www for Trust Flow. And I don’t think that anyone else, any other service, does it. And it’s strange. If you read some SEO Gurus, they might say that TF-www doesn’t matter, because it only shows the rank of a subdomain. And technically that’s true – www is a subdomain. So maybe in some sense they are right, for some SEO purposes it’s like this, but for finding domains TF-www definitely matters. Because if you take a domain and check its metrics, you can very often see that TF-www is much higher than TF, which means that most backlinks, while the site was live, led to its www version – that’s why it’s higher.
Examples of TF-www > TF
So if you don’t have this in your dataset, you will actually miss a lot of domains. I tell you: a lot. With my method, you get them.
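To show why TF-www matters for filtering, here is a small sketch: a filter on root TF alone drops domains whose links pointed at the www version. The column names and all values are invented for illustration.

```python
# Sketch: why TF-www matters. A filter on root TF alone would drop
# domains whose link profile pointed at the www version. All values
# are made up; "tf" / "tf_www" are just column names in this sketch.

TF_CUTOFF = 5

rows = [
    {"name": "alpha.com", "tf": 2,  "tf_www": 19},  # missed by a TF-only filter
    {"name": "beta.com",  "tf": 14, "tf_www": 3},
    {"name": "gamma.com", "tf": 1,  "tf_www": 2},
]

tf_only  = [r["name"] for r in rows if r["tf"] >= TF_CUTOFF]
with_www = [r["name"] for r in rows
            if max(r["tf"], r["tf_www"]) >= TF_CUTOFF]

# Domains a TF-only filter would have thrown away.
rescued = sorted(set(with_www) - set(tf_only))
print(rescued)
```

The `max(tf, tf_www)` trick is one simple way to combine the two columns into a single filter; you can also keep both columns visible and eyeball them.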
So this is my second secret. The first secret is: Metrics Don’t Matter. And the second secret: Always use TF-www.
Actually, I don’t know of any service that gives you this metric, so you’ll have to get it somehow yourselves.
And here we also make a cut-off; I take, for example, 5. A couple of years ago TF was much more relevant. If I remember well, the cut-off was about 18 or 20, so below 20 it was not worth checking. But then Majestic changed their algorithm again, and many good domains fell into the 0-5 range. So actually, now 5 is not so bad.
For TF-www we also use a cut-off of 5.
And then we have another great metric: indexing in Google. Many people search for indexed domains. It makes perfect sense, but. Many good domains out there are not indexed yet have quite good metrics. Many of these domains are de-indexed not because of sanctions or bans by Google; they are de-indexed simply because they sat there idle, without any site, for a long time. Just because of that, they’ve dropped from the index. And in most cases, when you take a good domain and build a good site on it, or rebuild a site on it, it gets back into the index very fast. That’s my secret number 3. In most cases. Not 100% of cases, but most.
So you shouldn’t worry much about a domain being de-indexed. Of course, if some other signs are bad – for example, the domain history contains some redirect or similar stuff – maybe it’s quite toxic. But if everything looks good and it’s de-indexed, take it. As I said, in most cases it’ll jump back into the index soon. If not, you can always file for reconsideration.
It actually depends on what site you build on this domain – on the quality of that site. There are different kinds of sites. If you take this domain to build a good white-hat site, an informational site with good content, useful for visitors, you really don’t need to worry about a zero in the Index cell. You can always send it for reconsideration and say, “I’m not responsible for what was on this site before me, I just bought this domain.”
But if you build something weaker, like a PBN site, it will depend on the quality of that PBN. If you built your PBN on copy-pasted, scraped, or machine-generated content, you obviously cannot send it for reconsideration. If you build a doorway site, you of course cannot send it for reconsideration either. And if you don’t build a site at all, but use the domain for a 301 redirect, you just cannot take a de-indexed domain.
So this is, in a way, paradoxical: the better the site you build, the less you need to worry about its indexing or de-indexing. And vice versa.
But even if the domain is in the index, you have no guarantee that it won’t drop out. So it’s always a kind of lottery.
And then we have a very important one: the Age. In our case, the age is the date of the first appearance in the web archive. Archive.org has data from ’96, and a few times I’ve stumbled on sites from ’95. But there are many sites that existed even before ’96 or ’95, and archive.org just doesn’t have their history.
It’s an important metric. Not for your SEO – for Google results it doesn’t matter. But it’s important for us, for our decision whether to look at the domain.
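If you want to pull the Age column yourself, the Wayback Machine has a public CDX API that can return the earliest snapshot of a URL. The sketch below only builds the query and parses the timestamp; the actual HTTP request and error handling are left out, and you should verify the parameters against the CDX documentation.

```python
# Sketch: getting the "Age" column from archive.org. The Wayback
# Machine's CDX API can return the oldest snapshot for a URL; here we
# only build the query string and parse the returned timestamp. The
# HTTP call itself (urllib, requests, etc.) is left to the reader.
from urllib.parse import urlencode

CDX_ENDPOINT = "http://web.archive.org/cdx/search/cdx"

def first_snapshot_query(domain: str) -> str:
    """URL asking the CDX API for the oldest snapshot's timestamp only."""
    params = {"url": domain, "fl": "timestamp", "limit": "1", "output": "text"}
    return CDX_ENDPOINT + "?" + urlencode(params)

def snapshot_year(timestamp: str) -> int:
    """CDX timestamps look like 19961101123456 (YYYYMMDDhhmmss)."""
    return int(timestamp[:4])

print(first_snapshot_query("example.com"))
print(snapshot_year("19961101123456"))
```

Run over your whole list (with a polite delay between requests), this fills the Age column automatically.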
Then we have a set of data from Ahrefs. First of all, as I mentioned, Domain Rating. For me, it is the most relevant metric of all. But for preliminary filtering, so to say, DA serves better. Still, DR is a very important landmark.
Good domains usually start from DR 20. But many good domains have lower DR, down to even 3. So what we do here: we cut off below 2, and from 20 up we highlight with color. Actually, you can format your sheets as you like; the most important thing is to make your data visually clear and intuitive.
And then we have another important one: RD, Referring Domains by Ahrefs. You can use the referring-domains count from Majestic or from Ahrefs; they show more or less the same. Here I use an obvious cut-off at 20. Even 30-40 is often not worth it. But sometimes you have domains with really good backlinks, like .gov or .edu backlinks, that have only about 20 referring domains in total. I would take such a domain any day. All the domains below 20 I delete.
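The RD cut-off is mechanical and easy to sketch. The .gov/.edu flag in this sketch is my own illustration of the caveat above (a domain with few referring domains can still carry great links and deserves a manual look), not a rule from the text; all sample values are hypothetical.

```python
# Sketch of the RD cut-off at 20. The gov/edu flag only illustrates
# the caveat that a low-RD domain can still carry great links; all
# sample values are hypothetical.

RD_CUTOFF = 20

rows = [
    {"name": "smallgem.com", "rd": 21,  "gov_edu_rd": 3},
    {"name": "thin.com",     "rd": 8,   "gov_edu_rd": 0},
    {"name": "big.com",      "rd": 140, "gov_edu_rd": 0},
]

kept = [r for r in rows if r["rd"] >= RD_CUTOFF]

# Flag kept rows whose .gov/.edu links make them a priority to check.
priority = [r["name"] for r in kept if r["gov_edu_rd"] > 0]

print([r["name"] for r in kept], priority)
```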
And then I also delete all the empty rows below, and now my scrollbar is here, at the bottom. So the sheet is now much easier to work with; you can use scrolling.
And then we have some very secondary metrics. We could skip them, but they give some indication. These are Ahrefs .gov and .edu referring domains. They don’t tell us much, because all these .gov and .edu backlinks might be junk: university forum profiles, redirects, or whatever.
Also, Ahrefs keywords. This is not the actual Google index – especially for dropped domains these keywords are not relevant – but it’s a nice indicator anyway.
And I don’t take any other metrics. Many people love SimilarWeb. But the SimilarWeb metric can be easily inflated. If you take a really good domain, you’ll see that its SimilarWeb rank is low. But if you take just any domain with a low SimilarWeb rank, the chances are the domain is simply bad. It’s easy to manipulate – much easier than DA, for example, and of course than DR. The same goes for Alexa and many other metrics.
Also, many people like to use the CF/TF ratio. You can see that I don’t use CF at all. I just don’t like it; I could never see any meaning in it. The CF/TF ratio can be really good, though. I just don’t like to have it in my datasheet – and with our two variants of TF, with and without www, it would get really difficult, because we would need two different ratios, with and without www. That would be a lot of visual clutter.
Some people like to use another ratio – Ahrefs Domain Rating to URL Rating. It’s also a good idea, but this set is really enough for me.
And here we have the Alexa rank. It can also be easily manipulated, so a low Alexa for some domain doesn’t mean the domain is good. But when you see many good metrics, and Alexa is low as well, the chances the domain is good are very high.
So, I’ve explained everything we have in our setup.
Then I apply the following sorting: first Age from oldest, then Alexa from lowest, then TF from highest, and finally DA from highest. But you, of course, may use any order you like. So now I have the final file I work with.
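The final ordering above can be done as a single multi-key sort; in plain Python you flip the direction of a numeric column by negating its key. The rows below are invented for illustration.

```python
# Sketch of the final sort order described above: Age ascending
# (oldest first), then Alexa ascending (lowest first), then TF and
# DA descending. Negating a numeric key flips its sort direction.

rows = [
    {"name": "a.com", "age": 2003, "alexa": 900000, "tf": 12, "da": 25},
    {"name": "b.com", "age": 1998, "alexa": 500000, "tf": 7,  "da": 31},
    {"name": "c.com", "age": 1998, "alexa": 500000, "tf": 22, "da": 20},
]

rows.sort(key=lambda r: (r["age"], r["alexa"], -r["tf"], -r["da"]))
print([r["name"] for r in rows])  # -> ['c.com', 'b.com', 'a.com']
```

In Excel or Google Sheets this is just a multi-level custom sort with the same four columns.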
In the video, I show many examples of how I really work with investigating the domains. It would be difficult to tell it all here, so you might want to watch the video or to read the full transcript; here I’m only able to explain some main points.
So the scheme is like this: we seek in our sheet for domains with consistent metrics;
then we take these domains and check them in:
- Ahrefs for backlinks
- Ahrefs for anchors
- Archive.org for toxic history
- Google for what is there in the index
DA. Very rarely can you find a really good drop with DA > 50, because that just doesn’t happen. And when it does happen, you can’t really grab such domains, because of course Dropcatch or Snapnames will grab them, and we see auctions for thousands of dollars for such domains. But often these DA values are wrong, because of redirects: if the domain was redirected in the last stages of its life, its DA might actually show the DA of the target domain, not of this domain.
Of course, Ahrefs is very expensive to use. If you are on a low budget, you can use some alternatives, like Majestic, which I think is cheaper but for me is not as good, not as handy to work with, and its output is much less intuitive. For example, in Ahrefs I see all domains sorted by DR and immediately see if a domain is good or bad (of course, then I still check the archive, etc.):
And in Majestic it’s difficult to understand what you are looking at:
Or, if you are on a really low budget, there are some free alternatives, like Openlinkprofiler or Linkpad, which you can start with. And then, as soon as you can scrape together some budget, you should buy an Ahrefs subscription.
So this is how I check domains. The more domains you check – really, every day – the more experience you get. I can see if a domain is good or not just by looking at its referring domains, but that’s the experience of years.
A never-ending discussion: is it okay to take domains that were used for PBNs? As for me, if the domain is really good, I would take it for my own projects; I don’t mind PBN history. My position: backlinks are backlinks, they are there, and I don’t care what one of the previous owners built on the domain – I can use it for my projects. But many people don’t like such domains, and I think it’s a kind of common misbelief. Our vast experience tells us that PBN history is a very insignificant factor for domain power.
So, as I said, you can find more stories and domain cases in my video. Here I could only highlight some theory. I hope you enjoyed it, and I wish you all a lot of success. I’d be glad to answer your questions as well.