Crush Rankings in Just 30 Days with SEO
Note: This is a featured post from my newsletter.
For what it's worth, I drive millions of dollars of traffic for some of the most competitive keywords online, including with new sites, so if you came here to comment "this won't work," you're wrong. It's hard work, but it's not unreasonably hard. So much so that I decided to call this piece "SEO is Not Hard."
SEO is Not Hard — A step-by-step SEO Tutorial for beginners that will get you ranked every single time
SEO In One Day
SEO is simply not as hard as people pretend it is; you can get 95% of the results with 5% of the effort. You absolutely do not need to hire a professional SEO to do it, nor will it be hard to start ranking for well-picked keywords.
Of all the channels we'll be discussing, SEO has the most misinformation around it. Some of it is subtle, but some of it is widely spread and believed by so-called SEO consultants who don't actually know what they're doing.
SEO is very simple, and unless you're a very large company it's probably not worth hiring somebody else to do. It also has a lot of false mystique around it: consultants want it to seem incredibly difficult so they can charge you a lot. I'll show you exactly how to do it, step by step, and you'll win.
How Google Works
In order to understand what we need to do for SEO, let's look back at how Google started, how it's evolving today, and develop a groundwork from which we can understand how to get ranked on Google.
First we're going to reverse engineer what Google is doing, then simply follow its rules: pick the right keywords and get your site ranked.
The Early Days of Google
The idea for PageRank — Google's early ranking algorithm — stemmed from Einstein. Larry Page and Sergey Brin were students at Stanford, and they noticed how often scientific studies referred to famous papers, such as the theory of relativity. These references acted almost like votes: the more your work was referenced, the more important it must be. If they downloaded every scientific paper and looked at the references, they could theoretically decide which papers were the most important, and rank them.
They realized that because of links, the Internet could be analyzed and ranked in a similar way, except instead of using references they could use links. So they set about attempting to “download” (or crawl) the entire Internet, figuring out which sites were linked to the most. The sites with the most links were, theoretically, the best sites. And if you did a search for “university,” they could look at the pages that talked about “university” and rank them.
On-Page SEO
The first step in getting our site ready to rank is making it clear to Google what our site is about.
For now we’re going to focus our home page (our landing page) on ranking for one keyword that isn’t our brand or company name. Once we do that and get that ranking we can branch out into other keywords and start to dominate the search landscape, but for now we’ll stay laser focused.
Keyword Research
The first thing we need to do is figure out what that keyword is. Depending on how popular our site is and how long it's been around, the level of traffic and difficulty we'll get from this effort may vary.
The Long Tail
There’s a concept we need to be familiar with known as the “long tail.”
If we were to graph the "popularity" of most things, with popularity on the Y axis and rank order on the X axis, we'd get something like a power law curve: a few big hits that get the majority of attention, after which the graph falls sharply. The long-tail theory says that as we become more diverse as a society, the tail end of that graph will stretch out further and get thicker.
In SEO this matters because, at least in the beginning, we're going to go after long-tail keywords — very exact, intention-driven keywords with lower competition that we know we can win — and then gradually work our way to the left.
Our site isn’t going to outrank ultra-competitive keywords in the beginning, but by being more specific we can start winning very targeted traffic with much less effort.
These are the "long-tail keywords" we'll be looking for.
Finding the Long Tail
In order to find our perfect long-tail keywords, we’re going to use a combination of four tools, all of which are free.
The process looks like this:
Use UberSuggest, KeywordShitter and a little bit of brainstorming to come up with some keywords
Export those keywords to the Google Keyword Planner to estimate traffic level
Search for those keywords with the SEOQuake Chrome extension installed to analyze the true keyword difficulty
Don't be intimidated — it's actually very simple. For this example we'll pretend we're finding a keyword for this book (and we'll probably have to build out a site, so you can check whether we're ranked for it in a few months).
Step 1: Brainstorming and Keyword Generating
In this step we’re simply going to identify a few keywords that seem like they might work. Don’t concentrate too much on culling the list at this point, as most bad keywords will be automatically eliminated as a part of the process.
So since this is a book about growth hacking, I’m going to list out a few keywords that would be a good fit:
Growth hacking
Growth marketing
Internet marketing
Growth hacking guide
Growth hacking book
Book about growth hacking
What is growth hacking
Growth hacking instructions
That’s a good enough list to start. If you start running out of ideas go ahead and check out keywordshitter.com. If you plug in one keyword it will start spitting out thousands of variations in just a few minutes. Try to get a solid list of 5–10 to start with.
Now we’ll plug each keyword into UberSuggest. When I plug the first one — “growth hacking” — in, I get 246 results.
Clicking “view as text” will let us copy and paste all of our keywords into a text editor and create an enormous list.
Go through that process with each keyword you came up with.
Now we’ll assume you have 500+ keywords. If you don’t, try to start with something more generic and broad as a keyword, and you’ll have that many quickly. Ideally you’ll have over 1500.
Step 2: Traffic Estimating
Now that we have a pretty good list of keywords, our next step is to figure out whether they have enough search volume to be worth our while.
You’ll likely notice that some are so far down the long tail they wouldn’t do much for us. For example, my growth hacking list came up with “5 internet marketing techniques.” We probably won’t go after that one, but instead of guessing we can let Google do the work for us. This will be our weeding out step.
Google Keyword Planner
The Google Keyword Planner is a tool meant for advertisers, but it does give us some rough idea of traffic levels.
Google doesn’t make any promise of accuracy, so these numbers are likely only directionally correct, but they’re enough to get us on the right track.
You'll have to have an AdWords account to use the tool, but you can create one for free if you haven't used AdWords in the past.
Once you’ve logged in, select “Get search volume data and trends.”
Paste in your enormous list of keywords, and click “Get search volume.” Once you’ve done so, you’ll see a lot of graphs and data.
Unfortunately the Keyword Planner interface is a bit of a nightmare to work in, so instead we're going to export our data to Excel with the "download" button and play with it there.
Now what we’re going to do is decide what traffic we want to go after.
This varies a bit based on how much authority your site has. So let’s try to determine how easy it will be for you to rank.
Go to SEMrush.com, enter your URL, and look at the total number of backlinks in the third column.
As a general rule (this may vary based on how old your site is, who the links are from, etc.), the number of backlinks you have sets the maximum keyword "difficulty" you should go after:

Fewer than 30 backlinks: difficulty up to 40
Fewer than 100 backlinks: difficulty 40–50
Fewer than 1,000 backlinks: difficulty 50–70
1,000+ backlinks: difficulty 70+
Go ahead and sort the data by difficulty, and eliminate all of the stuff that is too high for your site (don’t worry, we’ll get those keywords later). For now you can simply delete those rows.
Exact Match
One important thing to note is that Google reports this volume as "exact match" volume: close variations and synonyms of a keyword are counted, but searches that merely contain the keyword inside a longer phrase are not, so the real traffic will be somewhat higher than the number you see.
Now, with that disclaimer, sort by traffic volume from highest to lowest, and from this data pick out five keywords that seem like a good fit.
Here are mine:
growth hacking strategies
growth hacking techniques
growth hacking 101
growth hacking instagram
growth hacking twitter
Mine all look similar, but that won't necessarily be the case for you.
Keyword Trends
Unfortunately the “keyword difficulty” that Google gives us is based on paid search traffic, not on natural search traffic.
First, let's use Google Trends to view keyword volume and trajectory simultaneously. You can enter all of the keywords at the same time and see them graphed against each other. For my list, the two I'm most excited about are "growth hacking techniques" and "growth hacking Twitter."
Now we’ll take a deeper look at what the competition is like for those two keywords.
Manual Keyword Difficulty Analysis
In order to analyze how difficult it will be to rank for a certain keyword, we’re going to have to look at the keywords manually, one by one. That’s why we started by finding some long-tail keywords and narrowing the list.
This process gets a lot easier if you download the SEOQuake Chrome extension. Once you’ve done that, do a Google search and you’ll notice a few changes.
With SEOQuake turned on the relevant SEO data of each site is displayed below each search result.
We're going to alter what's displayed, so in the left-hand sidebar click "parameters" and enable the data points described below.
Now when you search, SEOQuake adds a ranking number to each result, along with the following data at the bottom:
The Google Index: This is how many pages from this base URL Google has indexed
Page Links: The number of pages linking to the exact page that is ranking, according to SEMrush's index (usually very low compared to reality, but since we'll only be using this number for comparison, it will be somewhat apples to apples)
URL Links: The number of pages pointing to any page on the base URL
Age: The first time the page was indexed by the Internet Archive
Traffic: A very rough monthly traffic number for the base URL
Looking at these we can try to determine approximately what it would take to overtake the sites in these positions.
You'll notice that these indicators carry different weights: not all links come from equally good sources, direct page links matter much more than URL links, and so on. But if you google around and play with it for a while, you'll get a pretty good idea of what it takes.
If you have a brand new site it will take a month or two to start generating the number of links to get to page one. If you have an older site with more links it may just be a matter of getting your on-page SEO in place. Generally it will be a mixture of both.
Keep in mind that we’re going to optimize our page for this exact keyword, so we have a bit of an advantage. That said, if you start to see pages from sites like Wikipedia, you will know it’s an uphill battle.
Here are a couple of examples so you can see how you should think through these things, starting with “Growth hacking techniques.”
Entrepreneur.com is definitely a big name, and "growth hacking techniques" is in the title explicitly. This will be difficult to beat, but there are no links in the SEMrush index that point directly to the page.
(By the way, I wonder how hard it would be to write an article for entrepreneur.com — I could probably do that, build a few links to it easily, and even link to my site in the article.)
Yongfook.com, I've never heard of that site. 206 total links, not much traffic; this one I could pass. It does have quite a bit of age and "growth hacking techniques" explicitly in the title, which makes it tougher, but it's doable to pass after a while.
Alright, so Quicksprout is relatively popular: a lot of links, good age, lots of traffic, and a few links direct to the page, but not a ton.
But the word "techniques" doesn't even appear here. This page isn't optimized for this keyword, so I could probably knock it out by optimizing specifically for "growth hacking techniques."
On-Page SEO
Now that we have our keyword selected, we need to make sure Google knows what our site is about. This is as simple as making sure the right keywords are in the right places. Most of this has to do with HTML tags, which make up the structure of a webpage. If you don't know HTML or understand how it works, just pass this list to a developer and they should be able to help you.
Here is a simple checklist you can follow to see if your content is optimized.
On-Page SEO Checklist
☐ Your keyword is in the <title> tag, ideally at (or as close as possible to) the very front
☐ The title tag is under the viewable limit of 65 characters (optional, but recommended)
☐ Your keyword is in the first <h1> tag (and your page has an <h1> tag)
☐ If your page contains additional header tags (<h2>, <h3>, etc) your keyword or synonyms are in most of them
☐ Any images on the page have an alt attribute containing your chosen keyword
☐ Your keyword is in the meta description (and there is a meta description)
☐ There are at least 300 words of text on the page
☐ Your keyword appears in the URL (if not the homepage)
☐ Your keyword appears in the first paragraph of the copy
☐ Your keyword (or synonyms — Google recognizes them now) is used other times throughout the page
☐ Your keyword density is between 0.5% and 2.5% (on a 600-word page, that's roughly 3 to 15 uses)
☐ The page contains dofollow links to other pages (this just means you’re not using nofollow links to every other page)
☐ The page is original content not taken from another page and dissimilar from other pages on your site
If you have all of that in place you should be pretty well set from an on-page perspective. You’ll likely be the best-optimized page for your chosen keyword unless you’re in a very competitive space.
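To make the checklist concrete, here's a minimal sketch of what those pieces look like in place. The keyword, copy, and URLs are placeholders from our running example, not something to copy verbatim:

<html>
<head>
  <!-- Keyword at the front of the title, under the ~65-character limit -->
  <title>Growth Hacking Techniques: A Step-by-Step Guide</title>
  <!-- Keyword in the meta description -->
  <meta name="description" content="The growth hacking techniques we used to grow brand-new sites from zero traffic.">
</head>
<body>
  <!-- Keyword in the page's single h1 -->
  <h1>Growth Hacking Techniques</h1>
  <!-- Keyword in the first paragraph of copy -->
  <p>These growth hacking techniques have been tested on real products...</p>
  <!-- A synonym in a subheader -->
  <h2>Growth Marketing Tactics That Still Work</h2>
  <!-- Keyword in the image's alt attribute -->
  <img src="funnel.png" alt="growth hacking techniques funnel">
  <!-- A normal (dofollow) internal link -->
  <a href="/growth-hacking-tools">Growth hacking tools</a>
</body>
</html>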
All we have left now is off-page optimization.
Off-Page SEO
Off-Page SEO is just a fancy way to say links. (Sometimes we call them backlinks, but it’s really the same thing.)
Google looks at each link on the web as a weighted vote. If you link to something, in Google’s eyes you’re saying, “This is worth checking out.” The more legit you are the more weight your vote carries.
Link Juice
SEOs have a weird way to describe this voting process; they call it “link juice.” If an authoritative site, we’ll say Wikipedia for example, links to you, they’re passing you “link juice.”
But link juice doesn’t only work site to site — if your homepage is very authoritative and it links off to other pages on your site, it passes link juice as well. For this reason our link structure becomes very important.
Checking Link Juice
There are a number of tools that let you check how many links are pointing to a site and what the authority of those pages is. Unfortunately none of them is perfect; the only way to know exactly which links point to your site is to have crawled every page that might contain one.
Google crawls most popular pages several times per day, but they don't want you manipulating them, so they update their link data pretty slowly.
Link Structure
HTML links look something like this:
<a href="http://www.somesite.com" title="keyword">Anchor text</a>
Here http://www.somesite.com is the place the link directs you to, the title attribute is largely a remnant of times gone by, and the linked text — the words that show up blue and clickable — is called the "anchor text."
In addition to the amount of link juice a page has, the relevance of the anchor text matters.
Generally speaking, you want to use your keyword as the anchor text for your internal links whenever possible. External links (from other sites) shouldn't be heavily optimized for anchor text: if 90% of your links share the same anchor text, Google can throw a red flag and assume you're doing something fishy.
Whenever I'm creating links myself (like we'll show you later), I only ever use something generic: the site name, "here," or the full URL.
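As a quick illustration, here's the difference in practice (the URLs and page names are made up):

<!-- Internal link: keyword-rich anchor text is fine -->
<a href="/growth-hacking-techniques">growth hacking techniques</a>

<!-- A link you build on an external site: keep the anchor generic -->
<a href="http://www.mysite.com">mysite.com</a>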
Internal Structure
Generally speaking, you don't want orphan pages (pages that no other page links to), nor do you want an overly messy link structure.
Some say the ideal link structure for a site is a tidy tree: the homepage links out to a handful of category pages, each of which links down to its own pages.
That's close, but it gets a couple things wrong. First, you'll never have a structure that organized, and second, in an ideal world every page would also link to every other page on its same level. This can easily be done with a footer that feels like a sitemap or a list of "recommended" pages. That allows you to specify anchor text and pass link juice freely from page to page.
Unfortunately it’s impossible to draw such a web without it becoming a mess, so you’ll just have to imagine what that actually looks like.
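The footer itself is simple, though. A hypothetical sitemap-style footer for a three-page section might look like this (page names are made up):

<!-- Every page in the section links to its siblings with keyword anchor text -->
<footer>
  <a href="/growth-hacking-techniques">Growth hacking techniques</a>
  <a href="/growth-hacking-tools">Growth hacking tools</a>
  <a href="/growth-hacking-examples">Growth hacking examples</a>
</footer>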
We have just one more thing to go over before we start getting those first links pointing to our site.
Robots.txt, disavow, nofollow, and other minutiae
Most of SEO at this point is managing the things that can go wrong. There are a lot of those, but we'll go over what covers 99% of needs, and you can Google anything really exotic.
Robots.txt
Almost every site has a page at url.com/robots.txt — even Google has one.
This is just a plain text file that lets you tell search engine crawlers what to crawl and not to crawl. Most are pretty good about listening, except the Bingbot, which pretty much does whatever it wants no matter what you tell it. (I’m mostly kidding.)
If you don’t want Google to crawl a page (maybe it’s a login page you don’t want indexed, a landing page, etc.) you can just “disallow” it in your robots.txt by saying disallow: /somepage.
If you add a trailing / to it (e.g. disallow: /somepage/) it will also disallow all child pages.
Technically you can specify different rules for different bots (or user agents), but it’s easiest to start your file with “User-agent: *” if you don’t have a need for separate crawling rules.
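Putting those pieces together, a minimal robots.txt might look like this (the paths here are placeholders):

# Rules for all crawlers
User-agent: *
# Block a single page
Disallow: /login
# Block a directory and all of its child pages
Disallow: /drafts/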
Disavow
Google will penalize spammy sites, and unfortunately this causes some bad behavior from bad actors. Say, for example, you wanted to take out a competitor. You could send a bunch of obviously spammy links to their site and get them penalized. This is called “negative SEO,” and is something that happens often in highly contested keywords. Google generally tries to pretend like it doesn’t happen.
In the case that this does happen, however, you can “Disavow” links in the Search Console, which is pretty much saying, “Hey Google, don’t count this one.” I hope you’ll never have to use it, but if you hire (or have hired) a bad SEO or are being attacked by a competitor, that is how you combat it.
Nofollow
A link can have a property called “nofollow” such as this:
<a href="http://www.somesite.com" title="keyword" rel="nofollow">Anchor text</a>
If you want to link to somebody without it counting as a vote (that is, without passing link juice), or you support user-generated content and want to deter spammers, you can use a nofollow link. Google says it discounts the value of those links. I'm not convinced they discount them heavily, but other SEOs are, so nofollow seems to deter spammers if nothing else.
Redirects
If you’re going to change a URL, but you don’t want its link juice to disappear, you can use a 301 redirect. A 301 will pass a majority of the link juice.
Importantly, Google views www.austenallred.com and austenallred.com as different sites. So decide on one, and redirect all of one type to the other.
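How you set that up depends on your server. As one sketch, assuming an nginx server and the domain above, the www-to-bare-domain redirect could look like this (Apache and most hosting dashboards have equivalents):

# Send every www request to the bare domain with a 301
server {
    listen 80;
    server_name www.austenallred.com;
    return 301 http://austenallred.com$request_uri;
}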
Canonical URLs
If you have two pages that are virtually the same, you can add something like <link rel="canonical" href="https://www.someurl.com/somepage"> to the duplicate's <head> to say, "hey, treat this page as if it were that page instead, but I don't want to 301 it."
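For instance, if a tracking-parameter version of a page duplicates the original, the duplicate's head would carry (URL is a placeholder):

<head>
  <!-- Credit all ranking signals to the canonical version of this page -->
  <link rel="canonical" href="https://www.someurl.com/somepage">
</head>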
And with that, we’re ready to build our first links.
Link Building
Link building is where SEO really starts to matter, and where a lot of people end up in a world of hurt.
The best way to build links is to not build links. I've worked for companies that never have to ask for them; links just flow in from press, customer blogs, and their own awesome blog posts. If that's an option for you (and we'll go over a couple of ways to make it more likely), you're in a great place.
If not, at least in the beginning, we’re going to manually create just a few.
We’re going to create them in legitimate ways and not hire somebody in India to do so. That is a recipe for disaster, and I can’t even count the number of times I’ve seen that take down a site.
Web 2.0s
The easiest way to build high-quality links is what SEOs call "web 2.0s" — just a way of saying "social sites," or sites that let you post your own content. Tweeting a link into the abyss won't do anything for you, but profiles, status pages, etc. do carry some weight, and if they come from a popular domain, each one counts as a link.
Some of the easiest are:
Twitter (in your bio)
Github (the readme of a repo)
YouTube (the description of a video — it has to actually get views)
Wordpress (yes, you’ll have to actually create a blog)
Blogger (same here)
Tumblr
Upvote-based sites (Hacker News, GrowthHackers, Inbound.org, Reddit, etc.)
If nothing else you can start there and get a half dozen to a dozen links. There are always big lists of "web 2.0s" you can find online, but keep in mind that if you're going to build something on a blogging platform, you'll have to really build it out. That's a lot of content and time, but you have to do it the right way.
We generally keep a bigger list of Web 2.0s here. Some may be out of date, but you should probably only build a half dozen to a dozen Web 2.0s anyway.
Expired Domains
Another way to get link juice is by purchasing an expired domain. This is more difficult to do, but there are a lot of options such as expireddomains.net. (Google “expired domains” and you’ll find dozens of sites monitoring them.)
You'll want to purchase a domain that has expired and restore it as closely as you can to its original form using the Internet Archive. These sites likely have some link juice to pass on, and you can pass it to yourself.
Link Intersection
Another way to find places you can build links is by using a link intersection tool. These find sites that link to “competitor a” and “competitor b” but not to you. Theoretically, if they link to both of your competitors, they should be willing to link to you. Moz, Ahrefs, LunaMetrics and others have link intersection tools that work quite well.
Now that we have a few basic links flowing, we’re going to work on some strategies that will send continual links and press, eventually getting to a point where we don’t have to build any more links.
Your First Drip of Traffic — Becoming an Authority Site
Awesome — you have a site that converts well, your SEO is in place, ready for you to drive traffic. Now what?
As you've probably learned at this point, a site that converts very well but has no traffic flowing to it still converts zero traffic.
We’re going to fix that.
This section takes a lot of time and effort, and in the beginning you'll likely wonder if you're accomplishing anything at all. Remember that class in college that was so difficult most people gave up, effectively weeding out the people who weren't ready to major in the subject? This is that class.
As you get further along and have an idea of how to get a good response, I'd recommend starting to sort by reach, ramping up the number of keywords you're searching for, and possibly even -gasp- upgrading to the paid version of BuzzSumo.