Note: All the topics are explained in detail with extra content which may not be in the syllabus/textbook. Use the PPTs for extra knowledge/reference.
For studying, use the textbook.
Part 4 - PPT
Keywords
In terms of search engines, a keyword is any search term entered into Google (or another search engine) that produces a results page where websites are listed.
Keywords are ideas and topics that define what your content is about. In terms of SEO, they're the words and phrases that searchers enter into search engines, also called "search queries."
<meta name="keywords">
It is used to specify a list of keywords and phrases that match the content of your web page.
Example:
<meta name="keywords" content="compression springs, extension springs, drawbar springs">
Keywords can be broad and far-reaching (these are usually called "head" or short-tail keywords), or they can be a more specific combination of several terms, often called "long-tail" keywords.
Example – Short-Tail Keywords
Marketing
Marketing Management
Oppo mobile
Example – Long-Tail Keywords
Marketing strategies for small business
Oppo mobile with 20MP camera
Usage of Keywords
You probably already know that you should add keywords to pages that you want to rank.
But where you use your keywords is just as important as how many times you use them.
- In the web page URL
- In the first 100 words of your web page content
- In your page's title tag, at least once (see the sketch below)
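As an illustrative sketch (the domain example.com, the page title, and the copy are hypothetical), here is how the keyword "compression springs" could be placed in all three locations:

URL: https://www.example.com/compression-springs

<head>
  <title>Compression Springs | Example Spring Co.</title>
</head>
<body>
  <h1>Compression Springs</h1>
  <!-- The keyword appears within the first 100 words of the body content -->
  <p>Our compression springs are manufactured to order in a range of sizes and materials.</p>
</body>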
What is SEO Audit?
An SEO audit helps find out what could be done to improve a website's ranking on search engines, so that consumers can find the site with greater ease.
SEO audits take a deep dive into your site to evaluate numerous factors that impact your ability to rank in search engine results pages.
These elements include your website’s on- and off-page SEO, as well as technical SEO performance.
Why SEO Audit is important?
Things change very quickly in the SEO industry, and what works today may not work six months from now. Google reportedly makes thousands of updates to its ranking algorithm per year, and an SEO audit will help you stay in sync.
It is necessary to perform regular SEO audits (at least twice a year) to ensure that your website is up to date with the latest developments.
SEOptimer is a free SEO audit tool that performs a detailed SEO analysis.
It provides clear, actionable recommendations for improving your online presence.
Popular SEO Audit Tools
Alexa Site Audit
Check My Links
DareBoost
DeepCrawl
Google Search Console (formerly Google Webmaster Tools)
HubSpot's Marketing Grader
Moz Crawl Test
MySiteAuditor
SE Ranking Website Audit
Ways to Optimise Website
i) HTML Headers
ii) Body Content
iii) Links
iv) Crawling and Indexing
v) Robots.txt file
vi) Device Rendering
vii) Legible font size
viii) Tap target sizing (several of these are illustrated in the sketch below)
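As a minimal sketch of several of these optimisations in one page (the content and the class name "button" are hypothetical), note the single h1 with h2 subsections for HTML headers, the viewport meta tag for device rendering, a base font size of 16px as a commonly used legibility baseline, and minimum button dimensions of 48x48 CSS pixels, a size commonly recommended (for example, in Google's Lighthouse audits) for tap targets:

<!DOCTYPE html>
<html>
<head>
  <!-- Device rendering: tell mobile browsers to use the device width -->
  <meta name="viewport" content="width=device-width, initial-scale=1">
  <title>Compression Springs | Example Spring Co.</title>
  <style>
    body { font-size: 16px; }  /* legible font size */
    a.button { display: inline-block; min-width: 48px; min-height: 48px; }  /* tap target sizing */
  </style>
</head>
<body>
  <!-- HTML headers: one h1, with h2s for subsections -->
  <h1>Compression Springs</h1>
  <h2>Materials</h2>
  <p>Body content goes here.</p>
  <!-- Links: a plain crawlable href, which also serves as a tap target -->
  <a class="button" href="/contact">Contact us</a>
</body>
</html>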
What is Googlebot?
Googlebot is the name of Google's web crawler, which constantly scans documents on the World Wide Web and makes them available for Google's index and Google Search.
It uses an automated process to continuously search for new content on the web, in much the same way as a regular web browser: the bot sends a request to the responsible web server, which then responds accordingly.
After that, it downloads the single web page reachable at that URL and stores it in Google's index. In this way, Google's crawler indexes vast portions of the internet, using distributed and scalable resources to crawl thousands of pages simultaneously.
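For reference, Googlebot identifies itself to web servers with a User-Agent request header; the commonly documented desktop string looks like this:

Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)

Webmasters can use the "Googlebot" token to address this crawler specifically in a robots.txt file, as described in the next topic.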
What is robots.txt file?
Robots.txt is a text file webmasters create to instruct web robots (typically search engine robots) how to crawl pages on their website.
The robots.txt file is part of the robots exclusion protocol (REP), a group of web standards that regulate how robots crawl the web, access and index content, and serve that content up to users.
The REP also includes directives like meta robots, as well as page-, subdirectory-, or site-wide instructions for how search engines should treat links (such as “follow” or “nofollow”).
In practice, robots.txt files indicate whether certain user agents (web-crawling software) can or cannot crawl parts of a website. These crawl instructions are specified by “disallowing” or “allowing” the behavior of certain (or all) user agents.
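As a brief sketch (the sitemap URL and the /private/ directory are hypothetical), a robots.txt file placed at the root of a site might look like this. It disallows Googlebot from one directory, allows every other user agent to crawl everything, and points crawlers at the sitemap:

User-agent: Googlebot
Disallow: /private/

User-agent: *
Allow: /

Sitemap: https://www.example.com/sitemap.xml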