IT Online Exam

Tuesday 25 August 2020

2. Digital Marketing - Part 4 - SEO - Keywords - SEO Audit - Ways to optimize web pages

Note: All the topics are explained in detail with extra content which may not be in the syllabus/textbook. Use the PPTs for extra knowledge/reference.

For studying use the Textbook

Part 4 - PPT



Keywords
In terms of search engines, a keyword is any search term entered on Google (or another search engine) that has a results page where websites are listed. 

Keywords are ideas and topics that define what your content is about. In terms of SEO, they're the words and phrases that searchers enter into search engines, also called "search queries." 


<meta name="keywords">
It is used to specify a list of keywords and phrases that match the content of your web page.

Example:
<meta name="keywords" content="compression springs, extension springs, drawbar springs">

Keywords can be broad and far-reaching (these are usually called "head keywords" or "short-tail keywords"), or they can be a more specific combination of several terms — these are often called "long-tail keywords."

Example – Short Tail Keyword
Marketing
Marketing Management
Oppo mobile

Example – Long Tail Keyword
Marketing strategies for small business
Oppo mobile with 20MP camera

Usage of Keywords
You probably already know that you should add keywords to pages that you want to rank.

But where you use your keywords is just as important as how many times you use them. Place them (a sketch follows the list below):
  • In the web page URL
  • In the first 100 words of your web page content
  • At least once in your page's title tag
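Example (a hypothetical sketch; the domain, URL and wording are made up; the keyword "compression springs" is reused from the meta tag example above):

URL: https://www.example.com/compression-springs/
<title>Compression Springs | Sizes, Materials and Prices</title>
<p>Our compression springs are manufactured in a wide range of sizes and materials ...</p>  (keyword within the first 100 words of the content)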

What is SEO Audit?
An SEO audit helps find out what could be done to improve a website's ranking on search engines, so that consumers can find the website with greater ease.

SEO audits take a deep dive into your site to evaluate numerous factors that impact your ability to rank in search engine results pages.
These elements include your website’s on- and off-page SEO, as well as technical SEO performance. 

Why SEO Audit is important?
Things change very quickly in the SEO industry, and what is working today may not work six months from now. Google reportedly makes thousands of updates to its ranking algorithm per year, and an SEO audit will help you stay in sync.
It is necessary to perform regular SEO audits (at least 2 times per year) to ensure that your website is up-to-date with the latest developments.


SEOptimer is a free SEO audit tool that performs a detailed SEO analysis.
It provides clear and actionable recommendations that can be followed to improve your online presence.

Popular SEO Audit Tools
Alexa Site Audit
Check My Links
DareBoost
DeepCrawl
Google Webmaster / Search Console
Hubspot’s Marketing Grader
Moz Crawl Test
MySiteAuditor
SE Ranking Website Audit

Ways to Optimise Website
i) HTML Headers
ii) Body Content
iii) Links
iv) Crawling and Indexing
v) Robots.txt file
vi) Device Rendering
vii) Legible font size
viii) Tap Target sizing







What is Googlebot?
Googlebot is the name of Google's web crawler, which constantly scans documents from the world wide web and makes them available for Google’s index and Google Search. 

It uses an automated process to continuously search for new content on the world wide web in the same way as a regular web browser: The bot sends a request to the responsible web server, which then responds accordingly. 
After that, it downloads each web page that can be reached at a unique URL and stores it in Google's index. In this way, Google's crawler indexes the web, using distributed and scalable resources to crawl thousands of pages simultaneously.

What is robots.txt file?
Robots.txt is a text file webmasters create to instruct web robots (typically search engine robots) how to crawl pages on their website. 
The robots.txt file is part of the robots exclusion protocol (REP), a group of web standards that regulate how robots crawl the web, access and index content, and serve that content up to users. 
The REP also includes directives like meta robots, as well as page-, subdirectory-, or site-wide instructions for how search engines should treat links (such as “follow” or “nofollow”).
In practice, robots.txt files indicate whether certain user agents (web-crawling software) can or cannot crawl parts of a website. These crawl instructions are specified by “disallowing” or “allowing” the behavior of certain (or all) user agents.
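Example (a minimal sketch of a robots.txt file placed at the root of a site; the folder names are made up for illustration):

User-agent: *
Disallow: /admin/
Allow: /admin/help/
Sitemap: https://www.example.com/sitemap.xml

Here "User-agent: *" addresses all crawlers, the Disallow/Allow lines tell them which folders they may or may not crawl, and the Sitemap line points them to the site's XML sitemap. The page-level meta robots directive mentioned above looks like <meta name="robots" content="noindex, nofollow"> and tells crawlers not to index that page or follow its links.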





Tuesday 4 August 2020

2. Digital Marketing - Part 3 - SEO- On Page-Off Page-Technical SEO

Note: All the topics are explained in detail with extra content which may not be in the syllabus/textbook. Use the PPTs for extra knowledge/reference.

For studying use the Textbook

Part 3 - PPT


SEO Types
There are a number of ways to approach SEO to generate traffic to your website:
  • On-page SEO
  • Off-page SEO
  • Technical SEO

On-page SEO
On-page SEO is the practice of optimizing individual web pages in order to rank higher in organic search and earn more relevant traffic in search engines.
On-page refers to both the content and HTML source code of a page that can be optimized.

Content pages are the meat of websites and are almost always the reason visitors come to a site. 
Ideal content pages should be very specific to a given topic—usually a product or an object—and be hyper-relevant.

The purpose of the given web page should be directly stated in all of the following areas:
  • Title tag
  • URL
  • Content of page
  • Image alt text
  • Headlines – Header Tags

How do you keep users on your site longer?
  • Use lots of bullets and subheadings.
  • When your content is easy to read, people will spend more time on your site. (It also stops them from hitting their “back” button)
  • As it turns out, bullets and subheadings make your content MUCH easier to read

Keywords
In terms of search engines, a keyword is any search term entered on Google (or another search engine) that has a results page where websites are listed. 

Keywords are ideas and topics that define what your content is about. In terms of SEO, they're the words and phrases that searchers enter into search engines, also called "search queries." 


An Ideally Optimized Web Page should do all of the following:
  • Be hyper-relevant to a specific topic (usually a product or single object)
  • Include subject/keywords in title tag
  • Include subject/keywords in URL
  • Include subject/keywords in image alt text
  • Specify subject several times throughout text content
  • Provide unique content about a given subject
  • Linking (see the sketch below):
      Link back to its category page
      Link back to its subcategory page (if applicable)
      Link back to its homepage (normally accomplished with an image link showing the website logo on the top left of a page)
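Example (a hypothetical sketch of the linking described above; the store name, URLs and file names are made up):

<a href="https://www.example.com/"><img src="logo.png" alt="Example Store logo"></a>  (logo image linking back to the homepage)
<a href="https://www.example.com/springs/">Springs</a> &gt;
<a href="https://www.example.com/springs/compression/">Compression Springs</a>  (links back to the category and subcategory pages)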

Use descriptive alt tags for images (and optimize your image file names)

Example : 
<img src="cute-cat.png" alt="A picture of a super cute cat.">


Off-page SEO
  • Off-Page SEO refers to all of the activities that you and others do away from your website to raise the ranking of a page with search engines.
  • On-page search engine optimization happens within the site, while off-page SEO happens outside the site.
  • Common off-page SEO actions include building backlinks, encouraging branded searches, and increasing engagement and shares on social media.
  • If you write a guest post for another blog or leave a comment, you’re doing off-page site promotion.

Off-site SEO is about drawing attention to your website through outbound activities unrelated to your page content. Hence, promote your business through videos, blogging, podcasts, infographics, etc.


It is extremely important to note that Off-Page SEO accounts for the majority of your ability to rank highly for a particular keyword. Therefore, it is something that should not be overlooked in your online marketing strategy. 

Benefits Of Doing Off-page SEO 
The benefits of implementing Off page SEO are:
  • Increase in your page ranking
  • Growth of reach
  • Improved online visibility
  • Better placement in search engine results pages (SERPs)
  • Improves conversion rates


Technical SEO
  • Technical SEO is the process of ensuring that a website meets the technical requirements of modern search engines with the goal of improved organic rankings. 
  • Making a website faster, easier to crawl and understandable for search engines are the pillars of technical optimization. 

  • Technical SEO is a broad and exciting field, covering sitemaps, meta tags, JavaScript indexing, linking, keyword research, and more.
  • Technical SEO is part of on-page SEO, which focuses on improving elements on your website to get higher rankings. 

1. Speed of loading
Web pages need to load fast. 
People are impatient and don’t want to wait for a page to open. 
Research shows that 53% of mobile website visitors will leave if a webpage doesn't open within three seconds.
Google knows slow web pages offer a less than optimal experience. Therefore, it prefers web pages that load faster.
So, a slow web page also ends up further down the search results than its faster equivalent, resulting in even less traffic.
2. Ensure your site is mobile-friendly
A ‘responsive’ website design adjusts itself automatically so that it can be navigated and read easily on any device.

Google is clear about the fact that having a responsive site is considered a very significant ranking signal by its algorithms. So it makes sense to ensure that your website is fully responsive and will display in the best format possible for mobile, tablet or desktop users.
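A standard building block of a responsive design is the viewport meta tag in the page's <head>, which tells mobile browsers to scale the page to the device width.

Example:
<meta name="viewport" content="width=device-width, initial-scale=1">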

3. Fix duplicate content issues.
Duplicate content can be confusing for users (and indeed for search engine algorithms); it can also be used to try to manipulate search rankings or win more traffic.

As a result, search engines aren’t keen on it, and Google and Bing advise webmasters to fix any duplicate content issues they find.
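One common fix is a canonical link tag in the <head> of each duplicate page, telling search engines which version is the preferred one to index (a sketch; the URL is made up):

<link rel="canonical" href="https://www.example.com/compression-springs/">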

4. Create an XML sitemap.
An XML sitemap is a file that helps search engines to understand your website whilst crawling it – you can think of it as being like a ‘search roadmap’ of sorts, telling search engines exactly where each page is.

It also contains useful information about each page on your site (see the sketch below), including:

  • when the page was last modified;
  • what priority it has on your site;
  • how frequently it is updated.
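Example (a minimal sketch of a sitemap with a single entry; the URL and values are made up; loc, lastmod, changefreq and priority carry the information listed above):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/compression-springs/</loc>
    <lastmod>2020-08-01</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>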

5. Add structured data markup to your website.
Structured data markup is code which you add to your website to help search engines better understand the content on it. This data can help search engines index your site more effectively and provide more relevant results.
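Example (a minimal sketch using schema.org Product markup in JSON-LD format; the product details are made up; other formats such as Microdata can also be used):

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Compression Spring",
  "description": "Heavy-duty compression spring, 20 mm outer diameter.",
  "brand": { "@type": "Brand", "name": "Example Springs" }
}
</script>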

6. It doesn’t have (many) dead links
We’ve discussed that slow websites are frustrating. What might be even more annoying for visitors than a slow page is landing on a page that doesn’t exist at all. If a link leads to a non-existent page on your site, people will encounter a 404 error page. There goes your carefully crafted user experience!

What’s more, search engines don’t like to find these error pages either. And, they tend to find even more dead links than visitors encounter because they follow every link they bump into, even if it’s hidden.

Unfortunately, most sites have (at least) some dead links, because a website is a continuous work in progress: people make things and break things. 

7. It’s secure (Use of HTTPS - SSL certificate)
A technically optimized website is a secure website. Making your website safe for users and guaranteeing their privacy is a basic requirement nowadays.

HTTPS makes sure that no-one can intercept the data that’s sent over between the browser and the site. So, for instance, if people log in to your site, their credentials are safe. You’ll need a so-called SSL certificate to implement HTTPS on your site. Google acknowledges the importance of security and therefore made HTTPS a ranking signal: secure websites rank higher than unsafe equivalents.
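Once the SSL certificate is installed, it is common practice to redirect all HTTP traffic to the HTTPS version of the site. A sketch for an Apache server using an .htaccess file (assuming mod_rewrite is enabled; other web servers have their own equivalents):

RewriteEngine On
# Redirect any request that is not already HTTPS to the HTTPS version (301 = permanent redirect)
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]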

2. Digital Marketing - Part 2 - Search Engine-Paid Search-Organic Search-SEO-White Hat-Black Hat

Note: All the topics are explained in detail with extra content which may not be in the syllabus/textbook. Use the PPTs for extra knowledge/reference.

For studying use the Textbook

Part 2 - PPT

What is Search Engine?
A web search engine or Internet search engine is a software system that is designed to carry out web search, which means to search the World Wide Web in a systematic way for particular information specified in a textual web search query. 

Search engines are answer machines. 

They exist to discover, understand, and organize the internet's content in order to offer the most relevant results to the questions searchers are asking.

The search results are presented in a line of results often referred to as Search Engine Results Pages (SERPs)

The results are a mix of links to web pages, images, videos, info-graphics, articles, research papers, and other types of files.


Displaying of Websites in SERP
In order to show up in search results, your website content needs to first be visible to search engines. 

It's arguably the most important piece of the SEO puzzle: If your site can't be found, there's no way you'll ever show up in the SERPs (Search Engine Results Page).



How Search Engine works?
Search engines have three primary functions:

Crawl: Search the Internet for content, looking over the code/content for each URL they find.


Crawling is the discovery process in which search engines send out a team of programs (known as crawlers or spiders or bots) to find new and updated content.  
They scan a website and collect details about each page: titles, images, keywords, other linked pages, etc.
Content can vary — it could be a webpage, an image, a video, a PDF, etc. — but regardless of the format, content is discovered by links.
Googlebot starts out by fetching a few web pages, and then follows the links on those webpages to find new URLs. By hopping along this path of links, the crawler is able to find new content and add it to their index 



Index: Store and organize the content found during the crawling process. Once a page is in the index, it’s in the running to be displayed as a result to relevant queries.


Search engines process and store the information about the websites they find in an index: a huge database of all the content they’ve discovered and deem good enough to serve up to searchers.


When someone performs a search, search engines look through their index for highly relevant content and then order that content in the hope of solving the searcher's query.



Ranking: Provide the pieces of content that will best answer a searcher's query, which means that results are ordered by most relevant to least relevant. This ordering of search results by relevance is known as ranking. 




Ways to Rank a Website
There are two ways to rank a website:

One can pay to have the website listed at the top (paid search results).

Or one can rank the website without paying, by using the SEO process (it appears in the organic search results).

Organic Search Results
Organic search results are the listing on a search engine results page (SERP) that appear because of factors such as relevance to the search term and valid search engine optimization efforts rather than because of search engine marketing or trickery.

SEO
Search Engine Optimization is the process of boosting the content and technical
set-up of a website so that it appears at the top of search engine results for specific keywords.

SEO Techniques
Techniques and strategies used to get higher search rankings for websites can be classified into two:
  • White Hat SEO
  • Black Hat SEO

White Hat SEO
White hat SEO involves looking for ways to improve user experience ethically and genuinely.
It ensures that web page content should have been created for the users and not just for the search engines.

White hat SEO is the use of tactics and strategies that follow all search engine guidelines and policies.

White hat SEO is the complete opposite of Black Hat SEO. 

Any practice that aims to improve a website’s search rankings while maintaining the site's integrity and staying in line with search engine guidelines is considered an example of white hat SEO.

Some specific white hat strategies are:
  • Content that’s written for the users
  • Fast site speed
  • Mobile-friendliness
  • Easy site navigation
  • Proper and natural use of keywords inside the content and meta tags

White hat SEO focuses mainly on content – basically on how user-friendly, informative, and useful your content is. 

When your content is good, people will talk about it in different social media channels and that will lead to a massive increase in traffic. 

It will also help your rankings, since content that satisfies these three factors (user-friendly, informative, and useful) is ranked higher in the search engines.


Black Hat SEO
‘Black Hat’ SEO deals with a bunch of unethical practices that are performed to elevate a website’s ranking on a search engine by violating the search engine guidelines.

It is also referred to as unethical SEO, as it relies on shady tricks and techniques to gain rankings and traffic.

(Not in Syllabus - For Knowledge)
i. Keyword Stuffing
Keyword repetition is a very common technique used in ‘black hat SEO’, where the same keyword is used many times on a page in an attempt to boost its rankings. It ends up degrading the website and creating a very bad experience for the user.

ii. Cloaking
It means providing one version of content to the user and a different version to the search engine for the same search.
For example, building a website that sells suspense novels but putting irrelevant data about suspense movies in its meta description.

iii. Irrelevant URLs as backlinks
Using irrelevant URLs as backlinks is another popular technique. Here, URLs are used to redirect the user to other, irrelevant websites. These backlinks are mostly placed on websites with high domain authority.
For example: Quora, CarWale, etc.

iv. Spam Comment
Commenting on a blog with links in order to get a follow backlink is also a type of ‘black hat’ technique used to improve domain authority. It creates spam for other users.
Getting spammy comment links to your site, i.e. creating many backlinks from other websites to your website in a short period, may give a temporary boost in SEO and traffic, but in the long run it is not going to work.


v. Shady Redirects
Redirects are normally used only when you’re moving domains or your website is down for some reason. Used that way, they’re a welcome practice that supports user engagement.
However, Black Hat SEO uses false links to redirect users to a different website. Think of every time you clicked a link to a page, only to have another tab pop open, taking you to a gambling website.
That is called shady redirecting.

vi. Private Blog Networks
A PBN is what it says: a network of blogs run by a single person, preferably on different domains. Their sole purpose is to create a huge amount of links for the “parent” site and help boost its rankings. The sites don’t need to be updated frequently, nor feature enticing content. All they need are 3rd party links. 


Conclusion – Black Hat vs White Hat
“Truly, the one and only result of Black Hat SEO is either a downturn in your business or its complete destruction.”
So note this down: if you are thinking of rebuilding a brand or uplifting your business, Black Hat SEO is never the solution!
On the contrary, it can only affect your brand negatively.
It disrupts the user experience, damages your reputation and your presence in search, and, most importantly, can get you removed completely from search engines through penalties and other punishments.
Therefore, always keep in mind that Black Hat SEO is never the way to go. It may seem to work in the short term, but it is sure to bring consequences in the future.
Search engines have also become a lot more sophisticated and are far better at spotting manipulative SEO tactics. If you want your efforts to pay off, and want to sleep without ever seeing a dip in your brand due to nasty approaches, never do Black Hat SEO.

Focus on your brand, create quality content and stop playing games with search engines, and the internet will take notice of you. That is a far better path: it helps you survive in the long term and keeps your website from incurring a penalty that could remove your visibility from search engines.