SEO AGENCY

The SEO company behind a 700% increase in revenue for a hospitality business in 7 months.

That is $2.8 million in growth for a company that was doing less than $420K in revenue.

Most SEO agencies have it wrong. They guarantee you a #1 ranking on Google in 90 days for specific keywords. Why pay for something that does not convert into dollars?

We are Mojo Dojo, a top-rated global SEO agency.

99% of websites lose. Almost all of them make the same fatal error.

They focus on traffic, not revenue.

A #1 page ranking on Google is great for a trophy keyword. But the three-to-four-word keyword combinations your SEO agency ranks you for have no search volume.

That means no one is searching for these trophy keywords.

We bet your SEO agency has offered you the 90-day guarantee. Here is how the pitch goes: “We will work for free if you are not on the first page of Google in 90 days.” Sounds familiar?

Only for you to sign on the dotted line, pay the deposit, and then be offered trophy keywords that no one is searching for.

Maybe your agency is promising you a “guaranteed” way of ranking #1 on Google? In 90 days, no more. They obviously know that “one weird trick”.

We have been in business for over 15 years globally and have heard this pitch repeated in many different countries. The results are almost always the same: painful.

But that’s not all…

❌ You are wondering when you will get a response to the email you sent a week ago. Your account manager is dodging your calls and there is no one else to talk to who “knows your account”.

❌ There is a complete lack of transparency, and you suspect your agency believes that educating you will make you leave or hire in-house talent.

❌ Progress is slow and communication is even slower.

❌ Unrealistic expectations were set to win your business. You find them constantly over-promising and under-delivering.

❌ The impact of their work will be unknown until 12–18 months later.

❌ Their proposal is full of ifs, buts, and disclaimers.

If only there were a way to sort the best from the rest…

What you may need is a framework that focuses on organic SEO rankings without taking any shortcuts.

We don’t care about rankings, traffic or any other metric your agency is trying to skew to make it look promising.

We care about revenue.

We’ll help you launch and scale the digital marketing channels best suited to your industry. Who are we to make such tall promises?

Contextual SEO raised one company’s revenue by $2.8 million in 7 months, on the back of a 26% increase in rankings for high-intent keywords.

If you pick high-intent keywords to rank for, even a lower traffic volume can substantially increase your revenue.

These high-intent keywords require hours of research to uncover. SEO services built on a cookie-cutter approach will rarely have the time allocated for in-depth keyword research.

A very prominent construction company doubled its organic traffic.

Not only did they nearly double their traffic, they also increased the number of inquiries by 4 times.

This real estate company sells houses in the $3–4 million range.

This is just the result of a single campaign for a part of their website.

Another supplies company increased their revenue by 3X and increased their organic (SEO) traffic by 136%.

Their eCommerce store had never done more than $26K per month.

Our SEO campaign netted them a cool $102K per month.

An energy company increased their organic traffic by 77% and increased their inquiries by 3X from our SEO services.

Another real estate company increased their organic traffic by 58% and inquiries by 218%.

SEO is an incredibly powerful channel that can benefit all types of businesses, regardless of their niche or industry.

Whether you’re a small startup or an established corporation, implementing effective SEO strategies can lead to an increase in organic traffic and a boost in inquiries/sales.

For instance, consider the case of an aviation technology company that was struggling to attract new customers and grow their business.

By deploying a comprehensive SEO campaign, they were able to achieve remarkable results – in just a short period of time, their organic traffic increased by an incredible 45%, while their inquiries shot up by a staggering 105%.

Another specialist-item company that competes with the likes of Bunnings increased their traffic by 75% and revenue by 500%.

Another parts supplier & hardware company experienced tremendous growth with our SEO services, increasing their revenue by a whopping 6x and significantly ramping up their transactions, boosting them by a remarkable 7x.

Another prominent art gallery in Melbourne increased their organic traffic by 273%.

We are an SEO agency that runs scientifically designed A/B tests on your SEO, making your campaign statistically more likely to succeed than with your current agency.

Has your SEO agency explained to you what it is that they do?

Are they giving you a plan of action for the next 90–120 days?

Is their website loaded with disclaimers ?

Using simple mathematical principles, let us show you how much revenue impact we can make.

We will give you a clear, detailed, digestible SEO strategy that people have paid millions of dollars to learn.

We don’t believe in guaranteeing you ranking for a set number of keywords. Instead, we will show you how to focus on 5–10 big organic SEO wins that are worth hundreds of thousands of dollars.

Founded in 2009, Mojo Dojo has a long history of running experiments to test numerous SEO hypotheses.

Our research is so extensively cited that it has made it into the academic coursework.

When top companies change their websites, they usually test them using A/B software. But have you heard of companies applying the same methodologies to SEO?

We are perhaps the only company we know of with a scientific model for testing SEO hypotheses, including experimenting with various on-page factors using A/B tests.

SEO Agency FAQs

Search Engine Optimisation is an iterative process. It begins with research: business research, competitor analysis, a current-state assessment, and keyword research. Often, the real questions are about usability: what makes it hard for search engines and your users to navigate your site and find the right information?

A user-friendly website sustains and nurtures users. Plenty of factors affect your search engine rankings, and one of the most critical is user experience. By keeping your users happy, you keep your market engaged. Our SEO professionals will conduct a site audit to identify areas of improvement. A website audit includes an initial review of the site design, web usability, and site architecture.

Search Engine Optimisation is still about your customers. It is not about headless browsers and Document Object Model (DOM) elements. Using dubious methods to trick algorithms is also not a long-term sustainable strategy. That’s why our product delivers steady, long-term organic traffic growth.

We at MojoDojo organise our SEO marketing services and suggested strategy execution in order of impact. We often collaborate closely with the client to implement the proposed SEO marketing strategy, or get our in-house team to help with implementation. Our agency prides itself on working with the best people around the world. The people working on your business aren’t your usual account managers: they are successful entrepreneurs, renowned experts, employees from top technology companies, and published authors.

Effective SEO is about taking on your competitors where they are weak. You can figure out all the techniques in a month, but applying them in the right places is best left to people who manage it day in and day out.

Knowing your business, understanding everything about your profitable products, and learning about your weaker corners and constraints helps adjust your strategy for maximum ROI. Our SEO services also cater to improving user experience via CRO and improving speed for your website by utilising cloud technology.

When you work with us, your business website will gain from the use of our in-house toolset built using machine-learning by the industry’s top SEO professionals and developers. Our proprietary machine-learning Search Engine Optimisation tools built for internal use help work on best opportunities focusing on immediate wins and delivering the highest value in the shortest time. Our SEO packages are built for businesses with a long-term growth plan and products or services to match.

Yesterday!

But here are some more specific reasons:

  • When launching a new website – it is critical that old URLs and data are migrated so that you do not lose any rankings.
  • During a site redevelopment – don’t let designers and developers with little understanding of SEO destroy the value you may have built over years.
  • SEO growth – you obviously want more leads and sales and have been struggling.
  • Before you commence PPC – most people will tell you that SEO and PPC are not related, but they never mention quality scores and how PPC quality scores depend on landing page relevance.

Yes.

SEO (Search Engine Optimization) can be very effective for increasing website traffic and visibility.

Here’s why:

Higher Ranking: The core function of SEO is to optimize a website to rank higher in search engine results pages (SERPs) for relevant keywords. This increases the chances of people finding your site when they search for those terms. The higher you rank, the more likely you are to get clicks.

Organic Traffic: SEO drives organic traffic, which means people come to your site naturally through search engines, rather than through paid advertising. This can be a more sustainable and cost-effective way to attract visitors.

Brand Awareness: Higher ranking and organic traffic can lead to increased brand awareness. When people see your website appearing for relevant searches, they become familiar with your brand and what you offer.

A team of Mojo Dojo SEO marketing experts drives our strategy services. This includes a Director of SEO, Content Marketing Manager, and Content Marketing Specialist.

They’re joined by an Account Manager to ensure a cohesive communication approach. To ensure every project aligns with our overall strategy, our CEO provides strategic oversight at the outset.

Technical SEO FAQs

On-page optimisation focuses on what we can do on your website to improve visibility to search engines. From simple title tag checks to site speed optimisation, we optimise your pages to outrank your competition. The technical side of the SEO audit will also check for Schema implementation and other metadata that Google recognises and indexes, such as:

  • Content type, taxonomies, nodes revision
  • Meta tag analysis and review
  • Internal & external link structure analysis
  • Content analysis
  • Duplicate content
  • Review of URLs, keywords, and images
  • On-page performance monitoring
  • Page fetch times
  • External links
  • Thin pages
  • Content/HTML ratio
  • HTTP response codes
  • Malformed, excluded, and restricted URLs
  • Crawl depth
  • Meta, Header, Robots Noindex
  • Google penalties
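As an illustration of the Schema markup the audit checks for, here is a minimal JSON-LD sketch (the organisation name and URLs are placeholders, not a real client’s):

```html
<!-- Minimal Schema.org Organization markup in JSON-LD; all values are placeholders -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Example Agency",
  "url": "https://www.example.com/",
  "logo": "https://www.example.com/logo.png"
}
</script>
```

Google reads blocks like this to understand entities on the page and to power rich results.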

In a tweet, Google’s John Mueller said that domain age does nothing for SEO.


It may be worth noting that domain age does show up in Google search results.

Click the three-dot menu next to any search result and you will see the domain’s age.


Domain Sandboxing

Various webmasters have reported their domains being sandboxed in the first few months after launch.

Sandboxing is when Google restricts new sites from ranking well in search results.

This may happen despite your having a very good link profile.

Here is Rand Fishkin, the founder of Moz, on domain sandboxing.

Google denies the existence of the sandbox.

Keywords in the Domain Name

Having a keyword in your domain name (an exact-match domain) used to be a popular strategy back in 2010.

This was widely abused by webmasters, with hundreds of thousands of domains registered by domain squatters and PBN builders.

It used to provide a relevancy signal, but having a keyword in the domain name is no longer a ranking factor.

Domain History

In a video posted on YouTube by the Google Search Central team, Matt Cutts, the former head of webspam, said that a penalty on an old domain may carry over to the new owner.

This is in light of abuse by PBN builders, who simply 301-redirect old domains to pass authority to new ones.

 

Just as the penalty would carry over to the new owner, in our tests with hundreds of expired domains, the positive authority also carries through.

It is always a good idea to look at a domain’s past ownership, the content previously hosted on it, its registration history, and any Wayback Machine results pointing to webspam.

So how do Google and other search engines decide what title and description to show for each page?

Google says that it uses a number of different sources to determine the best title for the query, but it also provides you, the webmaster, with a way to indicate your preference.

Usually, if you write a clear & concise title, Google tends to respect your title.

 

 PRO TIP

The best practices for indicating your title preference are:

  • Use the <title> element in the <head> to indicate your title preference in the document structure.
  • Write a title that is descriptive and concise. Avoid overdoing the titles by stuffing as many keywords as possible. Keyword stuffing also generally leads to a poor CTR.
  • If you have a JavaScript-based site, render the title as high in the <head> as possible. Even better, avoid JavaScript-based sites and opt for static sites.
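Putting the practices above together, a title preference might be declared like this (the wording is illustrative only):

```html
<!-- Descriptive, concise title declared directly in the <head> -->
<head>
  <title>SEO Services for Hospitality Businesses | Example Agency</title>
</head>
```

A title like this names the topic and the brand without stuffing keywords, so Google is more likely to display it as-is.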

 

Google considers some of the following when choosing which title to show:

  • <title>
  • H1 of your document
  • H2-H6 if the primary H1 is less relevant
  • Other content in the body of the page
  • Other text contained in the page
  • Anchor text on the page
  • Anchor text of links pointing to the page

Google starts rewriting titles when they are a) over-optimised, b) stuffed with keywords, or c) repeated across the site, or d) when a rewrite would better match the user’s query and improve CTR.

Google only displays 50 to 60 characters of a page’s title tag.

For maximum visibility, it’s recommended that you follow this guidance by keeping your pages’ title tags under 60 characters.

Like the title, the snippet (description) also influences your CTR.

It is shown in the summary part of the search results and users often use that to make a decision on whether the page will answer their query.

Meta descriptions aren’t used as a direct ranking signal. Having a great meta description leads to better CTR on SERPs which is indirectly a ranking factor.

Google generally uses the meta description tag on the page to determine the most appropriate snippet to show on the result pages.

Google will also consider structured data of the page to determine the best fit (more on this later).

However, it will show a different snippet when the description tag:

  • Is stuffed with keywords.
  • Is repeated across the site.
  • Generalises the concept but doesn’t summarise the page.
  • Is either too short or too long.

The best practices for writing meta descriptions that show in Google’s search snippets are:

  • Create a unique description for each page of the site.
  • Tailor each description to the theme of the page.
  • Use attributes in the description like titles, prices, ingredients and so on.
  • Use Active Voice with Action Oriented Signals.

Google historically displayed no more than 155 characters of a page’s meta description. In December 2017, Google increased snippet length to around 300 characters, but reverted to roughly 155–160 characters in mid-2018.

You can technically create meta descriptions of any length, but Google will truncate what it displays.
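As a sketch of these guidelines, a unique, attribute-rich description within the display limit might look like this (the product and prices are invented for illustration):

```html
<!-- Page-specific description using concrete attributes (price, delivery, range) -->
<meta name="description"
      content="Handmade oak dining tables from $1,499. Free Melbourne delivery. Browse 40+ designs and order a timber sample today.">
```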

Consider writing your meta descriptions like an ad or a tweet.

Essentially, you are looking at writing a highly engaging tweet that could potentially be seen by millions of people and has to have the power to draw them in to learn more about what you are offering.

The core of a meta description centers on good salesmanship, as you are essentially creating an advertisement that is meant to appeal to potential customers.

Therefore, you want your meta description to clearly state your intentions in a way that combines direct marketing with some creative flair.

What is an H1 Tag?

An H1 tag is an element in HTML.

  • The acronym HTML stands for Hypertext Markup Language.
  • The term “tag” is defined as a small piece of code which directs your browser on how to show specific content.

In the HTML language itself, six heading tags are defined: H1, H2, H3, H4, H5 and H6.

Tags are ranked from most important to least important, with h1 being the most important, and h6 being the least important.

This helps the web page creator highlight the most important, and least important, sections of their web page.

To become more familiar with the h1 tag, try the following exercise:

  • Open up any web page of your choosing
  • Take a look at the source code (Chrome on Windows: Ctrl + U; Chrome on Mac: Cmd + Option + U)
  • When you view the source code, you’ll see many different character statements. Don’t be intimidated!
  • Press Ctrl + F to search the source code, and you’ll see a small search dialog box in your window’s upper-right hand corner.
  • Type “h1” then press Enter.
  • At this point, your web browser will highlight the h1 tag within this page’s source code.

Most likely, you will see a detailed snippet of code between the opening and closing h1 tags, but the tag is there nonetheless.

Creating an h1 tag is a simple enough process; however, a good-quality h1 tag will have a large impact on your website.

Why is an H1 Tag Important?

An optimized h1 tag can help with your SEO.

Here are some important facts regarding the importance of an h1 tag:

  • H1 Tags have historically been an important ranking factor in Search Engine Optimization.
  • H1’s normally have a larger font size as they are the primary topic of the page.

Optimizing H1 tags is a very popular tactic to improve rankings for specific keywords.

In fact, around eighty percent of the top search results within the first page of Google searches use an h1 tag.

H1 best practices

Here are some best practices when using an H1

  • Use only one H1 on a page.
  • The topic of your page should be described through your h1 tag.
  • Your H1 should be the title of the page
  • Google uses H1 for search titles so you can use title case on H1 tags.
  • Match your title tag and your H1 if you can
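A sketch of these practices together: one H1 that matches the title, with H2s breaking the topic into sections (headings are illustrative only):

```html
<head>
  <title>Melbourne Beach Weddings | Example Venue</title>
</head>
<body>
  <!-- Single H1 matching the title tag; H2s carry the secondary topics -->
  <h1>Melbourne Beach Weddings</h1>
  <h2>Ceremony Packages</h2>
  <h2>Frequently Asked Questions</h2>
</body>
```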

Use Only one H1 on a page

H1 is the gist of the page. There is a great advantage to having only one clear topic on the page.

In a video, Google’s John Mueller says it’s OK to have multiple H1s on a page.

 

However, he clearly indicates that semantically marking up the page is always a good idea.

The W3C also offers similar advice on the topic, indicating that:

The first element of heading content in an element of sectioning content represents the heading for that section. Subsequent headings of equal or higher rank start new (implied) sections, headings of lower rank start implied subsections that are part of the previous one. In both cases, the element represents the heading of the implied section.

This means that if you have multiple H1s on a page, each H1 starts a separate section.

This would normally not be required unless you write very lengthy content.

Your H1 should be the title of the page

As indicated by Google’s John Mueller, your page benefits from having one clear topic.

H1 is semantically considered to be the start of the document outline.

The H1 represents the main topic of the page.

The title, usually added above the body of the page, also represents the same thing.

It is therefore prudent to have your title be the H1 of the page.

This is also consistent with advice that Google gives to news publishers.

A URL (Uniform Resource Locator), also known as a web address, specifies the location of a page. A URL consists of 5 parts:

  • Scheme
  • Domain name
  • Path to file
  • Parameters
  • Anchor

An example is https://mojodojo.io/services/seo?isebook=no#home

In the URL above:

  • Scheme: https://
  • Domain name: mojodojo.io
  • Path to file: /services/seo
  • Parameters: ?isebook=no
  • Anchor: #home

URLs should be short and concise. A URL alone should be enough to tell a user what the resource is about.

The part of the URL that identifies the page is called the URL slug. A URL slug may have parent pages or categories directly above it.

In the following example, the word “seo” is the URL slug.

mojodojo.io/services/seo/

This URL is also called a friendly URL. This is because it does not have any query parameters or dynamic text.

A friendly URL is also a recipe for higher type-in traffic.

A descriptive URL structure has two benefits:

  • It helps Google understand what the page is about.
  • It helps the visitor know what the page is about.

Furthermore, a user is more likely to click links with a human-readable slug, and a friendly URL slug serves as a check against anchor text manipulation.

The following are some best practices for URL structure:

  • Keep your URLs short
  • Use a primary keyword in the URL slug
  • Avoid keyword stuffing the URL
  • Use a logical parent/child relationship
  • Avoid dynamic URL or query parameters
  • Make your URL available only on one version of the site (www vs non www)
  • Avoid stop words in the URL
  • Use canonicalization
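As a sketch of several of these practices, here is a short, logically structured URL with a canonical tag that consolidates the www/non-www versions onto one preferred URL (the domain is a placeholder):

```html
<!-- Preferred URL: https://www.example.com/services/seo/ -->
<!-- The canonical tag tells search engines which version of the page to index -->
<link rel="canonical" href="https://www.example.com/services/seo/">
```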

All things being equal, short URLs have a higher CTR than longer URLs.

When all other factors are equal, additional semantic value in the URL can win the page a higher relevancy score.

A page can use up to six heading levels, H1 being the most important and H6 the least important.

H2s usually carry secondary topics or break the main topic into multiple branches.

H3s are used to further break a specific H2 down into subtopics.

We use H2 & H3 to insert additional long tail keywords.

John Mueller confirms this in a video.

Click-through rate is the number of times your listing got clicked divided by the number of times it was shown. In SEO terminology, it’s clicks ÷ impressions.

Organic CTR is clicks ÷ Impressions in Google organic listings.

Let’s look at a typical search listing from the search engine result pages (SERPs).

A user on a Google search results page sees:

  • Title – The text in blue.
  • Description/Snippet – 2-3 lines text directly below the title.
  • URL – Immediately above the title.
  • Sitelinks – Enhances the results by adding supplemental links below the description text

Sitelinks only show some of the time; the listing always comprises three properties: the title (blue text), the URL, and the description (2–3 lines of black text).

  • A high CTR is a very good indication of your listings appeal. A lower CTR demands that the listing be improved.
  • You don’t need to make your listings clickbaity to get a higher CTR. Some of the most mundane titles have a high CTR simply because they directly address the user’s search query. However, it is undeniable that the “clickiness” of a title matters in getting the user to the page.

We now arrive at what is essentially a catch-22:

  • In order to get users to click through to your results, you need a catchy title and description that may not be descriptive of the page. This may reduce its relevance to the topic.
  • In order to rank higher on search engines so that users can see you, you need a descriptive title and description. This takes the “clickiness” out of the title and description.

So how do we fix this?

We focus on the document structure of the page to improve its relevance but allow for some artistic freedom in choosing titles and descriptions. We fix any subsequent authority loss of the page by acquiring backlinks.

The body text found in <p> tags is generally the most important text on the page, as it is typically the largest and most specific text on the page.

Google usually values pages with more content higher than those with less content, as they are usually more detailed and interesting.

SEO requires creating content that is valuable to users.

However, there is no one size fits all rule when it comes to the amount of content on a page – some pages fare just as well with 500 words as others do with 1400.

In general, though, studies have shown that sites on the first page of Google results tend to average around 1,400 words of body text, so Google appears to favour pages with more detailed, quality content.

Search engines follow internal links, which guide them to pages they may not have discovered.

They also use the text of that link (anchor text) to determine what that link is about.

Internal links are very useful in improving the authority of pages being linked to.

Both the number and the quality of internal links are used as ranking signals.

Consider the rankings of Wikipedia.

Wikipedia shows up for almost any search term on Google, usually in position 3 or 4.

It also occupies the #1 position for a lot of long-tail keywords.

The number one reason Wikipedia can rank that high is internal links.

Wikipedia is perhaps the most extreme example of internal linking.

Here is a page about SEO on Wikipedia. Note the number of internal links.

The best practices for internal links are:

  • Link internally and link often.
  • Use the right anchor text for linking
  • Style links so they are clearly distinguishable from regular text
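For example, a descriptive internal link might look like this (the path and wording are illustrative):

```html
<!-- Descriptive anchor text tells crawlers what the target page is about -->
<p>Read our guide to <a href="/services/seo/">search engine optimisation services</a>.</p>
```

Anchor text like “search engine optimisation services” passes far more meaning than a generic “click here”.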

Google Ranking Signal FAQs

Latent Semantic Indexing (LSI) was invented and patented in 1990 to help index small databases of documents.

The most common use of a ccTLD is to localize your business to a specific country. If you are a small business serving a specific country, you will benefit from using a ccTLD.

A broken link is usually a sign of an abandoned or neglected page.

It may also be a sign of an outdated page.

According to Google’s Quality Raters’ Guidelines, broken links are one way to assess a page’s quality.

A broken link is a link on your site that no longer works because the page it’s linked to has been moved or deleted.

Not only are broken links bad for user experience, but they can also hurt your SEO.

Broken links can cause a variety of problems for your website, including:
– Reduced traffic: If a user clicks on a broken link, they will be taken to an error page. This increases your bounce rate, which can hurt your SEO.
– Poor user experience: Broken links create a negative user experience, which can lead to users leaving your site without taking any action.
– Lost revenue: If you have an eCommerce website, broken links can prevent users from completing a purchase.

Google Search Console is a free tool that allows you to check for broken links on your website. To use Google Search Console, simply log in and select the “Coverage” report from the left-hand sidebar.

Then, click on the “Error” tab to see a list of all the broken links on your website.

QDF (query deserves freshness) has been a part of Google’s algorithm since 2010.

If you update the content on your website at the start of the new year, you should show a “modified date” timestamp instead of the “published date”.

This serves two purposes

  • It helps search engines understand that the content has been updated.
  • It helps you rank better as now your content has been recently updated (freshness).
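One common way to expose the modified date to search engines is Article structured data; here is a minimal sketch (the headline and dates are placeholders):

```html
<!-- dateModified signals freshness alongside the original datePublished -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Example SEO Guide",
  "datePublished": "2021-03-01",
  "dateModified": "2022-01-04"
}
</script>
```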

SEOs often abuse this.

You can spot this by the year (2022 or 2023) used in the title of a guide.

While Google has smartened up to this abuse, it still works in 2022.

Google does take a diff of the page though.

So if you are going to do this, make sure you also update a portion of the content.

You should also use this if your user is explicitly searching for a year in their query.

Your website should be mobile first.

This means you should design for a mobile phone before you design your desktop version.

Mobile phones have a smaller viewport and can often result in a skimmed version of your main page.

You may be in one of a handful of industries where users use your desktop site more than mobile.

Even so, you will notice that your mobile traffic continues to increase, and it will keep doing so.

Even Google has switched to mobile-first crawling in the past few years.

You can confirm this via Google Search Console.

Check the viewport meta tag
The viewport meta tag tells the browser how to scale a page to fit on a mobile device. Without this tag, your pages will not be responsive and will not look good on mobile devices.
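A typical viewport declaration looks like this:

```html
<!-- Scales the page to the device width; required for responsive layouts -->
<meta name="viewport" content="width=device-width, initial-scale=1">
```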

Check the width of your content
Make sure that your content is not too wide for mobile devices. If your content is too wide, it will require users to scroll horizontally to see all of the information, which is not a good user experience.

Check the font size
Make sure that your font size is large enough to be legible on mobile devices. Small font sizes can be difficult to read on small screens.

Check for touch-friendly elements
Make sure that your buttons and links are large enough to be easily clicked on mobile devices. Small buttons and links can be difficult to click on a touchscreen.

Check your images
Make sure that your images are optimized for mobile devices. Large images can take a long time to load on mobile networks.

Check your videos
Make sure that your videos are optimized for mobile devices. Videos that are not optimized for mobile devices can be slow to load and may not play properly on some devices.

Test, test, test!
The best way to ensure that your site is responsive is to test it on as many different devices as possible. Try testing on different smartphones, tablets, and desktop browsers to get a comprehensive understanding of how your site looks and functions on different screen sizes.

Visual search engine technology has improved.

Media plays a role in enhancing your page and correlates with higher rankings.

Search engine crawlers have come a long way, and while images can now be identified, crawlers still give preference to keywords in the alt attribute.

The text associated with the image helps the crawler relate it to the rest of the content of your website.

What is image optimization?

Image optimization is the process of reducing or compressing your images without sacrificing their quality.

Image optimization is important because it ensures that your pages load faster. Faster page loads lead to better user experience.

The goal of image optimization is to ultimately help your page load faster and rank better.

There are multiple aspects that play a role in image optimization

  1. Image size
  2. Image compression levels
  3. Image aspect ratio, height and width
  4. Image embedding and scaling

Images are usually the heaviest part of any webpage.

Page load time is measured as the total time it takes to load all parts of the webpage.

Put simply: if it’s an illustration, use PNG; if it’s a photo, use JPEG. Output the image at the right size.

Oftentimes, I see scaled images on websites displayed at 300px while the actual image is 2,000px or more.

Some folks argue that WebP is better than JPEG. Jyrki Alakuijala, one of the creators of WebP, on WebP vs JPEG:

  • For high quality photography, I (and butteraugli) believe that JPEG is actually better than WebP.
  • Below JPEG quality 77 WebP lossy wins, above JPEG quality 77 JPEG wins (for photography).
  • This was based on the maximum compression artifact in an image — averaging from 1000 images.

However, in defence of WebP, it also supports transparency and animation.
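If you want WebP’s smaller files with a JPEG fallback for older browsers, the <picture> element is one approach (the file names are placeholders):

```html
<!-- Browsers that support WebP load photo.webp; others fall back to photo.jpg -->
<picture>
  <source srcset="photo.webp" type="image/webp">
  <img src="photo.jpg" alt="Beach wedding in Melbourne" width="800" height="533">
</picture>
```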

From Chrome 76 and Firefox 74, you can use the loading attribute to lazy-load images without the need to write custom lazy-loading code or use a separate JavaScript library.

Resize your images

Consider the guidelines laid out by Google for page load times:

  • Most visitors will bounce if your website takes more than 3 seconds to load; on mobile devices, that’s 2 seconds.
  • Top million sites have improved their speed dramatically in the last decade.
  • Google uses page load as a ranking signal.

There are plenty of tools in the market both paid and free that will offer image compression.

  • Here is a great image resizer that uses your own browser to do the compression.
  • Obviously, there are other options like GIMP on Linux that will also do the job well.
  • Photopea is another excellent in browser tool that is free.

Lazy load images

Use the loading attribute to completely defer the loading of offscreen images that can be reached by scrolling:

<img src="image.png" loading="lazy" alt="…" width="200" height="200">

Use descriptive and concise titles

Descriptive, keyword-rich file names help search engine crawlers understand what’s in the photo.

This borrows from the concept of friendly URLs.

If the crawler can understand the subject matter of the image, it can quickly determine how relevant your content is to certain queries being searched by users.

A default filename like “IMG_722011.JPEG” tells crawlers nothing, so change your image filenames to reflect what they are about.

For example, an image of a beach wedding in Melbourne might be renamed “Melbourne_Beach_Wedding_2018”. This inherently gives your images more SEO value.

You wouldn’t want to use “wedding-1.jpg” or “wedding-2018” as these are generic.
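
Renaming can be automated from a plain-language description. Here is a sketch (a hypothetical helper; the hyphen separator is an assumption, though it is the common convention for friendly URLs) that turns a description into a crawler-friendly filename:

```python
import re

def descriptive_filename(description: str, ext: str = "jpg") -> str:
    """Turn a plain-language description into a crawler-friendly filename."""
    # Lowercase, replace any run of non-alphanumerics with a hyphen,
    # and trim stray hyphens from the ends.
    slug = re.sub(r"[^a-z0-9]+", "-", description.lower()).strip("-")
    return f"{slug}.{ext}"

print(descriptive_filename("Melbourne Beach Wedding 2018"))
# melbourne-beach-wedding-2018.jpg
```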

Image Optimization Best Practices

Best practices for image optimization are:

  • Name your images in plain language
  • Use keywords in description and titles
  • Optimize alt attributes to have an impact, don’t keyword stuff
  • Choose image dimensions wisely to fit the element on your page
  • Reduce the size of all images to make your page load faster
  • Pick the right file types like JPEG for better quality
  • Export JPEGs using a “Save for Web & Devices”-style setting to balance quality and file size
  • Optimize your thumbnail images for social media sites using OGP
  • Use image sitemaps to better inform search engine crawlers
  • Don’t place multiple images on a page or blog post unless each provides impact
  • Test your image optimization with tools like Screaming Frog and Xenu
  • Lazy load your images and videos

At a Google I/O conference, a session titled “How to Stand Out in Search with Structured Data” revealed that in several case studies, sites showed up first because of their schema data implementation.

In fact, some of the interesting numbers that were shown are:

  • 25 percent higher CTR on pages with markup.
  • 35 percent increase in visits for recipes with markup.
  • 1.5x more time spent on pages and 3.6x higher interaction rates.
  • 82 percent increase in CTR for rich snippet results.
  • 20 percent increase in clicks for pages with schema markup.

Schema markup, also called structured data, is essentially a form of microdata embedded into a webpage.

It helps Google and other search engines better understand attributes of the object the page represents.

An example would be a product page with attributes like title, description, price, SKU, color, size, and so on.
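
As a sketch of what that structured data looks like, here is a minimal Product snippet built with Python’s json module (all field values are hypothetical). The resulting JSON-LD would be embedded in the page inside a script tag with type "application/ld+json":

```python
import json

# Hypothetical product attributes, for illustration only.
product_schema = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Canvas Beach Tote",
    "description": "A water-resistant canvas tote bag.",
    "sku": "TOTE-0042",
    "color": "Navy",
    "offers": {
        "@type": "Offer",
        "price": "29.99",
        "priceCurrency": "USD",
    },
}

# Emit the JSON-LD payload for embedding in the page.
print(json.dumps(product_schema, indent=2))
```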

You can find all available types of schema at Schema.org.

How to add schema is beyond the scope of this guide.

You can use an extension if you are using a popular CMS like WordPress.

It is highly recommended that you identify any type of pages that you can markup with schema.

Implementing schema is a very high ROI activity for the purposes of SEO.

Duplicate content can lead to a decline in search rankings.

When two or more pages on a website contain substantially the same content, it can cause problems with search engine crawlers and indexing.

This can lead to lower rankings and visibility for those pages, as well as potential penalties from Google.

There are a number of ways to avoid duplicate content penalties, and understanding the causes of duplicate content is essential for avoiding them.

  • One of the most common causes of duplicate content is inadvertently publishing the same article on multiple pages of a website.
  • Another common cause of duplicate content is using multiple URLs to refer to the same page. This can happen when a website changes its domain name or uses multiple domain names, or when pages are moved without properly redirecting the old URLs to the new ones.
  • There are a number of other causes of duplicate content, such as printer-friendly versions of pages, session IDs in URLs, and pagination.

Canonical URLs are used to identify the original version of a piece of content.

When you use canonical tags on your pages, you tell search engines which version is the original and which should be indexed and ranked.
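
One way to see why multiple URLs for the same page cause trouble is to normalize them. Here is a sketch of URL canonicalization (the rules and parameter names are hypothetical; real-world canonicalization is site-specific) that lowercases the scheme and host, drops tracking and session parameters, and strips trailing slashes:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Hypothetical list of parameters that do not change page content.
TRACKING_PARAMS = {"sessionid", "utm_source", "utm_medium", "utm_campaign"}

def canonicalize(url: str) -> str:
    """Collapse common URL variants onto one canonical form."""
    parts = urlsplit(url)
    # Keep only query parameters that affect the page content.
    query = [(k, v) for k, v in parse_qsl(parts.query)
             if k.lower() not in TRACKING_PARAMS]
    # Strip a trailing slash, but keep the root path.
    path = parts.path.rstrip("/") or "/"
    return urlunsplit((parts.scheme.lower(), parts.netloc.lower(),
                       path, urlencode(query), ""))

print(canonicalize("https://Example.com/blog/?utm_source=twitter"))
# https://example.com/blog
```

The canonical tag tells search engines the same thing this function computes: which single URL all the variants should count toward.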

Outbound links have long been thought to be a positive signal for SEO.

A study of the top 10,000 websites by Moz found that outbound links were correlated with higher Google rankings.

The reasoning behind this is that outbound links show that your website is an authority on a topic.

If you’re linking to other websites, it means that you’re confident enough in your own content to send your users to other places.

Google also looks at the general theme and neighbourhood of your outbound links to determine relevancy.

So if you’re linking to websites that are related to your industry, it shows that you’re an authoritative source of information.

In short, outbound links are a positive signal for SEO and should be included as part of your overall link-building strategy.

You should create a sitemap of your site. This is usually in XML format.

A sitemap is a file where you can list the web pages of your site to tell Google and other search engines about the organization of your site content.

Search engine web crawlers like Googlebot read this file to more intelligently crawl your site.

A sitemap is especially helpful if your site has dynamic content or an extensive archive of pages that are not well linked together.

Creating and submitting a sitemap helps make sure that Google knows about all the pages on your site, including URLs that may not be discoverable by Google’s normal crawling process.

In addition, a sitemap can provide valuable information about when a page was last updated, how often it changes, and how important it is in relation to other pages on your site.

All of this information helps Google index your site more effectively and improve your Google ranking.

While most popular CMS platforms (WordPress, Shopify) will create a sitemap for you automatically, you can also generate one using sitemap extensions or tools.
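
If you needed to roll your own, a minimal sitemap is straightforward to produce. Here is a sketch using Python’s standard-library XML module (the URLs and dates are hypothetical), emitting loc and lastmod entries in the sitemaps.org format:

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Build a minimal XML sitemap from (loc, lastmod) pairs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for loc, lastmod in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        ET.SubElement(url, "lastmod").text = lastmod
    # xml_declaration requires Python 3.8+.
    return ET.tostring(urlset, encoding="unicode", xml_declaration=True)

print(build_sitemap([
    ("https://example.com/", "2023-01-15"),
    ("https://example.com/about", "2023-01-10"),
]))
```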

Once you have created your sitemap, you can submit it to Google via Google Search Console for improved crawling and indexing.

Generally speaking, sitemaps are used to discover new pages on your site, but they can also be useful for keeping track of existing pages.

For example, by including priority and last-modified information in your sitemap, you can help ensure that search engines are indexing your most important and up-to-date content first.

Overall, sitemaps are an important part of SEO and should be included as part of any comprehensive optimization strategy.

Over the years, there has been much debate about the role of domain trust in SEO.

Some believe that it is a significant ranking factor, while others contend that it is relatively unimportant.

So, what is the truth?

Domain trust is a measure of how much Google trusts a domain.

This trust is based on a number of factors, including domain age, number of backlinks, and the quality of the content.

Generally speaking, the longer a domain has been around, and the more high-quality links it has, the higher its domain trust will be.

A Google patent titled “search result ranking based on trust” backs up the claim about domain trust being a ranking signal.

The use of keywords in a domain name used to be a key factor in boosting SEO, but that is no longer the case.

However, keywords in a domain name are still seen as a relevancy signal by search engines. This means that having keywords in your domain name can still be beneficial, even though it will not have the same impact as it once did.

When choosing a domain name, look for one that is relevant to your business and contains keywords that you want to rank for. In addition, try to choose a short and simple domain name that is easy to remember and type.

In 2019, John Mueller confirmed that Google uses HTTPS as a light-weight ranking factor.

With the advent of Let’s Encrypt, SSL certificates are free.

There is no reason to have a website without HTTPS.

It has long been the case that sites lacking HTTPS encryption were at a ranking disadvantage compared to sites with HTTPS encryption.

That disadvantage is now far greater under the current iteration of Google’s search algorithm, as Google Chrome has begun marking any site lacking HTTPS encryption as not secure.

Obviously, a site marked as unsafe is unlikely to perform well in search engine rankings.

More than 90 percent of shoppers have said that their decision on whether or not to purchase something from a business is influenced by online reviews.

According to a recent survey, more than 80 percent of consumers stated that they trust online reviews of a business as much as a personal recommendation from friends or family.

Google itself has stated that being bad to customers is bad for business.

An even higher percentage of those online customers use those reviews to make informed decisions on what they want to buy.

If you run Google ads or Facebook ads, chances are you can leverage these reviews to improve your conversion rate.

If a customer is complaining about poor customer service, a lousy return policy, or long wait times, take that criticism and use it as a tool to improve your business and address the root of the issue.