You know the struggle… you just need these four or five tickets taken care of and it would mean so much to your SEO targets for the month.
But how can you get your web developers on board?
How can you help them grasp the gravity of your SEO expectations when they have so many other competing concerns on their plate?
Fifteen years ago, I could manage around 90 percent of my SEO job for a given company alone.
Those days are gone. SEO currently depends on content production, UX, code development, IT, multiple layers/levels of approvals, and more.
I have written many times about how SEO can’t be done in a silo and am happy it’s a discipline that now focuses more on alignment for providing a great experience for website visitors.
Over my career, there has always been a requirement for the services of web developers.
That meant walking down the hall at my agency or interacting with a third-party developer contracted or retained by my clients.
In every circumstance, having buy-in and collaboration from web development is vital for SEO.
Even better is when developers have an understanding of SEO basics.
It is substantially more effective if developers know the fundamentals and implement them into their builds and site maintenance, avoiding any re-work later.
Check out these 10 must-know SEO basics for web developers, drawn from focus group chats with my teams of SEO specialists and developers.
Security
Website security matters to the search engines.
Make sure you have an SSL certificate in place and working without any issues.
That’s the starting point.
Beyond that, have the necessary safeguards in place to ensure the site has no vulnerabilities that allow for injections, unauthorised changes, and so on.
Getting hacked at any level hurts user experience and trust signals for users and search engines.
However, be cautious of site speed (more to come on that) when you safeguard the site with any plugins, extensions, or tools.
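As a quick sanity check on the SSL point, a short Python sketch like the one below (using only the standard library, with the hostname as an illustrative parameter) can confirm that a site's certificate validates and isn't about to expire:

```python
import ssl
import socket
from datetime import datetime, timezone

def days_until_expiry(not_after: str) -> int:
    """Parse a certificate's notAfter field (e.g. 'Jun  1 12:00:00 2026 GMT')
    and return the number of whole days until it expires."""
    expires = datetime.strptime(not_after, "%b %d %H:%M:%S %Y %Z")
    expires = expires.replace(tzinfo=timezone.utc)
    return (expires - datetime.now(timezone.utc)).days

def check_certificate(hostname: str, port: int = 443) -> int:
    """Open a TLS connection and return days until the cert expires.
    Raises ssl.SSLError if the certificate fails validation."""
    ctx = ssl.create_default_context()
    with socket.create_connection((hostname, port), timeout=10) as sock:
        with ctx.wrap_socket(sock, server_hostname=hostname) as tls:
            cert = tls.getpeercert()
    return days_until_expiry(cert["notAfter"])
```

Calling `check_certificate("example.com")` will raise an `ssl.SSLError` on a broken or mismatched certificate, and a small (or negative) return value is your cue to renew.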
Response Codes
Server response codes matter.
There are often ways to get a page to render for a user, and unique UX designs sometimes require creative dev solutions.
Regardless, make sure pages are displaying 200 server codes.
Track down and fix any 3xx or 4xx codes. If you don’t need redirects, eliminate them.
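A small audit script, sketched below with the standard library, can surface the raw response codes; the key detail is disabling urllib's automatic redirect handling so a 301 reports as a 301 rather than as the final 200:

```python
from urllib import request, error

class NoRedirect(request.HTTPRedirectHandler):
    """Stop urllib from following redirects so we see the raw status code."""
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None

def raw_status(url: str) -> int:
    """Return the server's actual response code for a URL, without
    following redirects."""
    opener = request.build_opener(NoRedirect)
    try:
        return opener.open(url, timeout=10).status
    except error.HTTPError as e:  # 3xx/4xx/5xx arrive as exceptions
        return e.code

def needs_attention(status: int) -> bool:
    """Flag anything that isn't a clean 200 for follow-up."""
    return status != 200
```

Run `raw_status` over a crawl list and review every URL where `needs_attention` comes back true.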
Redirects
Speaking of redirects, they are a significant component of the website migration and launch process when moving from an old site to a new one.
If you do nothing else in your launch process, at least set up redirects.
We’re talking about making sure all URLs from the old site have a 301 redirect to the most relevant subject-matter page on the new site.
This may be 1:1 old site to new site pages or many to one if you are streamlining and adjusting content structure.
Just like with the server codes mentioned above, you shouldn’t assume that since a page is rendering that everything is well.
Make use of the tools available to verify that redirects are true 301s.
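One way to guarantee every migration redirect is permanent is to generate the server rules from a single old-to-new mapping. Here is a minimal sketch for an nginx setup (the paths are illustrative, and the approach assumes exact-match old URLs):

```python
def nginx_301_rules(redirect_map: dict[str, str]) -> str:
    """Turn an old-path -> new-path mapping into permanent (301)
    nginx rewrite rules, one rule per line."""
    lines = []
    for old, new in redirect_map.items():
        # 'permanent' makes nginx respond with a 301, not a 302
        lines.append(f"rewrite ^{old}$ {new} permanent;")
    return "\n".join(lines)
```

For example, `nginx_301_rules({"/old-page": "/new-page"})` produces a single `rewrite` rule ending in `permanent;`, and a many-to-one consolidation is just several old paths mapped to the same new path.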
The Robots.txt file
Nothing matters in terms of search engine optimization (SEO) if the website in question cannot be indexed and shown in search results.
It is imperative that the robots.txt file not be treated as an afterthought.
There are instances when the default commands are excessively liberal, and other times when they are too restrictive.
Be familiar with the contents of the robots.txt file.
Do not send the file from staging to production without first doing an inspection of it.
Many websites with wonderful migration and launch preparations have failed because a disallow-all directive from staging, intended to keep the development site from being indexed, was inadvertently pushed to the live site.
Additionally, you should think about blocking low-value content such as tag pages, comment pages, and any other variants your CMS generates.
You will typically have to account for a large quantity of low-value pages, and if you can’t stop them from being created, at the very least keep the search engines away from them.
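As a rough illustration, a robots.txt along these lines blocks the common low-value CMS variants while leaving the rest of the site open (the paths are examples only; match them to what your CMS actually generates):

```
# Paths below are illustrative — adjust to your CMS
User-agent: *
Disallow: /tag/
Disallow: /comments/
Disallow: /*?replytocom=

Sitemap: https://example.com/sitemap.xml
```

Review this file before every deploy so a staging disallow-all never reaches production.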
XML Sitemaps
XML sitemaps give us the opportunity to ensure that search engines are aware of every page on our website.
Don’t waste crawl resources and opportunities by listing images, insignificant pages, and content that shouldn’t be targeted for attention and indexing.
Make certain that all pages that are listed in XML sitemaps return a value of 200 from the server.
Maintain a clean and clutter-free environment by removing any 404s, redirects, and other pages that aren’t the target page.
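The clean-up rule above is easy to automate. A sketch like this (assuming you already have each URL's status code from a crawl) builds a sitemap containing only pages that return a clean 200:

```python
from xml.etree import ElementTree as ET

def build_sitemap(pages: list[tuple[str, int]]) -> str:
    """Build an XML sitemap from (url, status_code) pairs, keeping only
    pages that return a clean 200 — no 404s, no redirects."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for url, status in pages:
        if status != 200:
            continue  # skip 404s, redirects, and other non-target pages
        entry = ET.SubElement(urlset, "url")
        loc = ET.SubElement(entry, "loc")
        loc.text = url
    return ET.tostring(urlset, encoding="unicode")
```

Regenerate the file whenever the crawl data changes, so the sitemap never advertises a URL the server can't cleanly serve.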
Uniform Resource Locators (URLs)
A good URL should be short, contain terms relevant to the page’s subject matter, be written in lower case, and avoid special characters, spaces, and underscores.
I get really excited when I find a site that has a navigation and site structure that mirrors the content hierarchy in its URL structure with subfolders and pages.
What does a page three levels deep look like?
Something like this: “example.com/level-1/level-2/topical-page.”
Again, keep in mind that just because something functions properly or appears to be in good shape in a browser does not automatically mean that it is optimised for a search engine.
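Those URL rules can be enforced in the build itself. Here is a minimal slug generator, as one possible sketch, that produces short, lowercase, hyphenated paths with no spaces, underscores, or special characters:

```python
import re

def slugify(title: str) -> str:
    """Turn a page title into a lowercase, hyphenated URL slug with
    no spaces, underscores, or special characters."""
    slug = title.lower()
    slug = re.sub(r"[^a-z0-9]+", "-", slug)  # collapse everything else to hyphens
    return slug.strip("-")
```

For instance, `slugify("10 Must-Know SEO Basics!")` yields `10-must-know-seo-basics`, which can then be nested under the subfolders that mirror your content hierarchy.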
Mobile-Friendly
Your site must be optimised for mobile devices.
Use the mobile-friendly test tool provided by Google to verify it.
Make sure it passes.
In addition to that, you should think about the content that is provided in the mobile version.
Google employs a “mobile-first” indexing strategy.
That means it is looking at the mobile version of the website.
If you hide or omit important content in the mobile version for the sake of user experience, be aware that it may not be present in what Google sees, and search engines will not be able to index it.
Site Speed
This is the eighth item on the list, but after making sure your website can be indexed, it’s probably the most important.
Slow-loading pages and websites result in a poor user experience and low conversion rates.
They also hurt SEO performance.
There is no one strategy that can be used to improve the overall performance of a website.
It really comes down to maintaining a code base that is lean, exercising moderation in the use of plugins and extensions, ensuring that your hosting environment is optimised, compressing and minifying JS and CSS, and maintaining an appropriate level of control over the file sizes of your images.
Any code, files, or components that can degrade performance or introduce instability represent a risk.
Build safeguards into your content management rules so that, for example, a 10-megabyte image can’t be uploaded and render a page unusable, or a plugin update can’t be installed without anyone noticing how much it slows things down.
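The image safeguard can be as simple as a validation hook in the upload path. A sketch, with the size cap and accepted extensions as illustrative choices you would tune to your own baseline:

```python
MAX_IMAGE_BYTES = 2 * 1024 * 1024  # 2 MB cap — tune to your own baseline

def validate_upload(filename: str, size_bytes: int) -> None:
    """Reject oversized or unexpected images at upload time so a single
    10 MB photo can't quietly wreck page load times."""
    if not filename.lower().endswith((".jpg", ".jpeg", ".png", ".webp")):
        raise ValueError(f"unsupported image type: {filename}")
    if size_bytes > MAX_IMAGE_BYTES:
        raise ValueError(
            f"{filename} is {size_bytes / 1_048_576:.1f} MB; "
            f"max allowed is {MAX_IMAGE_BYTES / 1_048_576:.0f} MB"
        )
```

Wire this into the CMS upload handler so the error reaches the content creator, not the page visitor.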
Establish a baseline, continuously monitor it, and strive to make the site faster.
The tool most popular with my lead developer is web.dev, which is powered by the same Lighthouse engine found in the developer tools of the Google Chrome browser.
Heading Tags
Heading tags provide context that is extremely important for search engines.
Remember that they should be used for content, not as shortcuts in CSS.
Bind your CSS to them, but make sure to keep them in order of relative importance.
It is not appropriate for the first and largest heading on a page to be an H5 with the subheadings as H1s.
There is a lot of controversy surrounding the effect that headers have (or do not have) on the performance of SEO.
In this particular piece, I will not be going there.
Simply try to be as precise as you can in describing the hierarchy and how headings are used within it.
When possible, use them in place of other CSS styling.
Have just one H1 on a page if you can.
Collaborate with your SEO staff to gain an understanding of the overall plan for headers and the content that appears on the page.
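These rules lend themselves to an automated check in a build or QA step. A minimal sketch, assuming you can extract the page's heading tags in document order:

```python
def heading_issues(headings: list[str]) -> list[str]:
    """Check a page's heading sequence (e.g. ["h1", "h2", "h2", "h3"])
    for common problems: multiple H1s, not starting at H1, skipped levels."""
    issues = []
    levels = [int(h[1]) for h in headings]
    if levels.count(1) > 1:
        issues.append("more than one H1 on the page")
    if levels and levels[0] != 1:
        issues.append(f"page starts at H{levels[0]}, not H1")
    for prev, cur in zip(levels, levels[1:]):
        if cur > prev + 1:
            issues.append(f"jump from H{prev} to H{cur} skips a level")
    return issues
```

An empty result means the hierarchy is clean; anything else is worth a conversation with the SEO team before it ships.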
Content Management & Dynamic Content
As mentioned earlier, CMS functionality can get in the way of the best development solutions.
Be as judicious as possible with the control you offer.
To ensure that content creators have the control they want and need without jeopardising site speed or any of the on-page SEO aspects, it is important to have an understanding of the site’s ongoing content plan and demands.
Automating as many features as possible, such as tagging, XML sitemap creation, and redirects, can save time while also helping to secure your website and keep your code stable.
Collaboration between those who practise SEO and those who develop websites is absolutely necessary.
Search engine optimisation depends on technical SEO best practices as well as other aspects such as enterprise scaling of on-page elements.
When it comes to good cooperation and SEO performance, having developers who are familiar with SEO principles can go a long way.
In addition, it has the potential to make the work of creating a website more efficient and reduce the amount of rework or “SEO-specific” adjustments and demands that are necessary.