Website Launch Checklist & Quality Assurance (QA) Process

Just shy of 400 new websites are launched on the internet every minute of the day, yet think how few of them actually go through a proper website launch checklist. Taking the time to run through a solid quality assurance (QA) process and launch checklist might be all it takes to put you one step in front of your competition.

According to a study released early in 2019, 59% of small businesses in Australia alone are yet to have a website. A year later, the world of business is moving at a rapid pace to get online and – more importantly – to stand out.

This article isn’t about what makes a good website. It’s here to help you verify that the site you already have, or are about to launch, has been through some basic quality assurance checks and will perform well out of the box from day one. Consider this website launch checklist a process similar to checking your tyre pressure, oil and lights on your car before you depart on a road trip.

Visual and functional QA checks

Start with the visual and functional inspections, as these are the most obvious checks and cover what your potential visitors will see first.

As per design or brief

The website should match the brief and the design you signed off on. If no mobile device designs were mocked up in the planning phase, make a visual judgement or clarify any choices with your designer.

Browser compatibility

Test against the latest mainstream browsers and ensure compatibility on the mobile and desktop browsers listed below. It’s also useful to test on a variety of mobile devices.

Web browsers

  • IE 11+ (lower versions of IE can be supported where applicable)
  • Microsoft Edge (latest stable version)
  • Firefox (latest stable version)
  • Safari (latest stable version)
  • Google Chrome (latest stable version)

Mobile devices

  • iOS 11+ (iPhone, iPad)
  • Android 5.1+

Image types & quality

Check that the correct type of image has been used for the right purpose.

The three main options are SVG, PNG & JPG. You may also see others on the rise, such as WebP, however its web browser compatibility is still fairly limited. WebP isn’t a format you’ll typically have to save your images as; it’s an image format that your content management system (CMS) can convert to and serve up automatically for supported browsers.

When you’re extracting assets or selecting images to upload to your website, keep in mind these three guidelines:

  • JPG: Photographs or images with many gradients.
  • PNG: Complex illustrations, anything with transparency or fine line detail (including text if necessary).
  • SVG: Icons, simple vector shapes or logos. These can also be embedded directly into the page to reduce page requests (see the sketch below).
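
To illustrate that last point, here’s a minimal sketch of an inline SVG icon – the markup below is a made-up circle icon, rendered by the browser without an extra HTTP request:

  <!-- Inline SVG: drawn directly from the markup, no separate image file requested -->
  <svg width="24" height="24" viewBox="0 0 24 24" xmlns="http://www.w3.org/2000/svg">
    <circle cx="12" cy="12" r="10" fill="#0066cc" />
  </svg>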

Testing information and placeholders

During the website build and development process you end up with lots of placeholder assets and test data on the website, so it’s important to make sure it’s all replaced or deleted before launching. Items to look out for include the following.

  • Placeholder text (typically lorem ipsum)
  • Placeholder images
  • Test form submissions
  • Test users/member signups
  • Test email subscriptions
  • Example pages
  • Example blog posts

Favicon

A custom favicon should be created to match the logo, and should display correctly for your users.
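
If your CMS doesn’t handle this automatically, favicons are typically referenced in the <head> of every page. A minimal sketch, assuming the icon files have been exported to the site root:

  <!-- Classic .ico fallback plus a scalable SVG and an iOS home screen icon -->
  <link rel="icon" href="/favicon.ico" sizes="any">
  <link rel="icon" href="/favicon.svg" type="image/svg+xml">
  <link rel="apple-touch-icon" href="/apple-touch-icon.png">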

Links

All links are to have hover states. Typically the colour of the text changes on hover to allow the user to identify that it’s a clickable link. It’s important that this hover state has enough contrast to be legible, and also matches your site’s branding.

Fonts

Confirm that the correct fonts are used as per the original design, and that any commercial font has been properly licensed.

Non-standard pages

  • 404 error pages have been styled to match the site design.
    You can even make these pages fun and engaging for the user. Best practice is to include a search input box on the page to help get the user to where they need to be.
  • Search results page has been styled to match the site design.
    Again, you can take advantage of these pages to recommend related high-value content, offers or products that typically attract your ideal customers.

Content

Pages have a clear heading tag hierarchy and no spelling or grammatical mistakes. A well-planned hierarchy will make your site more accessible for users accessing the page via a screen reader or other accessibility tools. It’ll also be clearer for any search engine robots that come across the page.

Technical website checks

To perform technical website audits, Screaming Frog is a great tool for manual technical checks, but if you’re looking to be efficient and instantly identify quick wins, then SEMrush provides an automated website auditing tool and comprehensive marketing suite. These tools crawl over your website just like Google’s web crawler and allow you to audit and analyse the frontend technical elements that make up your website. They will get you through your website launch checklist in a matter of minutes, rather than hours.

No placeholder links found

Search your site’s code for href="#" to ensure there are no internal links that haven’t yet been populated.

Email addresses are encoded

To prevent automated crawlers and spam bots from harvesting your contact email addresses, it’s best to encode them so that bots scanning your website code for the @ symbol can’t collect them for any nefarious purpose.
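
One common approach is HTML entity encoding, where each character is swapped for its numeric entity. Browsers render it normally, but naive scrapers hunting for the @ symbol miss it. A sketch using a made-up address (hi@example.com):

  <!-- Renders as a normal "hi@example.com" mailto link in the browser -->
  <a href="&#109;&#97;&#105;&#108;&#116;&#111;&#58;&#104;&#105;&#64;&#101;&#120;&#97;&#109;&#112;&#108;&#101;&#46;&#99;&#111;&#109;">
    &#104;&#105;&#64;&#101;&#120;&#97;&#109;&#112;&#108;&#101;&#46;&#99;&#111;&#109;
  </a>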

All hyperlinks lead to correct pages

Website page URLs are to use either a trailing slash "/" or no trailing slash (NOT a mixture of both). It has to be one or the other to avoid duplicate content and incorrect indexing.

Some sites solve this issue with canonical link tags, or a redirect – just be sure it’s consistent.
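
For reference, a canonical tag is a single line in the page’s <head> pointing at the preferred URL. A sketch, assuming the trailing-slash version of a made-up page is the chosen variant:

  <!-- Tells search engines which URL variant is the "real" one -->
  <link rel="canonical" href="https://www.example.com/services/">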

Valid heading structure

Valid H1, H2, H3 heading structure. Each <section>, <article>, <header> etc. element is able to have its own hierarchy of h1, h2, h3 tags. This allows for multiple levels of importance to be given to each part of a section.
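
As a simple illustration, a clean hierarchy on a hypothetical services page might look like this:

  <h1>Our Services</h1>
    <h2>Web Design</h2>
      <h3>E-commerce</h3>
      <h3>Brochure Sites</h3>
    <h2>Digital Marketing</h2>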

Meta title format

USE - (hyphen) NOT | (pipe)

This is a personal preference, but hyphens look cleaner in search engine results pages such as Google so we recommend going with hyphens in our default title format.
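
For example, a page title in that format (the names are made up):

  <title>Website Launch Checklist - Example Agency</title>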

Meta title content

By default meta titles should automatically match the name of the page, unless a custom title has been entered into the SEO title field in the CMS.

Meta keywords

The keywords meta field is to be content manageable in the CMS. Don’t have a global default keyword set. This is to ensure you don’t end up with duplicated meta keywords across every page.

Meta description

The descriptions meta field is to be content manageable in the CMS. Don’t have a global default description set. Again this is to ensure you don’t end up with the same meta description on every single page.

Image alt text

All images are to have alt text, with a custom description entered for each image. The image alt text is to be content manageable via the CMS. Alt text is also helpful for any users using accessibility tools.

Tip: Keep the image alt text under 100 characters.
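
A quick sketch of descriptive alt text on a made-up product photo:

  <img src="/images/red-leather-satchel.jpg" alt="Red leather satchel with brass buckles on a white background">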

Image sizes acceptable

Image file sizes are to be under 100kb where possible, and the CMS should automatically compress images on upload.

Tip: Both PNG and JPG can usually be compressed to a much smaller size than is produced by Photoshop or other photo editing software. We recommend using TinyPNG.com or Compressor.io to do this. In certain cases, reductions of up to 75% can be achieved. This is obviously a huge benefit for web page speed & SEO.

Internal linking issues

  • No 40X or 50X errors found on any internal or external link. These are bad for user experience and also consume search engine crawl budget.
  • No internal links to 301 redirects found. If an internal link points at a 301 redirect, it should be updated to the redirect’s destination. You want to avoid redirect hops, both for speed and to preserve search engine crawl budget.

Fonts

Fonts have been hosted locally in the website files where possible to avoid DNS lookups and additional network response time. This helps towards your site’s overall speed optimisation.
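
A minimal sketch of locally hosted font loading, assuming the files sit in a /fonts/ directory (the family name and filenames are made up):

  <style>
    @font-face {
      font-family: "Example Sans";
      src: url("/fonts/example-sans.woff2") format("woff2");
      font-display: swap; /* show fallback text while the font file loads */
    }
  </style>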

Forms

  • All forms to redirect users to a dedicated ‘Thank You’ page for each form upon submission.
  • All forms to have in-page JavaScript validation to ensure required fields are entered, and errors are displayed on-page in a usable fashion (sketched after this list).
  • Email subscribe and opt-in forms tested to ensure email addresses are added to the chosen mailing list, CRM or EDM software.
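
A minimal sketch of in-page validation on a bare-bones contact form – the field names, endpoint and error copy below are all made up:

  <form id="contact-form" action="/contact" method="post" novalidate>
    <input type="email" name="email" required placeholder="Your email">
    <p class="error" hidden>Please enter a valid email address.</p>
    <button type="submit">Send</button>
  </form>
  <script>
    // Block submission and show the error message if required fields fail validation
    var form = document.getElementById("contact-form");
    form.addEventListener("submit", function (event) {
      var valid = form.checkValidity();
      form.querySelector(".error").hidden = valid;
      if (!valid) event.preventDefault();
    });
  </script>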

Thank you pages

Thank you pages have a noindex meta tag, i.e. <meta name="robots" content="noindex">.

This is to prevent the thank you page from appearing in search engines, which in turn stops conversion codes or Google Analytics events firing when the page is visited directly rather than after a genuine submission.

Tracking codes

Relevant tracking codes have been implemented on the website. This will likely include Google Analytics/Tag Manager and Facebook Pixel.
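
As one example, the standard Google Analytics gtag.js snippet sits in the <head> of every page. GA_MEASUREMENT_ID below is a placeholder for your own property ID:

  <!-- Global site tag (gtag.js) - Google Analytics -->
  <script async src="https://www.googletagmanager.com/gtag/js?id=GA_MEASUREMENT_ID"></script>
  <script>
    window.dataLayer = window.dataLayer || [];
    function gtag(){dataLayer.push(arguments);}
    gtag('js', new Date());
    gtag('config', 'GA_MEASUREMENT_ID'); // replace with your measurement ID
  </script>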

Google and Facebook both offer extensions for Chrome to help validate your tracking. Check out Google Tag Assistant and Facebook Pixel Helper.

Phone numbers

All phone numbers should be coded with a tel link (href=tel) to ensure that mobile users can simply tap on it to call.
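
A sketch with a made-up number – the href should use the full international format even if the visible text doesn’t:

  <a href="tel:+61212345678">(02) 1234 5678</a>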

Social profiles

Links on social profile icons are to open up in a new browser tab. This ensures that the user isn’t completely taken away from the website. Having the website tab still open allows an easy path back whilst they’re browsing your social profiles.
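
This is done with target="_blank"; adding rel="noopener" alongside it is good practice so the new tab can’t script against yours. A sketch with a made-up profile URL:

  <a href="https://www.facebook.com/examplecompany" target="_blank" rel="noopener">Facebook</a>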

HTTPS / SSL certificates

Most sites now connect using https:// rather than http:// – this means the connection to the site is secured using an SSL certificate.

Tip: High-grade SSL certificates are now free via the Let’s Encrypt project, so there are no excuses!

WWW or Non-WWW

The website should be configured with only a single variant: either with the WWW subdomain, or without it (non-WWW). The variant that isn’t used should 301 redirect to the chosen variant. This avoids duplicate content and search engine indexing issues. As with the trailing slash point above, consistency is key.
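
As a minimal sketch – assuming an Apache server and non-WWW as the chosen variant – the redirect could live in an .htaccess file:

  RewriteEngine On
  # 301 redirect any www. request to the bare domain, keeping the requested path
  RewriteCond %{HTTP_HOST} ^www\.(.+)$ [NC]
  RewriteRule ^ https://%1%{REQUEST_URI} [R=301,L]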

Minification

JavaScript and CSS are to be minified to improve page speed performance. This can be achieved within the frontend development workflow, the CMS, or a content delivery network (CDN) if one is being used.

Microdata

Basic structured data and Open Graph tags are built into the web page code.

These code snippets are identifiers for Google, Facebook and the like, and in some cases, depending on the chosen CMS, will need to be content manageable.

This microdata could be as simple as a company name, URL and logo, or as advanced as an itemscope designating the name of a product, a restaurant, cinema movie times or any number of other examples. There are too many to list here, so I recommend you check over the schema.org list of schemas to determine what’s suitable for your website. Ideally, this is something you put into your website brief and agree upon in your website scope, because it can get complex beyond the basics.
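
As a minimal sketch of the simple end of that spectrum, company details can be expressed as JSON-LD structured data alongside basic Open Graph tags (all values below are made up):

  <script type="application/ld+json">
  {
    "@context": "https://schema.org",
    "@type": "Organization",
    "name": "Example Company",
    "url": "https://www.example.com",
    "logo": "https://www.example.com/logo.png"
  }
  </script>
  <meta property="og:title" content="Example Company">
  <meta property="og:url" content="https://www.example.com">
  <meta property="og:image" content="https://www.example.com/logo.png">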

Feeds

  • RSS feeds configured (Blogs and products are typical use cases).
  • Sitemaps configured (Pages, posts and products are typical use cases).

These URLs help search engines learn about new content and allow RSS readers to easily convert the content into their published format. You will also want to submit these feeds to both Google Search Console and Bing Webmaster Tools for increased search engine exposure.

Robots permitted

Check that robots are permitted to view the site. While your website is still in development, it should deny robots from both crawling and indexing it. The purpose of this is so that your development or staging site doesn’t end up in the Google search results prematurely.

There are two primary ways of denying robots:

  1. A noindex meta tag on all pages: <meta name="robots" content="noindex">.
  2. A disallow rule (Disallow: /) in a robots.txt file such as thisexamplewebsite.com/robots.txt

When you go live, or are already live, check that:

<meta name="robots" content="noindex"> changes to <meta name="robots" content="all">

and

Disallow: / changes to Allow: /
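
For reference, a minimal live robots.txt might look like this – the sitemap URL is a made-up example that ties in with the Feeds point above:

  User-agent: *
  Allow: /

  Sitemap: https://www.example.com/sitemap.xml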

Automated testing and validations tools

If you’re not experienced in web development, the tools listed below are a good basic guide. However, certain elements of how a website works can produce unhelpful and subjective results in these generalist testing tools. A great example: ideally all your JavaScript code would be hosted locally on your website, so if you’re using Google Analytics or your website is on Shopify, these tools will score you down straight off the bat.

A short list of tools you may want to run your site through:

  • Evaluate page speed performance (aim to have pages load in under 3s).
  • WCAG accessibility compliance is validated if the website requires this level of accessibility (typically a requirement of government projects).
  • RSS passes feed validation.
  • Structured data is visible in Google’s testing tool.

For an all-in-one automated auditing tool, use SEMrush.

Does your website pass our website launch checklist?
