Comparison of Site Alarm and Screaming Frog

Screaming Frog is a website analysis tool. Its job is to collect varied SEO-related information about a site and help specialists analyze it. Its reports are background information, food for thought, rather than a troubleshooting tool.

Site Alarm is a site audit tool: it finds errors and gives clear recommendations for fixing them. Every line of its report is a detected problem that should be corrected.

Below we list the Screaming Frog functions that relate specifically to site auditing rather than analysis:

  • Broken links: broken links and server errors.
  • Redirects: permanent, temporary, and JavaScript redirects, as well as meta refresh redirects.
  • Blocked URLs.
  • Blocked resources: view and audit resources blocked in rendering mode.
  • External links: view all external links, their status codes, and source pages.
  • Security: detect insecure pages, content loaded over insecure connections, insecure forms, and missing security headers.
  • URL problems: uppercase characters or overly long URLs.
  • Page titles: missing or multiple title tags.
  • Meta descriptions: missing, duplicate, long, short, or multiple descriptions.
  • File size: the size of URLs (pages) and images.
  • Response time: how long pages take to respond to requests.
  • Expires header.
  • H1: missing, duplicate, long, short, or multiple headings.
  • Meta tag [http-equiv="refresh"]: including the landing page and time delay.
  • Redirect chains: discover redirect chains and loops.
  • hreflang attributes: check for missing confirmation links, inconsistent and incorrect language codes, and non-canonical hreflang.
  • Rendering: crawl links generated dynamically by JavaScript frameworks such as AngularJS and React.
  • Images: missing alt attributes.
  • Sitemap analysis: availability, correctness, and search for broken pages.
  • Structured data and validation: check structured data against the Schema.org specification and Google search features.
  • Spelling and grammar: in headings and meta tags.
  • AMP checks.
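
To make the difference in approach concrete, here is a minimal Python sketch, not the code of either tool, of a few checks from the list above: redirect chains and loops, status codes, meta description length, missing alt attributes, and broken outgoing links. The 160-character description limit and the example URL are assumptions for illustration; the sketch relies on the third-party requests and beautifulsoup4 packages.

```python
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin

def audit_page(url, max_redirects=5):
    """Return a list of problems found on a single page."""
    problems = []

    # Follow the redirect chain manually to detect chains and loops.
    seen, current = [], url
    while len(seen) <= max_redirects:
        resp = requests.get(current, allow_redirects=False, timeout=10)
        if resp.status_code not in (301, 302, 303, 307, 308):
            break
        seen.append(current)
        current = urljoin(current, resp.headers.get("Location", ""))
        if current in seen:
            problems.append(f"redirect loop: {' -> '.join(seen)}")
            return problems
    else:
        problems.append(f"more than {max_redirects} redirects from {url}")
        return problems

    if len(seen) > 1:
        problems.append(f"redirect chain: {' -> '.join(seen + [current])}")
    if resp.status_code >= 400:
        problems.append(f"{current}: HTTP {resp.status_code}")
        return problems

    soup = BeautifulSoup(resp.text, "html.parser")

    # Meta description: missing, empty, or overly long.
    desc = soup.find("meta", attrs={"name": "description"})
    if desc is None or not desc.get("content", "").strip():
        problems.append("missing meta description")
    elif len(desc["content"]) > 160:  # assumed limit, for illustration
        problems.append("meta description longer than 160 characters")

    # Images with a missing or empty alt attribute.
    for img in soup.find_all("img"):
        if not img.get("alt"):
            problems.append(f"image without alt text: {img.get('src', '?')}")

    # Broken outgoing links: one HEAD request per absolute link.
    for a in soup.find_all("a", href=True):
        target = urljoin(current, a["href"])
        if target.startswith("http"):
            try:
                status = requests.head(target, allow_redirects=True,
                                       timeout=10).status_code
            except requests.RequestException:
                status = None
            if status is None or status >= 400:
                problems.append(f"broken link: {target} ({status})")

    return problems

# Hypothetical usage; each reported line is one problem to fix.
for problem in audit_page("https://example.com/"):
    print(problem)
```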

Missing from the list:

  • Duplicate or suboptimal-length H1 tags: this tag has lost its significance as search engine algorithms have evolved, so such checks are now of little relevance.
  • URL problems such as non-ASCII characters, underscores, and parameters: these checks are harmful, since such URLs are perfectly valid.
  • Exact and near-duplicate page detection using advanced algorithmic checks: this is analytical data meant to make duplicate pages easier to find, but whatever the algorithm, it produces many false positives and false negatives, making the check almost useless (see the similarity sketch after this list).
  • The keywords meta tag: this meta tag has not been relevant for a long time.
  • Checking H2 headings: a redundant and useless check.
  • Robots meta tags (index, noindex, follow, nofollow, noarchive, nosnippet) and the X-Robots-Tag HTTP header: search engines have long since learned to work without these directives. Pages should be closed from indexing via robots.txt instead (see the sketch after this list).
  • Searching for pages missing from the sitemap: a sitemap should not contain every page of the site, only those that should be indexed by search engines. Automatically adding all pages to this file is harmful.
  • A complete spelling and grammar check: checking page grammar does not pay off, as it produces a huge number of false positives, and reviewing such a long list is time-consuming. All texts on a site are edited by a person, using office software with built-in grammar checking, before publication.
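
The near-duplicate point can be made concrete. The sketch below is not Screaming Frog's actual algorithm; it uses word-shingle Jaccard similarity, one common technique for this task, with invented page texts and an arbitrary threshold. Because the shared boilerplate (menu, footer) dominates both pages, two clearly different product pages get flagged as near-duplicates: a false positive.

```python
def shingles(text, k=3):
    """Set of k-word shingles of a text."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a, b):
    return len(a & b) / len(a | b) if a | b else 0.0

# Shared boilerplate: navigation, footer, and so on (made up).
boilerplate = ("home catalog delivery payment contacts cart sign in "
               "help center all prices include tax free shipping on orders "
               "subscribe to our newsletter all rights reserved")
page_a = boilerplate + " red ceramic mug 300 ml dishwasher safe"
page_b = boilerplate + " blue ceramic mug 400 ml dishwasher safe"

sim = jaccard(shingles(page_a), shingles(page_b))
print(f"similarity: {sim:.2f}")   # boilerplate dominates, so this is high
if sim > 0.6:  # arbitrary threshold, for the sketch only
    print("flagged as near-duplicates, though the products differ")
```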
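
To illustrate the approach argued for in the robots and sitemap items, here is a minimal Python sketch that closes non-indexable pages in robots.txt and writes only indexable pages to sitemap.xml. The page inventory, paths, and domain are all hypothetical.

```python
from urllib.parse import urlparse
from xml.etree import ElementTree as ET

# Hypothetical page inventory: which URLs should be indexed.
pages = [
    {"url": "https://example.com/", "index": True},
    {"url": "https://example.com/pricing", "index": True},
    {"url": "https://example.com/cart", "index": False},    # service page
    {"url": "https://example.com/search", "index": False},  # internal search
]

# robots.txt: disallow the pages that should stay out of the index.
robots = ["User-agent: *"]
robots += [f"Disallow: {urlparse(p['url']).path}"
           for p in pages if not p["index"]]
robots.append("Sitemap: https://example.com/sitemap.xml")
print("\n".join(robots))

# sitemap.xml: only the pages meant for indexing, not the whole site.
urlset = ET.Element("urlset",
                    xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for p in pages:
    if p["index"]:
        ET.SubElement(ET.SubElement(urlset, "url"), "loc").text = p["url"]
ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8",
                             xml_declaration=True)
```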