Page load speed, among other Core Web Vitals, is a known Google organic ranking factor. While we have the PageSpeed Insights tool, it unfortunately only works on one page at a time.
The Page Timings report in Universal Analytics surfaced specific pages on your site that were slowest, allowing you to prioritize which pages to evaluate and optimize.
The report was particularly helpful if you had a large site with thousands of pages. Armed with the list of problem URLs, you could then prioritize pages for review in the PageSpeed Insights tool.
But Google didn’t include the Page Timings report in GA4, so where can you find similar information now?
Below are several free and paid tools that can help you pinpoint your problem pages and prioritize their optimization.
1. Google Search Console
- Pros: Free.
- Cons: Highly manual, no API connections.
Google Search Console (GSC) provides a Core Web Vitals report and even separates the data by mobile versus desktop.
However, while GSC provides some examples of URLs affected, it doesn’t provide a full list. Instead, it groups pages together and shows examples from the group. The data is also not easily downloadable to a CSV for monitoring.
If your goal is regular monitoring, you'll need to log in to GSC and review the data within the tool. The GSC API does not support exporting Core Web Vitals report data, so you can't pull it into Looker Studio or other visualization tools.
2. Screaming Frog
- Pros: Thoroughly crawls sites, connects to the PageSpeed Insights API (with a key you provide), scheduling available.
- Cons: Paid tool, desktop-based.
A long-time favorite of SEO professionals, Screaming Frog has many useful SEO applications, but most relevant here is that it reports page load times.
It can further be connected to the PageSpeed Insights tool using a key from the PageSpeed Insights API to import Core Web Vitals data directly into the PageSpeed report.
The only real drawback to Screaming Frog is that, as a desktop application, the computer it runs on must be on and connected to the internet when a report runs. That makes it less suitable for dashboarding and frequent, regular data monitoring.
One workaround is to have a desktop computer that is always turned on. I did this in my agency for many years with a dedicated, old desktop computer running Screaming Frog.
Because the tool allows scheduling, a report can run at the appointed time as long as the computer is on and connected to the internet. You can also connect Screaming Frog to a Google Drive account and export the report tabs to Google Sheets.
If you want to use the export for dashboarding, select the "Overwrite files in output" option so the same Google Sheet is updated on each run.
Once the data is in a Google Sheet, you can import it into other platforms, such as Looker Studio, to create dashboards and visualizations or create thresholds to send email alerts using Apps Script.
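The threshold-alert idea is simple to sketch. Here's a minimal illustration in Python of the same logic you'd implement in Apps Script, assuming hypothetical column names ("Address", "Load Time (s)") that you'd match to your own Screaming Frog export:

```python
# Sketch: flag pages from an exported crawl report that exceed a load-time
# threshold. The column names below are assumptions -- adjust them to the
# headers in your own export.

def flag_slow_pages(rows, threshold_seconds=3.0):
    """Return (url, load_time) pairs over the threshold, slowest first."""
    slow = [
        (row["Address"], row["Load Time (s)"])
        for row in rows
        if row["Load Time (s)"] > threshold_seconds
    ]
    return sorted(slow, key=lambda pair: pair[1], reverse=True)

sample = [
    {"Address": "https://example.com/", "Load Time (s)": 1.2},
    {"Address": "https://example.com/blog", "Load Time (s)": 4.8},
    {"Address": "https://example.com/shop", "Load Time (s)": 3.5},
]

for url, seconds in flag_slow_pages(sample):
    print(url, seconds)
```

In Apps Script, the list returned by a function like this would feed `MailApp.sendEmail` to deliver the alert.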
3. Ahrefs
- Pros: Thoroughly crawls sites, scheduling available, cloud-based application, connects to the PageSpeed Insights API (with a key you provide).
- Cons: Paid tool, manual data export.
Ahrefs has long been an SEO favorite for tracking backlinks, but it also has a robust site audit tool that tracks page load speed as it crawls a website.
As with Screaming Frog, you can connect PageSpeed Insights directly to the site audit to see the specific Core Web Vitals optimizations to make.
Site audits can be scheduled at regular intervals, and reports can be exported to Google Sheets, but the export is a manual process. The Ahrefs API doesn't appear to offer a way to export audit results automatically, which makes the tool less than ideal for dashboarding and near-real-time reporting.
4. Semrush
- Pros: Thoroughly crawls sites, scheduling available, cloud-based application, connects to the PageSpeed Insights API (no key needed).
- Cons: Paid tool, manual data export.
Semrush, another popular SEO tool, also has a site audit feature that reviews page load speed and lists the pages with the longest load times.
Unlike Ahrefs and Screaming Frog, you don't need to enter your own PageSpeed Insights API key to pull Core Web Vitals optimization information directly into the audit.
Here too, however, the data export for this report is manual. Semrush does have an API that reports on page load speed issues, but it's only available on Business plans and higher, which start at $499 per month.
5. Add page speed into GA4 using custom dimensions
- Pros: Free, measures actual user data for page load speed by page, scheduling not required, cloud-based application.
- Cons: Only begins tracking data once implemented (no historical data), doesn’t automatically connect with PageSpeed Insights API.
Another option to restore page load speed in Google Analytics is to create a custom dimension. You can use that custom dimension to create an Explorations report, import data into Looker Studio or export data using the GA4 API or various tools that incorporate the API.
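To make the export route concrete, here's a sketch of a GA4 Data API `runReport` request body that pulls a page-load-time custom dimension alongside page paths. The dimension name `customEvent:page_load_time` is an assumption for illustration; use whatever event parameter you registered as a custom dimension in your property:

```python
import json

# Placeholder GA4 property ID -- substitute your own.
PROPERTY_ID = "123456789"

# The custom dimension name below is hypothetical; GA4's Data API exposes
# event-scoped custom dimensions as "customEvent:<parameter_name>".
request_body = {
    "dateRanges": [{"startDate": "7daysAgo", "endDate": "yesterday"}],
    "dimensions": [
        {"name": "pagePath"},
        {"name": "customEvent:page_load_time"},
    ],
    "metrics": [{"name": "eventCount"}],
    "limit": 100,
}

# POST this body (with an OAuth bearer token, omitted here) to:
# https://analyticsdata.googleapis.com/v1beta/properties/{PROPERTY_ID}:runReport
print(json.dumps(request_body, indent=2))
```

The response rows can then be written to a Google Sheet or fed straight into Looker Studio.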
Measure School has an excellent tutorial on how to track page load speed using Google Tag Manager and custom dimensions in GA4.
Multiple free and paid tools can export your list of slow pages, via the custom dimension, to Google Sheets, including the free GA4 Reports Builder for Google Sheets extension.
Unlike its predecessor in Universal Analytics, this extension does not have scheduling capability. I personally use Supermetrics, which is a paid tool but provides me access to multiple APIs, including GA4, and allows me to schedule reports.
Connecting with the PageSpeed Insights API
Once you have your list of the site’s slowest pages, though, you’re not completely finished! Screaming Frog, Ahrefs and Semrush pull Core Web Vitals optimizations into their platforms using the PageSpeed Insights API.
If you're not using one of those tools, you'll need to either query each URL in the PageSpeed Insights tool manually, one by one, or use the PageSpeed Insights API to run those queries for you.
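For those comfortable with a little code, querying the API directly is straightforward. Here's a minimal sketch that builds a request URL for the PageSpeed Insights API (v5) and pulls one Core Web Vitals metric out of the response; the sample response is truncated to just the shape needed for the example:

```python
import urllib.parse

API_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def build_psi_url(page_url, strategy="mobile", api_key=None):
    """Build a PageSpeed Insights API request URL for one page."""
    params = {"url": page_url, "strategy": strategy}
    if api_key:
        params["key"] = api_key  # an API key raises your daily quota
    return API_ENDPOINT + "?" + urllib.parse.urlencode(params)

def extract_lcp_ms(response):
    """Pull Largest Contentful Paint (in ms) out of a PSI JSON response."""
    audit = response["lighthouseResult"]["audits"]["largest-contentful-paint"]
    return audit["numericValue"]

# Heavily truncated sample of the response structure:
sample_response = {
    "lighthouseResult": {
        "audits": {
            "largest-contentful-paint": {"numericValue": 3400.0}
        }
    }
}

print(build_psi_url("https://example.com/slow-page"))
print(extract_lcp_ms(sample_response))  # 3400.0
```

Looping `build_psi_url` over your exported list of slow pages (with a short delay between requests to respect rate limits) gives you the same data the paid crawlers import.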
If you're not a web developer or skilled with coding, there are fortunately tools you can use to tap into APIs, including the PageSpeed Insights API, to get the specific Core Web Vitals details you need for optimization.
My personal favorite is Zapier, whose webhook zap gives even non-developers a simplified way to connect a list of slow URLs to the PageSpeed Insights API and pull in whichever data points matter most:
Optimizing images is often a quick way to improve page load speed. In the zap example above, I pull in only image details for each URL on a site with over 10,000 pages. This gives me a fast way to find:
- Which pages are slowest.
- Of those pages, which ones are slow due to images that should be resized.
- A list of the images to resize, prioritized by the greatest load time saved per image.
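That last prioritization step can be sketched in a few lines. This example assumes the per-item fields ("url", "wastedBytes") from the Lighthouse "uses-responsive-images" audit returned by the PageSpeed Insights API; the sample values are fabricated for illustration:

```python
# Sketch: rank oversized images by potential savings, using per-item data
# from the "uses-responsive-images" audit of a PageSpeed Insights response.
# (The audit also reports an audit-level overallSavingsMs figure.)

def prioritize_images(audit_items):
    """Sort image audit items by wasted bytes, biggest savings first."""
    return sorted(audit_items, key=lambda item: item["wastedBytes"], reverse=True)

# Fabricated sample items for illustration:
sample_items = [
    {"url": "https://example.com/hero.png", "wastedBytes": 480_000},
    {"url": "https://example.com/logo.png", "wastedBytes": 12_000},
    {"url": "https://example.com/banner.jpg", "wastedBytes": 910_000},
]

for item in prioritize_images(sample_items):
    print(item["url"], item["wastedBytes"])
```

Aggregating these lists across your slowest pages surfaces the single images whose resizing pays off on the most URLs at once.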
The benefit of this approach is that it truly can provide near-real-time reporting and dashboarding, whereas the other solutions still have drawbacks that make them less than ideal for dashboard reports.
However you choose to measure page load speed for organic search optimization, each solution requires some setup and work. So if you haven't already started, get going now so you can mine quick SEO wins and improve your problem pages.
Opinions expressed in this article are those of the guest author and not necessarily Search Engine Land. Staff authors are listed here.