SEO
Bulk Loading Performance Tests With PageSpeed Insights API & Python
Google offers the PageSpeed Insights API to help SEO pros and developers by mixing real-world data with simulation data, providing load performance timing data for web pages.
The difference between the Google PageSpeed Insights (PSI) and Lighthouse is that PSI involves both real-world and lab data, while Lighthouse performs a page loading simulation by modifying the connection and user-agent of the device.
Another point of difference is that PSI doesn’t supply any information related to web accessibility, SEO, or progressive web apps (PWAs), while Lighthouse provides all of the above.
Thus, when we use PageSpeed Insights API for the bulk URL loading performance test, we won’t have any data for accessibility.
However, PSI provides more information related to the page speed performance, such as “DOM Size,” “Deepest DOM Child Element,” “Total Task Count,” and “DOM Content Loaded” timing.
One more advantage of the PageSpeed Insights API is that it gives the "observed metrics" and "actual metrics" different names, so they are easy to tell apart.
In this guide, you will learn:
- How to create a production-level Python Script.
- How to use APIs with Python.
- How to construct data frames from API responses.
- How to analyze the API responses.
- How to parse URLs and process URL requests’ responses.
- How to store the API responses with proper structure.
An example output of the Page Speed Insights API call with Python is below.
Libraries For Using PageSpeed Insights API With Python
The necessary libraries to use PSI API with Python are below.
- Advertools retrieves the testing URLs from the website's sitemap.
- Pandas constructs the data frame and flattens the JSON output of the API.
- Requests makes the request to the specific API endpoint.
- JSON parses the API response into a Python dictionary.
- Datetime adds the current date to the output file's name.
- urllib parses the test subject website's URL.
How To Use PSI API With Python?
To use the PSI API with Python, follow the steps below.
- Get a PageSpeed Insights API key.
- Import the necessary libraries.
- Parse the URL for the test subject website.
- Take the Date of Moment for file name.
- Take URLs into a list from a sitemap.
- Choose the metrics that you want from PSI API.
- Create a For Loop for taking the API Response for all URLs.
- Construct the data frame with chosen PSI API metrics.
- Output the results in the form of XLSX.
1. Get PageSpeed Insights API Key
Use the PageSpeed Insights API Documentation to get the API Key.
Click the "Get a Key" button on the documentation page.
Choose a project that you have created in Google Developer Console.
Enable the PageSpeed Insights API on that specific project.
You will need to use the specific API Key in your API Requests.
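The key is passed as the `key` query parameter of the v5 `runPagespeed` endpoint, the same endpoint used in the loop later in this article. A minimal sketch of composing the request URL, where `YOUR_API_KEY` and the test URL are placeholders you must replace with your own values:

```python
# Sketch: composing the PSI v5 endpoint URL.
# "YOUR_API_KEY" and "test_url" are placeholder values.
api_key = "YOUR_API_KEY"
test_url = "https://www.example.com/"
endpoint = (
    "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"
    f"?url={test_url}&strategy=mobile&locale=en&key={api_key}"
)
# r = requests.get(endpoint)  # then r.json() holds the PSI response
print(endpoint)
```

Without a valid key, the API will reject the request, so keep the key out of any code you publish.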
2. Import The Necessary Libraries
Use the lines below to import the fundamental libraries.
import advertools as adv
import pandas as pd
import requests
import json
from datetime import datetime
from urllib.parse import urlparse
3. Parse The URL For The Test Subject Website
To parse the URL of the subject website, use the code structure below.
domain = urlparse(sitemap_url)
domain = domain.netloc.split(".")[1]
The “domain” variable is the parsed version of the sitemap URL.
The "netloc" attribute represents the specific URL's domain section. When we split it by ".", we take the "middle section," which is the domain name.
Here, index "0" is for "www," "1" for the domain name, and "2" for the domain extension.
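For example, assuming a hypothetical sitemap URL, the split behaves as follows:

```python
from urllib.parse import urlparse

# Assumed example sitemap URL for illustration only.
sitemap_url = "https://www.example.com/sitemap.xml"
parsed = urlparse(sitemap_url)
print(parsed.netloc)                # www.example.com
print(parsed.netloc.split(".")[1])  # example  (index 0 = "www", 2 = "com")
```

Note that this indexing assumes a "www." prefix; a sitemap hosted on a bare domain or a subdomain would shift the positions.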
4. Take The Date Of Moment For File Name
To take the date of the specific function call moment, use the "datetime.now" method.
datetime.now provides the specific time of the moment. Use "strftime" with the "%Y", "%m", and "%d" values: "%Y" is for the year, while "%m" and "%d" are the numeric values for the month and the day.
date = datetime.now().strftime("%Y_%m_%d")
5. Take URLs Into A List From A Sitemap
To take the URLs into a list form from a sitemap file, use the code block below.
sitemap = adv.sitemap_to_df(sitemap_url)
sitemap_urls = sitemap["loc"].to_list()
If you read the Python Sitemap Health Audit article, you can learn more about sitemaps.
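As an optional extra step that is not in the original script, you can normalize trailing slashes and deduplicate the URL list before sending anything to the API, which saves quota when a sitemap lists the same page twice:

```python
# Optional pre-processing step (not part of the original script):
# normalize trailing slashes, then drop duplicates while preserving order.
sitemap_urls = [
    "https://www.example.com/page-a",
    "https://www.example.com/page-a/",
    "https://www.example.com/page-b/",
]
normalized = [u if u.endswith("/") else u + "/" for u in sitemap_urls]
deduped = list(dict.fromkeys(normalized))  # dict keys preserve insertion order
print(deduped)  # ['https://www.example.com/page-a/', 'https://www.example.com/page-b/']
```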
6. Choose The Metrics That You Want From PSI API
To choose the PSI API response JSON properties, you should see the JSON file itself.
It is highly relevant to the reading, parsing, and flattening of JSON objects.
It is even related to Semantic SEO, thanks to the concept of “directed graph,” and “JSON-LD” structured data.
In this article, we won’t focus on examining the specific PSI API Response’s JSON hierarchies.
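You can still explore the hierarchy yourself. The toy fragment below mimics the paths the script reads later; it is a hand-written, much-abbreviated imitation of a PSI response, not real output, and an actual response contains many more audits and fields:

```python
import json

# Hypothetical, heavily abbreviated fragment mimicking the PSI response shape.
data_ = json.loads("""
{
  "lighthouseResult": {
    "categories": {"performance": {"score": 0.93}},
    "audits": {"dom-size": {"details": {"items": [{"value": 450}]}}}
  }
}
""")
print(list(data_["lighthouseResult"]["audits"].keys()))  # ['dom-size']
# Lighthouse scores are 0-1; multiply by 100 for the familiar 0-100 scale.
print(data_["lighthouseResult"]["categories"]["performance"]["score"] * 100)
```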
You can see the metrics that I have chosen to gather from the PSI API. It is richer than the basic default output of the PSI API, which only gives the Core Web Vitals metrics, or Speed Index, Interaction to Next Paint, Time to First Byte, and First Contentful Paint.
Of course, it also gives “suggestions” by saying “Avoid Chaining Critical Requests,” but there is no need to put a sentence into a data frame.
In the future, these suggestions, or even every individual chain event with its KB and ms values, could be taken into a single column named "psi_suggestions."
For a start, you can check the metrics that I have chosen, and a significant number of them will be new to you.
PSI API Metrics, the first section is below.
fid = []
lcp = []
cls_ = []
url = []
fcp = []
performance_score = []
total_tasks = []
total_tasks_time = []
long_tasks = []
dom_size = []
maximum_dom_depth = []
maximum_child_element = []
observed_fcp = []
observed_fid = []
observed_lcp = []
observed_cls = []
observed_fp = []
observed_fmp = []
observed_dom_content_loaded = []
observed_speed_index = []
observed_total_blocking_time = []
observed_first_visual_change = []
observed_last_visual_change = []
observed_tti = []
observed_max_potential_fid = []
This section includes all the observed and simulated fundamental page speed metrics, along with some non-fundamental ones, like “DOM Content Loaded,” or “First Meaningful Paint.”
The second section of PSI Metrics focuses on possible byte and time savings from the unused code amount.
render_blocking_resources_ms_save = []
unused_javascript_ms_save = []
unused_javascript_byte_save = []
unused_css_rules_ms_save = []
unused_css_rules_bytes_save = []
A third section of the PSI metrics focuses on server response time and the benefits of responsive image usage, or the harms of not using it.
possible_server_response_time_saving = []
possible_responsive_image_ms_save = []
Note: Overall Performance Score comes from “performance_score.”
7. Create A For Loop For Taking The API Response For All URLs
The for loop is to take all of the URLs from the sitemap file and use the PSI API for all of them one by one. The for loop for PSI API automation has several sections.
The first section of the PSI API for loop starts with duplicate URL prevention.
In sitemaps, the same URL can appear multiple times, with and without a trailing slash. This section prevents those duplicates from overriding each other's information.
for i in sitemap_urls[:9]:
    # Normalize the trailing slash so duplicate URL variants do not override each other's information.
    if i.endswith("/"):
        r = requests.get(f"https://www.googleapis.com/pagespeedonline/v5/runPagespeed?url={i}&strategy=mobile&locale=en&key={api_key}")
    else:
        r = requests.get(f"https://www.googleapis.com/pagespeedonline/v5/runPagespeed?url={i}/&strategy=mobile&locale=en&key={api_key}")
Remember to check the “api_key” at the end of the endpoint for PageSpeed Insights API.
Check the status code. In the sitemaps, there might be non-200 status code URLs; these should be cleaned.
    if r.status_code == 200:
        # print(r.json())
        data_ = json.loads(r.text)
        url.append(i)
The next section appends the specific metrics to the lists that we created before, reading them from the "data_" dictionary.
        fcp.append(data_["loadingExperience"]["metrics"]["FIRST_CONTENTFUL_PAINT_MS"]["percentile"])
        fid.append(data_["loadingExperience"]["metrics"]["FIRST_INPUT_DELAY_MS"]["percentile"])
        lcp.append(data_["loadingExperience"]["metrics"]["LARGEST_CONTENTFUL_PAINT_MS"]["percentile"])
        cls_.append(data_["loadingExperience"]["metrics"]["CUMULATIVE_LAYOUT_SHIFT_SCORE"]["percentile"])
        performance_score.append(data_["lighthouseResult"]["categories"]["performance"]["score"] * 100)
The next section focuses on the "total task" count and DOM size.
        total_tasks.append(data_["lighthouseResult"]["audits"]["diagnostics"]["details"]["items"][0]["numTasks"])
        total_tasks_time.append(data_["lighthouseResult"]["audits"]["diagnostics"]["details"]["items"][0]["totalTaskTime"])
        long_tasks.append(data_["lighthouseResult"]["audits"]["diagnostics"]["details"]["items"][0]["numTasksOver50ms"])
        dom_size.append(data_["lighthouseResult"]["audits"]["dom-size"]["details"]["items"][0]["value"])
The next section takes the “DOM Depth” and “Deepest DOM Element.”
        maximum_dom_depth.append(data_["lighthouseResult"]["audits"]["dom-size"]["details"]["items"][1]["value"])
        maximum_child_element.append(data_["lighthouseResult"]["audits"]["dom-size"]["details"]["items"][2]["value"])
The next section takes the specific observed test results from our PageSpeed Insights API call.
        observed_dom_content_loaded.append(data_["lighthouseResult"]["audits"]["metrics"]["details"]["items"][0]["observedDomContentLoaded"])
        # The metrics audit exposes no dedicated observed FID key; observedDomContentLoaded is reused here.
        observed_fid.append(data_["lighthouseResult"]["audits"]["metrics"]["details"]["items"][0]["observedDomContentLoaded"])
        observed_lcp.append(data_["lighthouseResult"]["audits"]["metrics"]["details"]["items"][0]["largestContentfulPaint"])
        observed_fcp.append(data_["lighthouseResult"]["audits"]["metrics"]["details"]["items"][0]["firstContentfulPaint"])
        observed_cls.append(data_["lighthouseResult"]["audits"]["metrics"]["details"]["items"][0]["totalCumulativeLayoutShift"])
        observed_speed_index.append(data_["lighthouseResult"]["audits"]["metrics"]["details"]["items"][0]["observedSpeedIndex"])
        observed_total_blocking_time.append(data_["lighthouseResult"]["audits"]["metrics"]["details"]["items"][0]["totalBlockingTime"])
        observed_fp.append(data_["lighthouseResult"]["audits"]["metrics"]["details"]["items"][0]["observedFirstPaint"])
        observed_fmp.append(data_["lighthouseResult"]["audits"]["metrics"]["details"]["items"][0]["firstMeaningfulPaint"])
        observed_first_visual_change.append(data_["lighthouseResult"]["audits"]["metrics"]["details"]["items"][0]["observedFirstVisualChange"])
        observed_last_visual_change.append(data_["lighthouseResult"]["audits"]["metrics"]["details"]["items"][0]["observedLastVisualChange"])
        observed_tti.append(data_["lighthouseResult"]["audits"]["metrics"]["details"]["items"][0]["interactive"])
        observed_max_potential_fid.append(data_["lighthouseResult"]["audits"]["metrics"]["details"]["items"][0]["maxPotentialFID"])
The next section takes the unused code amount and the possible savings in bytes and milliseconds, along with the render-blocking resources.
        render_blocking_resources_ms_save.append(data_["lighthouseResult"]["audits"]["render-blocking-resources"]["details"]["overallSavingsMs"])
        unused_javascript_ms_save.append(data_["lighthouseResult"]["audits"]["unused-javascript"]["details"]["overallSavingsMs"])
        unused_javascript_byte_save.append(data_["lighthouseResult"]["audits"]["unused-javascript"]["details"]["overallSavingsBytes"])
        unused_css_rules_ms_save.append(data_["lighthouseResult"]["audits"]["unused-css-rules"]["details"]["overallSavingsMs"])
        unused_css_rules_bytes_save.append(data_["lighthouseResult"]["audits"]["unused-css-rules"]["details"]["overallSavingsBytes"])
The next section is to provide responsive image benefits and server response timing.
        possible_server_response_time_saving.append(data_["lighthouseResult"]["audits"]["server-response-time"]["details"]["overallSavingsMs"])
        possible_responsive_image_ms_save.append(data_["lighthouseResult"]["audits"]["uses-responsive-images"]["details"]["overallSavingsMs"])
The last section of the loop skips URLs that return a non-200 status code so that the function keeps working.
    else:
        continue
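Beyond skipping non-200 responses, individual audits can be missing from a PSI response, which would raise a KeyError in the append sections above. A hedged sketch of a defensive lookup helper, which is not part of the original script, could look like this:

```python
# Hypothetical helper (not in the original script): walks a nested dictionary
# and returns None instead of raising KeyError/IndexError when an audit is
# missing from the response.
def safe_get(data, *path):
    for key in path:
        try:
            data = data[key]
        except (KeyError, IndexError, TypeError):
            return None
    return data

# Toy fragment standing in for a real PSI response.
data_ = {"lighthouseResult": {"audits": {"dom-size": {"details": {"items": [{"value": 450}]}}}}}
print(safe_get(data_, "lighthouseResult", "audits", "dom-size", "details", "items", 0, "value"))  # 450
print(safe_get(data_, "lighthouseResult", "audits", "unused-javascript", "details"))  # None
```

Each `data_[...]` chain in the loop could then be replaced with a `safe_get` call, at the cost of None values appearing in the output columns.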
Example Usage Of Page Speed Insights API With Python For Bulk Testing
To use the specific code blocks, put them into a Python function.
Run the script, and you will get 29 page speed-related metrics in the columns below.
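The data frame construction and XLSX output (steps 8 and 9) can be sketched as follows. This is a minimal illustration using only a few of the lists defined earlier, with placeholder values standing in for real API responses; the full script passes all 29 lists as columns:

```python
import pandas as pd

# Placeholder values standing in for real PSI API responses.
url = ["https://www.example.com/"]
lcp = [2400]
cls_ = [5]
performance_score = [93.0]

df = pd.DataFrame(
    {"url": url, "lcp": lcp, "cls": cls_, "performance_score": performance_score}
)
# Writing XLSX requires openpyxl; the filename would reuse the "domain"
# and "date" variables from the earlier steps.
# df.to_excel("example_2024_04_26_psi_results.xlsx", index=False)
print(df.shape)  # (1, 4)
```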
Conclusion
PageSpeed Insights API provides different types of page loading performance metrics.
It demonstrates how Google engineers perceive the concept of page loading performance, and they possibly use these metrics from a ranking, UX, and quality-understanding point of view.
Using Python for bulk page speed tests gives you a snapshot of the entire website to help analyze the possible user experience, crawl efficiency, conversion rate, and ranking improvements.
Featured Image: Dundanim/Shutterstock
Google Declares It The “Gemini Era” As Revenue Grows 15%
Alphabet Inc., Google’s parent company, announced its first quarter 2024 financial results today.
While Google reported double-digit growth in key revenue areas, the focus was on its AI developments, dubbed the “Gemini era” by CEO Sundar Pichai.
The Numbers: 15% Revenue Growth, Operating Margins Expand
Alphabet reported Q1 revenues of $80.5 billion, a 15% increase year-over-year, exceeding Wall Street’s projections.
Net income was $23.7 billion, with diluted earnings per share of $1.89. Operating margins expanded to 32%, up from 25% in the prior year.
Ruth Porat, Alphabet’s President and CFO, stated:
“Our strong financial results reflect revenue strength across the company and ongoing efforts to durably reengineer our cost base.”
Google’s core advertising units, such as Search and YouTube, drove growth. Google advertising revenues hit $61.7 billion for the quarter.
The Cloud division also maintained momentum, with revenues of $9.6 billion, up 28% year-over-year.
Pichai highlighted that YouTube and Cloud are expected to exit 2024 at a combined $100 billion annual revenue run rate.
Generative AI Integration in Search
Google experimented with AI-powered features in Search Labs before recently introducing AI overviews into the main search results page.
Regarding the gradual rollout, Pichai states:
“We are being measured in how we do this, focusing on areas where gen AI can improve the Search experience, while also prioritizing traffic to websites and merchants.”
Pichai reports that Google’s generative AI features have answered over a billion queries already:
“We’ve already served billions of queries with our generative AI features. It’s enabling people to access new information, to ask questions in new ways, and to ask more complex questions.”
Google reports increased Search usage and user satisfaction among those interacting with the new AI overview results.
The company also highlighted its “Circle to Search” feature on Android, which allows users to circle objects on their screen or in videos to get instant AI-powered answers via Google Lens.
Reorganizing For The “Gemini Era”
As part of the AI roadmap, Alphabet is consolidating all teams building AI models under the Google DeepMind umbrella.
Pichai revealed that, through hardware and software improvements, the company has reduced machine costs associated with its generative AI search results by 80% over the past year.
He states:
"Our data centers are some of the most high-performing, secure, reliable and efficient in the world. We've developed new AI models and algorithms that are more than one hundred times more efficient than they were 18 months ago."
How Will Google Make Money With AI?
Alphabet sees opportunities to monetize AI through its advertising products, Cloud offerings, and subscription services.
Google is integrating Gemini into ad products like Performance Max. The company’s Cloud division is bringing “the best of Google AI” to enterprise customers worldwide.
Google One, the company’s subscription service, surpassed 100 million paid subscribers in Q1 and introduced a new premium plan featuring advanced generative AI capabilities powered by Gemini models.
Future Outlook
Pichai outlined six key advantages positioning Alphabet to lead the “next wave of AI innovation”:
- Research leadership in AI breakthroughs like the multimodal Gemini model
- Robust AI infrastructure and custom TPU chips
- Integrating generative AI into Search to enhance the user experience
- A global product footprint reaching billions
- Streamlined teams and improved execution velocity
- Multiple revenue streams to monetize AI through advertising and cloud
With upcoming events like Google I/O and Google Marketing Live, the company is expected to share further updates on its AI initiatives and product roadmap.
Featured Image: Sergei Elagin/Shutterstock
brightonSEO Live Blog
Hello everyone. It's April again, so I'm back in Brighton for another two days of brightonSEO. Being the introvert I am, my idea of fun isn't hanging around our booth all day explaining we've run out of t-shirts (seriously, you need to be fast if you want swag!). So I decided to do something useful and live-blog the event instead.
Follow below for talk takeaways and (very) mildly humorous commentary. Sun, sea, and SEO!
Google Further Postpones Third-Party Cookie Deprecation In Chrome
Google has again delayed its plan to phase out third-party cookies in the Chrome web browser. The latest postponement comes after ongoing challenges in reconciling feedback from industry stakeholders and regulators.
The announcement was made in Google and the UK’s Competition and Markets Authority (CMA) joint quarterly report on the Privacy Sandbox initiative, scheduled for release on April 26.
Chrome’s Third-Party Cookie Phaseout Pushed To 2025
Google states it “will not complete third-party cookie deprecation during the second half of Q4” this year as planned.
Instead, the tech giant aims to begin deprecating third-party cookies in Chrome “starting early next year,” assuming an agreement can be reached with the CMA and the UK’s Information Commissioner’s Office (ICO).
The statement reads:
“We recognize that there are ongoing challenges related to reconciling divergent feedback from the industry, regulators and developers, and will continue to engage closely with the entire ecosystem. It’s also critical that the CMA has sufficient time to review all evidence, including results from industry tests, which the CMA has asked market participants to provide by the end of June.”
Continued Engagement With Regulators
Google reiterated its commitment to “engaging closely with the CMA and ICO” throughout the process and hopes to conclude discussions this year.
This marks the third delay to Google’s plan to deprecate third-party cookies, initially aiming for a Q3 2023 phaseout before pushing it back to late 2024.
The postponements reflect the challenges in transitioning away from cross-site user tracking while balancing privacy and advertiser interests.
Transition Period & Impact
In January, Chrome began restricting third-party cookie access for 1% of users globally. This percentage was expected to gradually increase until 100% of users were covered by Q3 2024.
However, the latest delay gives websites and services more time to migrate away from third-party cookie dependencies through Google’s limited “deprecation trials” program.
The trials offer temporary cookie access extensions until December 27, 2024, for non-advertising use cases that can demonstrate direct user impact and functional breakage.
While easing the transition, the trials have strict eligibility rules. Advertising-related services are ineligible, and origins matching known ad-related domains are rejected.
Google states the program aims to address functional issues rather than relieve general data collection inconveniences.
Publisher & Advertiser Implications
The repeated delays highlight the potential disruption for digital publishers and advertisers relying on third-party cookie tracking.
Industry groups have raised concerns that restricting cross-site tracking could push websites toward more opaque privacy-invasive practices.
However, privacy advocates view the phaseout as crucial in preventing covert user profiling across the web.
With the latest postponement, all parties have more time to prepare for the eventual loss of third-party cookies and adopt Google’s proposed Privacy Sandbox APIs as replacements.
Featured Image: Novikov Aleksey/Shutterstock