Location & Language Targeting (`gl`, `hl`, `location` Parameters)
The single most-misunderstood parameter trio in SERP scraping. Get them right, get accurate geo-localized data.
What you’ll learn
- Distinguish `gl`, `hl`, and `location` parameters.
- Pick the right combination for a given geo-targeting need.
- Understand precision tiers: country / city / lat-lng.
- Test for geo-targeting fidelity in your SERP-API.
If you scrape SERPs for a multi-region SEO operation, the wrong `gl`/`hl`/`location` combination is the single most common reason your data is silently wrong. Same query, US data returned, but you needed Germany. It looks fine in your dashboard and is useless to the German SEO team.
This lesson is the parameter primer.
The three parameters
| Param | Controls | Example values |
|---|---|---|
| `gl` | Country of search: which Google index to use. | `us`, `uk`, `de`, `jp`, `br` |
| `hl` | Interface language: what language Google's UI (and some result snippets) appear in. | `en`, `de`, `ja`, `es` |
| `location` | City/region targeting within the country; drives the precise local pack. | `New York,NY,United States`, `Munich,Germany` |
Plus, for high precision:

- `uule` (Google-internal): a base64-encoded precise location string.
- `lat`/`lng`: some SERP-APIs accept GPS coordinates directly.
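As a quick illustration of how the trio composes into a request, here is a minimal URL builder. The endpoint is a placeholder (real SERP-API providers differ in base URL and parameter names):

```python
from urllib.parse import urlencode

def build_serp_url(q, gl="us", hl="en", location=None):
    # Base URL is a placeholder; substitute your provider's endpoint.
    params = {"q": q, "gl": gl, "hl": hl}
    if location:
        params["location"] = location  # optional city-level targeting
    return "https://api.example-serp.com/search?" + urlencode(params)

url = build_serp_url("bakery", gl="de", hl="de", location="Munich,Germany")
print(url)
```

Note that `urlencode` percent-encodes the commas in the location string for you.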
Common combinations
| Use case | gl | hl | location |
|---|---|---|---|
| US English nationwide | `us` | `en` | (optional; just a city if needed) |
| UK English nationwide | `uk` | `en` | `London,England,United Kingdom` |
| Germany in German | `de` | `de` | `Berlin,Germany` |
| Germany in English | `de` | `en` | `Berlin,Germany` (e.g. for an English-speaking expat audience) |
| India in Hindi | `in` | `hi` | `Mumbai,Maharashtra,India` |
| Brazil in Portuguese | `br` | `pt-BR` | `São Paulo,SP,Brazil` |
`gl` controls which Google you hit. `hl` controls what language you read it in. They're independent.
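Because the two axes are independent, a locale matrix is just a cross product. A tiny sketch with illustrative values:

```python
from itertools import product

countries = ["us", "de"]   # gl: which index
languages = ["en", "de"]   # hl: which UI language

# Every (gl, hl) pair is a valid, distinct request target.
combos = list(product(countries, languages))
print(combos)  # four combinations, not two
```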
Catalog108 example
`practice.scrapingcentral.com/search?gl=in&hl=en` returns a mock SERP shaped for India in English. The query and result set may differ between, say, `gl=us&hl=en` and `gl=in&hl=en`. Practice the parameter handling here before pointing at a real SERP-API.
SERP-API usage
```python
import requests

def search(q, gl="us", hl="en", location=None):
    params = {
        "q": q,
        "engine": "google",
        "gl": gl,
        "hl": hl,
        "api_key": "...",
    }
    if location:
        params["location"] = location
    return requests.get("https://api.example-serp.com/search", params=params).json()

# Same query, three audiences
us = search("hosting providers", gl="us", hl="en", location="Austin,Texas,United States")
uk = search("hosting providers", gl="uk", hl="en", location="London,England,United Kingdom")
in_ = search("hosting providers", gl="in", hl="en", location="Mumbai,Maharashtra,India")

for label, data in [("US", us), ("UK", uk), ("IN", in_)]:
    titles = [r["title"] for r in data.get("organic_results", [])[:5]]
    print(f"{label}: {titles}")
```
You'll see substantially different top-5 lists: local hosting brands win in their home country, while multinationals show up in more than one market.
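One way to quantify that difference is set overlap between the returned domains. A sketch, assuming each organic result carries a `link` field as in the snippet above (the result lists here are mocks standing in for real API responses):

```python
from urllib.parse import urlparse

def domains(results):
    """Extract hostnames from a list of organic results."""
    return {urlparse(r["link"]).netloc for r in results if r.get("link")}

# Mock top results standing in for real API responses.
us_results = [{"link": "https://hostgator.com/a"}, {"link": "https://aws.amazon.com/b"}]
in_results = [{"link": "https://hostinger.in/a"}, {"link": "https://aws.amazon.com/c"}]

overlap = domains(us_results) & domains(in_results)
print(overlap)  # the multinational appears in both sets
```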
Precision tiers
Three tiers, increasing in precision and cost (some providers charge more for finer location targeting):
- Country only (`gl=us`): cheapest, broadest. Returns results as if a generic US user searched.
- City (`location=Austin,Texas,United States`): mid-precision. The local pack reflects the city.
- Lat-lng (`lat=30.27&lng=-97.74`): highest precision. Most accurate local pack.
For most SEO use cases, city-level is the right balance. Lat-lng matters for hyper-local (a specific neighborhood, a specific intersection).
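A small dispatcher can make the tier choice explicit in your client code. This is a sketch; the `lat`/`lng` parameter names in particular are assumptions, since provider support and naming vary:

```python
def geo_params(tier, gl="us", location=None, lat=None, lng=None):
    """Return request params for the chosen precision tier.
    lat/lng parameter names are assumptions; check your provider."""
    params = {"gl": gl}
    if tier == "country":
        return params
    if tier == "city":
        params["location"] = location
        return params
    if tier == "latlng":
        params["lat"], params["lng"] = lat, lng
        return params
    raise ValueError(f"unknown tier: {tier}")

print(geo_params("city", gl="us", location="Austin,Texas,United States"))
```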
The `uule` parameter
Internally, Google uses a base64-encoded location string called `uule`. Some SERP-APIs accept it directly:
```python
import base64

def uule(location_string: str) -> str:
    # Google's format: "w+CAIQICI<base64 of length-prefixed location>"
    raw = location_string.encode("utf-8")
    payload = bytes([len(raw)]) + raw
    encoded = base64.urlsafe_b64encode(payload).decode().rstrip("=")
    return f"w+CAIQICI{encoded}"

print(uule("Austin, Texas, United States"))
```
Most users don't need to construct `uule` manually; pass `location` and let the SERP-API handle the encoding.
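If you do build `uule` strings yourself, a round-trip decoder is a cheap sanity check on the encoding scheme above (this verifies the base64 round-trip only, not that Google accepts the string). The encoder is repeated here so the snippet is self-contained:

```python
import base64

def uule(location_string: str) -> str:
    # Same scheme as above: length byte + UTF-8 string, base64url, padding stripped.
    raw = location_string.encode("utf-8")
    payload = bytes([len(raw)]) + raw
    return "w+CAIQICI" + base64.urlsafe_b64encode(payload).decode().rstrip("=")

def uule_decode(u: str) -> str:
    """Invert the length-prefixed base64 scheme used by uule()."""
    b64 = u.removeprefix("w+CAIQICI")
    b64 += "=" * (-len(b64) % 4)  # restore stripped padding
    payload = base64.urlsafe_b64decode(b64)
    assert payload[0] == len(payload) - 1, "length prefix mismatch"
    return payload[1:].decode("utf-8")

loc = "Austin, Texas, United States"
print(uule_decode(uule(loc)) == loc)
```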
Testing for geo-fidelity
Before trusting a SERP-API's geo-targeting:
- Run a query with strong local intent (`weather`, `pizza near me`) in two distinct cities.
- Compare local pack results; they should be substantially different.
- If they're identical, the geo-targeting isn't actually working.
```python
nyc = search("pizza near me", location="New York,New York,United States")
la = search("pizza near me", location="Los Angeles,California,United States")

nyc_places = [p["title"] for p in nyc["local_results"]["places"]]
la_places = [p["title"] for p in la["local_results"]["places"]]

print("NYC:", nyc_places)
print("LA:", la_places)
assert set(nyc_places).isdisjoint(la_places), "Geo-targeting suspicious"
```
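A strict disjointness assertion can be too harsh: national chains legitimately appear in both cities. A softer check is an overlap ratio with a threshold (the 0.5 cutoff here is an arbitrary starting point; tune it for your queries):

```python
def overlap_ratio(a, b):
    """Fraction of shared items relative to the smaller list."""
    sa, sb = set(a), set(b)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / min(len(sa), len(sb))

# Illustrative place names; a national chain overlaps, local spots don't.
nyc_places = ["Joe's Pizza", "Prince Street Pizza", "Domino's"]
la_places = ["Pizzeria Mozza", "Domino's", "Jon & Vinny's"]

ratio = overlap_ratio(nyc_places, la_places)
print(ratio)
assert ratio < 0.5, "Geo-targeting suspicious: too much overlap"
```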
Common pitfalls
- Forgetting `hl`: returns the right country but the wrong-language UI. For German rank tracking, you want `hl=de`, not `hl=en`.
- Using only `gl`: country-level is too coarse for local rank tracking. Add `location`.
- Misspelling location: `"new york,ny,usa"` may fail; `"New York,New York,United States"` works. Use the exact format the provider expects (often comma-separated, full names).
- Assuming free providers support fine location: some only support country. Read the docs.
- Caching across locations: cache by `(q, gl, hl, location)`, not just `q`. Otherwise the second query reuses the first location's results.
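That last pitfall is worth a sketch. An in-memory cache keyed by the full tuple (swap the dict for Redis or similar in production; `fetch` stands in for the real SERP-API call):

```python
_cache = {}

def cached_search(q, gl="us", hl="en", location=None, fetch=None):
    """Cache by the full (q, gl, hl, location) tuple, never by q alone."""
    key = (q, gl, hl, location)
    if key not in _cache:
        _cache[key] = fetch(q, gl, hl, location)
    return _cache[key]

calls = []
fake = lambda *a: calls.append(a) or {"args": a}  # records each real fetch

cached_search("pizza", location="New York,New York,United States", fetch=fake)
cached_search("pizza", location="Los Angeles,California,United States", fetch=fake)
print(len(calls))  # 2: different locations are cache misses, as intended
```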
A multi-locale tracking workflow
```python
LOCALES = [
    ("us", "en", "Austin,Texas,United States"),
    ("uk", "en", "Manchester,England,United Kingdom"),
    ("de", "de", "Berlin,Germany"),
    ("in", "en", "Mumbai,Maharashtra,India"),
    ("br", "pt-BR", "São Paulo,SP,Brazil"),
]

def track_rank(q, my_domain):
    out = {}
    for gl, hl, location in LOCALES:
        data = search(q, gl, hl, location)
        org = data.get("organic_results", [])
        rank = next(
            (i + 1 for i, r in enumerate(org) if my_domain in r.get("link", "")),
            None,
        )
        out[f"{gl}-{hl}-{location}"] = rank
    return out
```
Run it nightly, persist the results, and plot trends by locale: that's a modern international SEO dashboard.
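The persistence step can be as simple as one JSON line per run. The file path and schema here are illustrative:

```python
import json
import datetime

def persist_ranks(ranks: dict, path="rank_history.jsonl"):
    """Append one timestamped snapshot per run to a JSON Lines file."""
    row = {"date": datetime.date.today().isoformat(), "ranks": ranks}
    with open(path, "a", encoding="utf-8") as f:
        f.write(json.dumps(row, ensure_ascii=False) + "\n")
    return row

row = persist_ranks({"us-en-Austin,Texas,United States": 3})
print(row["date"], row["ranks"])
```

JSON Lines keeps appends cheap and makes it trivial to load the history into pandas later for the trend plots.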
Hands-on lab
If you have a SERP-API trial: query a global brand name (e.g. "Adidas") with `gl=us,hl=en`, `gl=de,hl=de`, and `gl=jp,hl=ja`. Note which subdomain of the brand ranks first in each. On Catalog108: hit `/search?gl=in&hl=en` and `/search?gl=us&hl=en` and inspect the returned data, noting how the platform responds to different geo combinations. Practice writing a single parser that handles all locales generically.