Lesson 3.25 · Intermediate · 4 min read

Local Pack, Map Results, and the Local SEO World

Three businesses with stars and a map. Behind it: a separate Google Maps API, GMB profiles, and a whole industry.

What you’ll learn

  • Identify the Local Pack and Map results on a SERP.
  • Parse local-pack entries (name, address, phone, rating, hours).
  • Understand how the Map embed differs from the local pack.
  • Use Catalog108's /locations XHR to practice local-pack-style scraping.

For any query with local intent ("pizza near me," "dentist Chicago," "stores near 90210"), Google injects the Local Pack: three nearby businesses with a map. Behind it sit Google Maps, Google Business Profile (formerly Google My Business / GMB), and a parallel ranking system distinct from organic SEO.

For a scraper, this is one of the most valuable SERP blocks: it carries phone numbers, addresses, hours, and ratings, all directly actionable data.

What the Local Pack contains

Visually: three business cards above the regular organic results, each with name, rating, distance, brief metadata, and a map showing their locations.

Sample SERP-API shape:

{
  "local_results": {
    "places": [
      {
        "position": 1,
        "title": "Mario's Pizza",
        "place_id": "ChIJ...",
        "rating": 4.6,
        "reviews": 1234,
        "price": "$$",
        "type": "Pizza restaurant",
        "address": "123 Main St, Chicago, IL 60601",
        "phone": "(312) 555-0100",
        "website": "https://...",
        "hours": "Open ⋅ Closes 11 PM",
        "service_options": {"dine_in": true, "takeout": true, "delivery": false},
        "gps_coordinates": {"latitude": 41.881832, "longitude": -87.623177},
        "thumbnail": "..."
      },
      ...
    ],
    "map_url": "https://www.google.com/maps/search/..."
  }
}

Key fields:

  • place_id: Google's permanent ID for the business. Stable across queries.
  • gps_coordinates: useful for plotting and geo-analysis.
  • service_options: dine-in, takeout, delivery, curbside.
  • hours: current-day text status only. For full hour schedules, hit the Place Details API or scrape the Place page.
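Since every place carries gps_coordinates, simple geo-analysis needs no extra API calls. A minimal sketch using the standard haversine formula to measure how far a result sits from a search origin (the coordinates below reuse the sample payload; the origin point is made up):

```python
import math

def haversine_miles(lat1, lng1, lat2, lng2):
    """Great-circle distance between two (lat, lng) points, in miles."""
    r = 3958.8  # mean Earth radius in miles
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lng2 - lng1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

place = {"gps_coordinates": {"latitude": 41.881832, "longitude": -87.623177}}
origin = (41.8781, -87.6298)  # an arbitrary downtown-Chicago reference point

coords = place["gps_coordinates"]
dist = haversine_miles(origin[0], origin[1], coords["latitude"], coords["longitude"])
print(f"{dist:.2f} miles from origin")
```

The same function works for filtering a batch of results to "within N miles" without any paid distance-matrix call.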

Local Pack vs Map Results

Two distinct things:

  • Local Pack = the 3 businesses on the main SERP. Cheaper to scrape; surfaced for every local query.
  • Map Results = the full Google Maps listing when you click "More places" or visit google.com/maps?q=.... Returns 20–60 places, sorted by relevance + distance + rating.

Many SERP-APIs support both. Use Local Pack for SEO/SERP analysis ("does my business rank for X near Y?"); use Map Results for full-market lead generation ("all pizza places within 5 miles of 60601").
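In practice the two modes usually differ by a single engine parameter in the SERP-API request. A hedged sketch (the parameter names and values are illustrative, not any specific provider's API):

```python
def build_params(query: str, location: str, full_map: bool = False) -> dict:
    """Local Pack: the regular SERP engine. Map Results: the maps engine.

    Hypothetical parameter scheme; check your SERP-API's docs for the
    real engine names and location formats.
    """
    params = {"q": query, "location": location}
    params["engine"] = "google_maps" if full_map else "google"
    return params

print(build_params("pizza", "60601"))                  # local pack on the SERP
print(build_params("pizza", "60601", full_map=True))   # full map listing
```

The payoff: one code path can serve both the SEO-tracking use case and the lead-generation use case.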

Catalog108's local-pack mock

practice.scrapingcentral.com/search?q=stores+near+me returns a SERP with a local-pack-style block. The data comes from /api/locations:

import requests

r = requests.get(
    "https://practice.scrapingcentral.com/api/locations",
    params={"q": "stores near me"},
)
locations = r.json()

for loc in locations.get("places", []):
    print(f"{loc['name']} ({loc['rating']}★), {loc['address']}")

Catalog108 ships its locations as JSON for easy practice. Real Google requires a SERP-API.
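Once parsed, the lab data drops straight into a CSV with the standard library. A small sketch (the sample rows below are invented; field names follow the lab's shape):

```python
import csv
import io

places = [
    {"name": "Mario's Pizza", "rating": 4.6, "address": "123 Main St"},
    {"name": "Luigi's Subs", "rating": 4.2, "address": "456 Oak Ave"},
]

# Write to an in-memory buffer; swap in open("places.csv", "w", newline="")
# for a real file.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["name", "rating", "address"])
writer.writeheader()
writer.writerows(places)
print(buf.getvalue())
```

`csv.DictWriter` is a good fit here because SERP-API responses are already lists of flat dicts.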

The Google Maps XHR (for the local pack itself)

When you visit Google Maps in a browser, the page makes XHR calls to internal endpoints that return extremely complex protobuf-encoded responses. Scraping it directly is hard:

  • Endpoints like https://www.google.com/maps/preview/place returning protobuf.
  • Token systems and per-session cookies.
  • Heavy anti-bot.

SERP-APIs do this work for you. Lesson 3.30 covers why direct Maps scraping is impractical.

A local-lead-generation use case

Suppose you want all dentists within 10 miles of every ZIP code in a state:

import requests

def fetch_dentists_in_zip(zipcode, api_key):
    r = requests.get("https://serp-api.example.com/search", params={
        "q": "dentist",
        "engine": "google_maps",
        "location": zipcode,
        "type": "search",
        "api_key": api_key,
    })
    return r.json().get("local_results", [])

all_dentists = []
for zipcode in load_state_zipcodes("CA"):
    all_dentists.extend(fetch_dentists_in_zip(zipcode, api_key))

# Dedupe by place_id (the same dentist surfaces near multiple ZIPs)
seen = set()
deduped = []
for d in all_dentists:
    pid = d.get("place_id")
    if pid and pid not in seen:
        seen.add(pid)
        deduped.append(d)

print(f"Found {len(deduped)} unique dentists in CA.")

PHP equivalent: same shape with Guzzle + array_unique on place_id.

place_id is critical for de-duplication; the same business surfaces for multiple nearby ZIPs.

Local SEO context

The local pack ranks by a different signal blend than organic:

  • Proximity: how close the business is to the searcher's location.
  • Relevance: whether the GMB category matches the query.
  • Prominence: review count, review velocity, citations across the web.

If you're tracking local rankings, your scraper must:

  • Specify a precise location (latitude/longitude, not just "Chicago").
  • Track over time (proximity changes with map rendering edge cases).
  • Capture both "rank 1 in local pack" and "rank N in map results"; they're different KPIs.
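The requirements above can be sketched in a few lines. Note the coordinate parameter: some SERP-APIs accept an "ll" value in "@lat,lng,zoom" form, but the parameter name and format here are assumptions to check against your provider's docs:

```python
# Hypothetical request params pinning the search to an exact point,
# rather than a city name.
params = {
    "q": "dentist",
    "engine": "google_maps",
    "ll": "@41.881832,-87.623177,14z",  # precise lat/lng, not just "Chicago"
}

def local_rank(places: list[dict], my_place_id: str):
    """Return this business's rank within a result list, or None if absent.

    Works on either a 3-item local pack or a full map-results list, so the
    two KPIs can be tracked with the same function.
    """
    for p in places:
        if p.get("place_id") == my_place_id:
            return p.get("position")
    return None

# Invented sample data in the SERP-API shape:
places = [
    {"place_id": "ChIJabc", "position": 1},
    {"place_id": "ChIJxyz", "position": 2},
]
print(local_rank(places, "ChIJxyz"))
```

Run daily from the same pinned coordinate, the output becomes a time series you can chart per keyword per location.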

Parsing hours

The hours field from a single SERP call is just "Closes 11 PM today." For the full weekly schedule, you need one of:

  • Google Place Details API (paid, requires Google Cloud).
  • A SERP-API's place details endpoint (most SERP-APIs offer one).
  • Scraping the place's Google Maps page (hard, see above).

For most use cases, the SERP-API's place details call is the right balance.
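A hedged sketch of that place-details call. The URL, the "google_maps_place" engine name, and the response shape are all illustrative; real providers differ, though many do return hours as a list of per-day entries:

```python
import requests

def parse_weekly_hours(details: dict) -> dict:
    """Flatten an assumed [{"day": ..., "hours": ...}] list into {day: hours}."""
    return {h["day"]: h["hours"] for h in details.get("hours", [])}

def fetch_weekly_hours(place_id: str, api_key: str) -> dict:
    # Hypothetical place-details endpoint keyed by place_id.
    r = requests.get("https://serp-api.example.com/search", params={
        "engine": "google_maps_place",
        "place_id": place_id,
        "api_key": api_key,
    })
    r.raise_for_status()
    return parse_weekly_hours(r.json().get("place_results", {}))
```

Keeping the parsing in its own function means you can unit-test it against saved responses without hitting the API.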

A local-pack parser

def parse_local_pack(data: dict) -> list[dict]:
    return [
        {
            "place_id": p.get("place_id"),
            "name": p.get("title"),
            "rating": p.get("rating"),
            "review_count": p.get("reviews"),
            "address": p.get("address"),
            "phone": p.get("phone"),
            "website": p.get("website"),
            "lat": p.get("gps_coordinates", {}).get("latitude"),
            "lng": p.get("gps_coordinates", {}).get("longitude"),
            "rank": p.get("position"),
            "category": p.get("type"),
        }
        for p in (data.get("local_results", {}) or {}).get("places", [])
    ]

Hands-on lab

Open practice.scrapingcentral.com/search?q=stores+near+me. Note the local-pack-style block. Fetch /api/locations?q=stores+near+me and parse each location into a dict with name, address, phone, rating, lat/lng. Then think about: if you wanted dentists across an entire state, how would you de-dup across overlapping searches? (Answer: by place_id.)

