OpenSource Name: drobnikj/crawler-google-places
OpenSource Url: https://github.com/drobnikj/crawler-google-places
OpenSource Language: JavaScript 99.1%

Features

This Google Maps crawler will enable you to get more data from Google Places, and get it faster, than the official Google Places API. To understand how to configure the scraper and get ideas on how to use the extracted data, watch the short video tutorial on YouTube or follow our step-by-step guide on how to scrape Google Maps. Our unofficial Google Maps API enables you to extract all of the following data from Google Maps:
The scraper also supports scraping all detailed information about reviews:
Personal data extraction about reviewers has to be explicitly enabled in input (see Personal data section):
The Google Maps Scraper also provides other very useful features:
Advantages over the Google Maps API

The official Google Places API (it is still called by the old name) is an adequate option for many use cases, but this unofficial Google Maps API provides more cost-effective, more comprehensive results, and also scrapes popular-times histograms, which aren't available in the official API. While the Google Places API no longer limits you to a fixed maximum number of requests per day, rate limits and quotas still apply. Our Google Maps API enforces no such rate limits or quotas.

How much will it cost?

As a rule, getting results with Google Maps Scraper will not consume many of your platform credits, but this depends heavily on the complexity of your search. For more details about platform credits and usage, see the Cost of usage tab.

Google Maps scraping tutorial

For a simple explanation of how to scrape Google Maps, follow the step-by-step tutorial on our blog or watch our short YouTube video.

Why scrape Google Maps?
For more ideas on how to use the extracted data, check out our industries pages for concrete ways web scraping results are already being used across projects and businesses of various scales and sectors - in travel and logistics, for instance.

Input

When running the Google Maps Scraper, you need to configure what you want to scrape and how it should be scraped. This input is provided either as a JSON file or in the editor on the Apify platform. Most input fields have reasonable default values.

Input example

{
"searchStringsArray": ["pubs"],
"city": "Prague"
}

For detailed descriptions and examples of all input fields, please visit the dedicated Input page.

Output

The scraped data is stored in the dataset of each run. The data can be viewed or downloaded in many popular formats, such as JSON, CSV, Excel, XML, RSS, and HTML.

Output example

The results look like this (shortened to only the first two pubs for viewing convenience):

{
"title": "Fat Cat Beerhouse & Restaurant",
"subTitle": null,
"price": "$$",
"menu": "fat-cat.cz",
"categoryName": "Restaurant",
"address": "Karlova 44, 110 00 Staré Město, Czechia",
"locatedIn": null,
"neighborhood": "Karlova 44",
"street": "Karlova 44",
"city": "Old Town",
"postalCode": "110 00",
"state": null,
"countryCode": "CZ",
"plusCode": "3CP9+CG Prague, Czechia",
"website": "https://www.fat-cat.cz/",
"phone": "+420 735 751 751",
"temporarilyClosed": false,
"permanentlyClosed": false,
"totalScore": 4.4,
"isAdvertisement": false,
"rank": 47,
"placeId": "ChIJxT1C1u6UC0cRjNBm7b6wDjM",
"categories": [
"Restaurant"
],
"cid": "3679072279681486988",
"url": "https://www.google.com/maps/place/Fat+Cat+Beerhouse+%26+Restaurant/@50.0860664,14.416575,17z/data=!3m1!4b1!4m5!3m4!1s0x470b94eed6423dc5:0x330eb0beed66d08c!8m2!3d50.0860664!4d14.4187637?hl=en",
"searchPageUrl": "https://www.google.com/maps/search/takeout/@50.0852853,14.4123976,1225m/data=!3m1!1e3!4m4!2m3!5m1!15shas_takeout!6e5!5m1!1e4?hl=en",
"searchString": null,
"location": {
"lat": 50.0860664,
"lng": 14.4187637
},
"scrapedAt": "2022-01-20T16:28:59.701Z",
"reviewsCount": 3609,
"reviewsDistribution": {
"oneStar": 94,
"twoStar": 68,
"threeStar": 315,
"fourStar": 829,
"fiveStar": 2303
},
"imageUrls": [
"https://lh5.googleusercontent.com/p/AF1QipPkgSf8TR98sGzI2RnJKvRyzfoyYkTrRUugYWKj=w1920-h1080-k-no"
],
"reviews": [],
"orderBy": [
{
"name": "restu.cz",
"url": "http://restu.cz/fat-cat-praha/"
}
]
},
{
"title": "At The Old Lady",
"subTitle": null,
"price": "$$",
"menu": null,
"categoryName": "Restaurant",
"address": "9, Michalská 441, 110 00 Hlavní město, Czechia",
"locatedIn": null,
"neighborhood": "9, Michalská 441",
"street": "9, Michalská 441",
"city": "Hlavní město",
"postalCode": "110 00",
"state": null,
"countryCode": "CZ",
"plusCode": "3CM9+XX Prague, Czechia",
"website": "http://www.hotelustarepani.cz/",
"phone": "+420 589 127 964",
"temporarilyClosed": false,
"permanentlyClosed": false,
"totalScore": 4.5,
"isAdvertisement": false,
"rank": 64,
"placeId": "ChIJORAjle6UC0cR697RocpZFnU",
"categories": [
"Restaurant"
],
"cid": "8437029678758354667",
"url": "https://www.google.com/maps/place/At+The+Old+Lady/@50.0849313,14.4176865,17z/data=!3m1!4b1!4m5!3m4!1s0x470b94ee95231039:0x751659caa1d1deeb!8m2!3d50.0849313!4d14.4198755?hl=en",
"searchPageUrl": "https://www.google.com/maps/search/takeout/@50.0852853,14.4123976,1225m/data=!3m1!1e3!4m4!2m3!5m1!15shas_takeout!6e5!5m1!1e4?hl=en",
"searchString": null,
"location": {
"lat": 50.0849313,
"lng": 14.4198755
},
"scrapedAt": "2022-01-20T16:29:08.907Z",
"reviewsCount": 39,
"reviewsDistribution": {
"oneStar": 3,
"twoStar": 0,
"threeStar": 1,
"fourStar": 5,
"fiveStar": 30
},
"imageUrls": [
"https://lh5.googleusercontent.com/proxy/DqAYJYgsu9aQCdbKvnrdwONTW60qn_Dyhz2re-pmtj7xf2MN8bfU2qVc70OUyqmaiUZ54aBqXuhNeX8LO4F0RCiagkT2M-kjhCLa-EP07Nb9PR5LtFdO-hJ_JLk8PXsKEMMM9chiedjk2_achCni-V4e8fGj060=w1920-h1080-k-no"
],
"reviews": [],
"orderBy": []
}

Adjusting output format

The Apify platform allows you to choose from many dataset formats and also to restructure the output itself.

One review per row

Normally, each result item contains data about a single place, and each item is displayed as one row in tabulated formats. There is a lot of data about each place, so the tabulated formats get very messy and hard to analyze. Fortunately, there is a solution. For example, if you need to analyze reviews, you can configure the download to contain only the data you need and adjust the row/column format. Here's how to get a list of reviews with a place title, one review per row: copy the download link in the format you need, paste it into a different tab, and add the unwind and fields query parameters. The whole download link for, e.g., CSV would look like this (with your dataset ID):

https://api.apify.com/v2/datasets/DATASET_ID/items?clean=true&format=csv&attachment=true&unwind=reviews&fields=reviews,title

Gas prices

If a place is a gas station, gas prices are parsed when available. Normally, to see gas prices in the browser, the search must include the "Gas" category, for example https://www.google.com/maps/search/Gas/@33.4260879,-87.6703234,8z/data=!3m1!4b1?hl=en. However, the actor will get prices for any gas station found by any other search or specified by a direct URL. Please note that prices are currently expected only for the USA, and not every gas station provides prices. If available, the output looks as follows:

"gasPrices": [
{
"priceTag": "$4.90",
"updatedAt": "2022-05-13T03:47:16.000Z",
"unit": "gallon",
"currency": "USD",
"price": 4.9,
"gasType": "Premium"
}
]

Usage on the Apify platform and locally

If you want to run the actor on the Apify platform, you may need to use some proxy IP addresses. You can use your free Apify Proxy trial or subscribe to one of Apify's subscription plans.

Running locally or on a different platform

You can easily run this scraper locally or on your favorite platform. It can run as a simple Node.js process or inside a Docker container.

How the search works

It works exactly as though you were searching Google Maps on your computer. It opens https://www.google.com/maps/, relocates to the specified location, and types the search term into the search box. Then it presses the next-page button until it reaches the final page or the maximum number of results set in the input.

Using country, state, county, city, and postal code parameters

You can use any combination of the geolocation parameters.

Automatic zooming

The scraper automatically zooms the map to ensure that the maximum number of results is extracted. Higher zoom finds more places but makes the run slower.
If you need even more results or a faster run, you can override the zoom level in the input.

Custom Geolocation

The easiest way to set the scraping area is with the geolocation parameters above, but if you need finer control, you can provide a custom geolocation geometry instead. There are several types of geolocation geometry that you can use; all follow the official GeoJSON RFC.

Polygon

The most common type is a polygon, which is a set of points that defines the location. Note that the first and last coordinates must be equal (to close the polygon).

{
"type": "Polygon",
"coordinates": [
[
[
// Must be the same as last one
0.0686389, // Longitude
52.2161086 // Latitude
],
[
0.1046861,
52.1906436
],
[
0.0981038,
52.1805451
],
[
0.1078243,
52.16831
],
[
// Must be the same as first one
0.0686389,
52.2161086
]
// ...
]
]
}
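The closed-ring requirement can be checked programmatically before a run. Here is a minimal sketch (the helper name is ours, not part of the actor), using the coordinates from the example above:

```javascript
// Sketch (not part of the actor): verify that every ring of a GeoJSON
// Polygon is closed, i.e. the first coordinate equals the last one.
function isPolygonClosed(polygon) {
  if (polygon.type !== 'Polygon') return false;
  return polygon.coordinates.every((ring) => {
    const first = ring[0];
    const last = ring[ring.length - 1];
    return first[0] === last[0] && first[1] === last[1];
  });
}

const area = {
  type: 'Polygon',
  coordinates: [[
    [0.0686389, 52.2161086],
    [0.1046861, 52.1906436],
    [0.0981038, 52.1805451],
    [0.1078243, 52.16831],
    [0.0686389, 52.2161086], // same as the first point, so the ring is closed
  ]],
};

console.log(isPolygonClosed(area)); // true
```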
MultiPolygon

A MultiPolygon can combine multiple polygons that are not contiguous.

{
"type": "MultiPolygon",
"coordinates": [
[ // first polygon
[
[
12.0905752, // Longitude
50.2524063 // Latitude
],
[
12.1269337,
50.2324336
],
// ...
]
],
[
// second polygon
// ...
]
]
}
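One way to build such a geometry is to merge separate Polygon objects. The helper below is a hypothetical sketch (not part of the actor), and the second area's coordinates are made up for illustration; only the first reuses values from the example above:

```javascript
// Hypothetical helper: combine several GeoJSON Polygons into one
// MultiPolygon, so disjoint areas can be covered in a single run.
function toMultiPolygon(...polygons) {
  return {
    type: 'MultiPolygon',
    // each entry of `coordinates` is one polygon's set of rings
    coordinates: polygons.map((p) => p.coordinates),
  };
}

const areaA = {
  type: 'Polygon',
  coordinates: [[[12.0905752, 50.2524063], [12.1269337, 50.2324336], [12.0905752, 50.2524063]]],
};
// illustrative second area, not from the original example
const areaB = {
  type: 'Polygon',
  coordinates: [[[14.4, 50.08], [14.42, 50.09], [14.4, 50.08]]],
};

const multi = toMultiPolygon(areaA, areaB);
console.log(multi.type, multi.coordinates.length); // MultiPolygon 2
```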
Circle

For a circle, we can use the Point type with a radiusKm parameter.

{
"type": "Point",
"coordinates": ["7.5503", "47.5590"],
"radiusKm": 1
}

Personal data

Reviews can contain personal data such as a name, profile image, and even a review ID that could be used to track down the reviewer. Personal data is protected by the GDPR in the European Union and by other regulations around the world. You should not scrape personal data unless you have a legitimate reason to do so. If you're unsure whether your reason is legitimate, consult your lawyers. This scraper allows you to select granularly which personal data fields you want to extract from reviews and which to omit. You can read the basics of ethical web scraping in our blog post on the legality of web scraping.

Changelog

This scraper is under active development. We are always implementing new features and fixing bugs. If you would like to see a new feature, please submit an issue on GitHub. Check CHANGELOG.md for a list of recent updates.

Contributions

We're always pleased to see issues or pull requests created by the community. Special thanks to: mattiashtd, zzbazza
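Returning to the Personal data section above: if you do scrape reviews, one option is to redact personal fields in post-processing before storing results. The sketch below is illustrative; the field names used (name, reviewerId, reviewerUrl, profileImageUrl) are assumptions for the example, not the actor's documented output schema:

```javascript
// Illustrative sketch: drop personal fields from a scraped review.
// The field names below are assumed for the example, not a documented schema.
const PERSONAL_FIELDS = ['name', 'reviewerId', 'reviewerUrl', 'profileImageUrl'];

function redactReview(review) {
  const clean = { ...review }; // shallow copy; the original stays intact
  for (const field of PERSONAL_FIELDS) delete clean[field];
  return clean;
}

const review = {
  name: 'Jane Doe',
  reviewerId: '12345',
  stars: 5,
  text: 'Great beer selection!',
};

console.log(redactReview(review)); // { stars: 5, text: 'Great beer selection!' }
```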