
build docker image #21

Closed · wants to merge 14 commits
28 changes: 28 additions & 0 deletions examples/baidu.json
@@ -0,0 +1,28 @@
{
"user_agent":"Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/70.0.3538.110 Safari/537.36",
"random_user_agent":true,
"sleep_range":"",
"search_engine":"baidu",
"debug":true,
"verbose":true,
"keywords":[ "cat", "mouse" ],
"keyword_file":"",
"num_pages":1,
"headless":true,
"chrome_flags":[ ],
"output_file":"examples/results/baidu.json",
"block_assets":false,
"custom_func":"",
"proxy":"",
"proxy_file":"",
"test_evasion":false,
"apply_evasion_techniques":true,
"log_ip_address":false,
"log_http_headers":false,
"puppeteer_cluster_config":{
"timeout":600000,
"monitor":false,
"concurrency":1,
"maxConcurrency":1
}
}
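The baidu.json config above drives a single scrape job: `keywords` (or a `keyword_file`) supplies the search terms, `num_pages` limits pagination, `sleep_range` is an optional string holding a JSON `[min,max]` array (left empty here, but set to `"[1,2]"` in google.json below), and `puppeteer_cluster_config` caps cluster concurrency. As a minimal sketch of how such a config could be sanity-checked before use (this validator and its function name are illustrative, not se-scraper's actual loader):

```javascript
// Sketch: validate a config of the shape shown above before handing it
// to the scraper. Field names match the JSON files in this PR; the
// validateConfig helper itself is hypothetical.
const config = {
  search_engine: "baidu",
  keywords: ["cat", "mouse"],
  keyword_file: "",
  num_pages: 1,
  sleep_range: "", // optional; when set, a JSON array string like "[1,2]"
  puppeteer_cluster_config: { timeout: 600000, concurrency: 1, maxConcurrency: 1 },
};

function validateConfig(cfg) {
  const errors = [];
  // Either inline keywords or a keyword file must be supplied.
  if ((!Array.isArray(cfg.keywords) || cfg.keywords.length === 0) && !cfg.keyword_file) {
    errors.push("no keywords and no keyword_file");
  }
  if (!Number.isInteger(cfg.num_pages) || cfg.num_pages < 1) {
    errors.push("num_pages must be a positive integer");
  }
  // sleep_range is stored as a string; parse it only when non-empty.
  if (cfg.sleep_range) {
    const range = JSON.parse(cfg.sleep_range);
    if (!Array.isArray(range) || range.length !== 2 || range[0] > range[1]) {
      errors.push("sleep_range must be a two-element [min,max] array");
    }
  }
  return errors;
}

console.log(validateConfig(config)); // []
```

Note that `sleep_range` is a string containing JSON rather than a plain array, so it needs its own parse step, as sketched above.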
28 changes: 28 additions & 0 deletions examples/google.json
@@ -0,0 +1,28 @@
{
"user_agent":"Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/70.0.3538.110 Safari/537.36",
"random_user_agent":true,
"sleep_range":"[1,2]",
"search_engine":"google",
"debug":true,
"verbose":true,
"keywords":["scrapeulous.com", "scraping search engines", "scraping service scrapeulous", "learn js"],
"keyword_file":"",
"num_pages":2,
"headless":true,
"chrome_flags":[ ],
"output_file":"examples/results/google.json",
"block_assets":false,
"custom_func":"",
"proxy":"http://proxy:24000",

> Review comment on this line: if the proxy requires a username and password, the request failed in my test.

"proxy_file":"",
"test_evasion":false,
"apply_evasion_techniques":true,
"log_ip_address":true,
"log_http_headers":true,
"puppeteer_cluster_config":{
"timeout":600000,
"monitor":false,
"concurrency":1,
"maxConcurrency":2
}
}
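The review comment on the `"proxy"` line reports that a proxy requiring a username and password fails. One likely cause: Chrome's `--proxy-server` flag ignores credentials embedded in the proxy URL, so they have to be stripped out and supplied separately (for example via puppeteer's `page.authenticate()`). A minimal sketch of that split, assuming credentials are carried inline in the proxy URL (the URL below is illustrative, not a real endpoint):

```javascript
// Sketch: separate credentials from a proxy URL so the bare host:port can
// go to Chrome's --proxy-server flag (which ignores inline credentials)
// while the username/password go to page.authenticate() in puppeteer.
function splitProxy(proxyUrl) {
  const u = new URL(proxyUrl); // WHATWG URL, global in Node.js
  return {
    server: `${u.protocol}//${u.host}`, // e.g. "http://proxy:24000"
    credentials: u.username
      ? {
          username: decodeURIComponent(u.username),
          password: decodeURIComponent(u.password),
        }
      : null,
  };
}

console.log(splitProxy("http://user:secret@proxy:24000"));
// { server: 'http://proxy:24000',
//   credentials: { username: 'user', password: 'secret' } }
```

With a split like this, `credentials` would be passed to `page.authenticate({ username, password })` before navigation; whether se-scraper performs this step for the `"proxy"` config value is exactly what the review comment calls into question.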