20 accounts per proxy is way too many these days, and you don't need 4G for scrapers. Try regular proxies; they are much cheaper, you can run 2-3 scrapers per proxy, and when they die you can easily replace them. The whole operation will be much cheaper.
You mean cheap proxies, like $1 per proxy? And it's possible to use one for 2-3 scrapers? What about when they die, do I need to replace the proxy too, or just replace the scraper account on the same proxy?
Say I use a cheap proxy. How long can a scraper survive?
Yes, cheap proxies and cheap scrapers. If you can get the scrapers to do the job for 48-72 hours, that's good. When they die, just replace them. You can do that 3-4 times on the same proxy, then change the proxy as well.
You mean if a scraper does fine for 48-72 hours, that's good for the proxy?
But if they die, I just replace the scraper account on the same proxy, up to 3 times? And after 3 times, switch to a new proxy?
Also, how long can a scraper account survive on a cheap proxy? A month?
I really don't know why our scrapers are getting so many hits. Running from PM to AM, almost 95% get DISABLED.
The accounts will not execute any actions between
10:30 and 22:00.
Is anything wrong with my settings?
If you are running 20 scrapers per main account, you should set higher delays and lower API call limits. Start very low and increase until the account can complete its full actions each day without scrapers dying. Either that, or the scrapers are just bad.
Tip: leave a scraper on a proxy doing nothing for a few days and see if it's still alive.
You mean after the account validates, I should stop it for 2 days and check whether it's still valid? If it is, it's a good account, and if not, a bad one?
Higher delays and lower API calls. Here are my current settings. Enable API limits:
Wait between 2500 and 12000 milliseconds when API calls exceed 2 in the last 5 seconds, OR
wait between 2500 and 7500 milliseconds when API calls exceed 1 in the last 2 seconds.
Delay all tools when API errors exceed a given value:
Wait between 60 and 120 minutes when API errors exceed 1 per hour, and
wait between 350 and 550 minutes when API errors exceed 2 per day.
Delay all tools when API calls exceed a given value:
Wait between 10 and 30 minutes when API calls exceed 50 per hour, and
wait between 350 and 550 minutes when API calls exceed 500 per day.
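The sliding-window part of these limits can be sketched in Python to make the behavior concrete. This is only an illustration of the logic described in the settings, not the tool's actual implementation; the `ApiThrottle` class and `throttle()` method are made-up names for this sketch.

```python
import random
import time
from collections import deque

class ApiThrottle:
    """Illustrative sketch of the per-second API limit rules above."""

    def __init__(self):
        self.call_times = deque()  # timestamps of recent API calls

    def _calls_in_last(self, window, now):
        return sum(1 for t in self.call_times if now - t <= window)

    def throttle(self, now=None, sleep=time.sleep):
        now = time.monotonic() if now is None else now
        # Drop timestamps older than the widest window (5 seconds).
        while self.call_times and now - self.call_times[0] > 5:
            self.call_times.popleft()
        # Wait 2500-12000 ms when calls exceed 2 in the last 5 seconds,
        # or 2500-7500 ms when calls exceed 1 in the last 2 seconds.
        if self._calls_in_last(5, now) > 2:
            sleep(random.uniform(2.5, 12.0))
        elif self._calls_in_last(2, now) > 1:
            sleep(random.uniform(2.5, 7.5))
        self.call_times.append(now)
```

The hourly and daily caps (50/hour, 500/day) would work the same way, just with wider windows and longer delays.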
Is this not enough?
And what about our nightmode settings, are they correct?
If a scraper can do a good job for 3 or even 4 days, that's good. After those 3-4 days, if the scraper dies, just replace it, buy another scraper. You can repeat that cycle (scraper -- 3 days -- replace) 3-4 times, then change everything, proxies and scrapers. It will cost you less.
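To see why this cycle is cheap, here is a back-of-the-envelope sketch. The $1 proxy price comes from earlier in the thread; the scraper price is a made-up assumption for illustration only.

```python
# Rough cost sketch of the "scraper -> 3 days -> replace" cycle.
PROXY_PRICE = 1.00           # cheap proxy, $/month (mentioned earlier in thread)
SCRAPER_PRICE = 0.30         # ASSUMED scraper account price, purely illustrative
SCRAPER_LIFETIME_DAYS = 3    # a scraper survives roughly 3 days
REPLACEMENTS_PER_PROXY = 4   # replace scrapers up to 4 times, then rotate proxy

cycle_days = SCRAPER_LIFETIME_DAYS * REPLACEMENTS_PER_PROXY
cycle_cost = PROXY_PRICE + SCRAPER_PRICE * REPLACEMENTS_PER_PROXY
cost_per_day = cycle_cost / cycle_days
print(f"{cycle_days} days per proxy cycle, "
      f"${cycle_cost:.2f} per cycle, ${cost_per_day:.3f}/day")
```

With those assumed prices, one proxy cycle runs about 12 days for around $2.20, i.e. well under $0.20/day; plug in your real prices to compare against a 4G setup.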
I believe what he means is to reduce those API call limits (50 per hour and 500 per day). You have many scraper accounts per main account, so even though one scraper's status switches to delay quickly, you still have other valid scrapers to make API calls. It should be safer this way. For example:
Wait between 10 and 30 minutes when API calls exceed 40 per hour, and
wait between 350 and 550 minutes when API calls exceed 350 per day.
Thanks. How about the nightmode settings for the main and scraper accounts? Enable Nightmode:
The account will not execute any actions between:
acc 1-10: 10:30 and 22:00
acc 11-20: 23:00 and 9:30
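The second window crosses midnight, which is easy to get wrong when checking it by hand. Here is a small sketch of the two pause windows above; `in_nightmode` is an illustrative helper, not a setting in the tool.

```python
from datetime import time

def in_nightmode(acc_index, now):
    """True if the account should be paused at time `now`.

    Accounts 1-10 pause 10:30-22:00; accounts 11-20 pause 23:00-09:30
    (that window wraps past midnight).
    """
    if acc_index <= 10:
        start, end = time(10, 30), time(22, 0)
    else:
        start, end = time(23, 0), time(9, 30)
    if start <= end:
        return start <= now <= end
    # Window crosses midnight: inside if after start OR before end.
    return now >= start or now <= end
```

So account 5 is paused at noon but active at midnight, while account 15 is the reverse.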