Scraper/Slave Tips?

Do you have any tips for the scraper settings? I was thinking about enabling the option to perform some random actions every now and then, since I currently only use the scrapers to scrape. I know the key is to act as human as possible.

I would say resting them every other day could help a lot. Use one group of scrapers one day, go with another batch the next day, and keep rotating them that way.
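The day-on/day-off rotation described above can be sketched with a simple day-parity check. This is just an illustration: the group names and the parity rule are hypothetical, not anything the tool itself does.

```python
from datetime import date

# Hypothetical illustration: split the scrapers into two groups and
# let only one group work per day, alternating on odd/even days,
# so each group rests every other day.
GROUP_A = ["scraper_01", "scraper_02", "scraper_03"]
GROUP_B = ["scraper_04", "scraper_05", "scraper_06"]

def active_scrapers(today: date) -> list[str]:
    """Return the group that should work today; the other group rests."""
    return GROUP_A if today.toordinal() % 2 == 0 else GROUP_B

# Consecutive days alternate between the two groups.
print(active_scrapers(date(2024, 1, 1)))
print(active_scrapers(date(2024, 1, 2)))
```

With more than two groups you could generalize the same idea with `toordinal() % number_of_groups`.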

You can start by making the account look as natural as possible (a profile photo, some posts, a detailed bio, and some activity). After that, make sure you have set the API Limits and Delays, as this has helped a lot of users, and enable the option 'scrape with the EB where possible' in the scraper's Advanced Profile Settings.
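To show why the Delays setting matters: randomized waits between actions look less machine-like than a fixed interval. A minimal sketch, where the range in seconds is made up for illustration and is not the tool's default:

```python
import random

# Hypothetical delay range (seconds) between scraper actions; the jitter
# avoids the perfectly regular timing that makes automation easy to spot.
MIN_DELAY, MAX_DELAY = 30.0, 120.0

def human_like_delay() -> float:
    """Pick a random wait time before the next action."""
    return random.uniform(MIN_DELAY, MAX_DELAY)

# Each call yields a different wait inside the configured range.
print(human_like_delay())
print(human_like_delay())
```

In practice you would pass the returned value to something like `time.sleep()` between actions.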


What API limits? I currently have it set to pause after 1 error per hour, and pause after 5 errors per day.
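A rough sketch of how those two thresholds interact, assuming pause-on-threshold semantics. Only the limit values come from the settings above; the counter and reset logic here are illustrative, not the tool's actual implementation:

```python
# Illustrative: "1 error/hour -> pause" and "5 errors/day -> pause".
HOURLY_LIMIT = 1
DAILY_LIMIT = 5

class ErrorTracker:
    """Hypothetical per-account error counter with hourly/daily windows."""

    def __init__(self):
        self.hourly_errors = 0
        self.daily_errors = 0

    def record_error(self) -> str:
        """Count an error and decide whether the account should pause."""
        self.hourly_errors += 1
        self.daily_errors += 1
        if self.daily_errors >= DAILY_LIMIT:
            return "pause for the rest of the day"
        if self.hourly_errors >= HOURLY_LIMIT:
            return "pause for the rest of the hour"
        return "keep going"

    def new_hour(self):
        self.hourly_errors = 0

    def new_day(self):
        self.hourly_errors = 0
        self.daily_errors = 0

tracker = ErrorTracker()
print(tracker.record_error())  # the very first error already trips the 1/hour limit
```

With a 1 error/hour limit, any single error pauses the account for that hour, while the daily limit only kicks in once errors accumulate across hours.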

Yes, you can try that and see. It might work fine or it might not, but the only one who can answer that is you, by testing on 3-4 accounts and seeing how it goes.

So do you suggest starting with only 3-4? Is that really enough to give credible data? I usually go with at least 50 accounts. Best

Well, I might say 10-15, but I really don't know your testing budget. Using 10-15 accounts for testing should be good as a starter.