So this has been puzzling me for a long time. I thought I would outline my experience and see what yours has been, because it's just insane lol.
TL;DR: the classic scraper method got me 50 users per scraper daily, while the tagname method gets me fewer than 5. Wtf? lol
Earlier this year, when I had just got back into Jarvee, I was running 40 slaves and 30 scrapers, using the classic method to scrape. In case you're not aware, this means I had normal Instagram accounts as scrapers, not accounts added as scraper accounts in Jarvee. They would use the follow tool, and the results were fed into the main accounts as 'Follow Specific Users'.
Interestingly, I had some scrapers shut down but very few captchas/EV, and ultimately I was able to easily cap the limits on all 40 slaves daily, with 800 specific users fed into the accounts.
At some point I even had fewer than 20 scraper accounts and was still able to keep all 40 slaves full of sources daily. I was just too lazy to top it up haha.
Of course, I wanted to scale, so I decided to hop back onto the tagname method; it took a while to find the right API/EB settings to stop losing accounts. The reasons are obvious: it's free on the 70+ accounts license and is easier to manage than the classic method.
So on this particular test Jarvee license, on another VPS, I now have more than 400 scraper accounts, multiple proxies, and around 100 slave accounts. Those slaves are only at the 20-30 follows warm-up stage at the moment, and already around 30% of them can't reach their follow limits. Of course, this is because I'm hitting API/EB scraping limits on the scrapers and running out of actions. I know that.
But how on earth is it possible that 20, or let's be generous, 30 scrapers using the classic method were able to keep 40 accounts full of sources, adding at least 50 new users to each slave account?
That means at least 1,500 users were being scraped daily by just 30 accounts.
Now, with the tagname method active, I have 400+ scrapers, but let's keep it simple: 400 scrapers, 100 slaves, and 20 follows per slave during the warm-up phase. That's 2,000 users per day, and the scrapers are failing to deliver even that.
I get it: before, the slaves were doing their own EB/API calls, which removed a lot of strain from the scrapers and let them do nothing but scraping.
But how can the difference be so big? It almost doesn't make sense. I updated the sources to make sure they're high quality, the mobile proxies are rotating well, etc.
Like, I'm estimating I would need 2,000+ scrapers to keep 100 slaves actively at 100 daily follows each. How is this possible? lol
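Here's the back-of-envelope math behind that estimate, using the rough numbers from my own setup above (the per-scraper rates are my estimates, nothing official from Jarvee):

```python
# Rough throughput math from my setup. All figures are my own
# estimates of what each method delivered, not Jarvee internals.

# Classic method: ~30 normal-account scrapers fed at least
# 1,500 users per day into the slaves.
classic_users_per_day = 1500
classic_scrapers = 30
classic_rate = classic_users_per_day / classic_scrapers  # users per scraper per day

# Tagname method: 400 scrapers can't even cover 100 slaves
# doing 20 warm-up follows each.
tagname_demand = 100 * 20            # 2,000 users/day needed
tagname_scrapers = 400
tagname_rate = tagname_demand / tagname_scrapers  # at most ~5, and failing

# Extrapolating to the real target: 100 slaves x 100 follows/day.
target_demand = 100 * 100            # 10,000 users/day
scrapers_needed = target_demand / tagname_rate

print(classic_rate, tagname_rate, scrapers_needed)  # 50.0 5.0 2000.0
```

So per scraper it's 50 users/day on classic vs. at best 5 on tagname, a 10x gap, and that's where the 2,000+ scrapers figure comes from.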
Admittedly, I have seen far fewer captchas/EV on my actual slaves since their daily API/EB actions dropped to single digits, but at this point I'm contemplating reversing it all back to the classic method, as the scraper expenses are starting not to make sense.
What has your experience been with this?