[SOLUTION] API Calls for Scrapers

Your scraper-to-main ratio is way too low. That’s why you burn so many of them.

1 main acc = 5 scrapers minimum in my experience. And you need a good proxy for the scrapers if you keep it at that; I’d recommend a mobile proxy rotating every 5 min, so you can put many scrapers on 1 proxy and almost forget about it.

I use 10+ scrapers per main account and mine last 1-7 days. As long as I keep that ratio at 1:5+, I’ve noticed the accounts’ quality doesn’t matter much, so buying the decent cheapest option works fine. I bought expensive accounts in the past and even created my own with a mobile proxy and a mobile phone, which didn’t improve the accounts’ lifetime at all.

But yea, ramp up those numbers and you’ll only need to top up every other day.


Most users who play around with scrapers don’t analyze the API calls required to extract the desired information from the profiles.

By analyzing and understanding them, you can drastically reduce the API calls… Behind the buttons, guys, there are requests. Don’t look at the buttons :eyes:


I’m making 1 request per second; after 30 requests I delay 25 sec, and repeating that cycle, after 480 requests I delay 1 h. But all my accounts get locked. I use scrapers that I buy here on the forum.
Anyone please help? :((((

I use API calls for user posts and tag posts, and I have delays between requests, but I still get locked. Can you please help?
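For reference, the fixed pacing described above (1 request per second, a 25 s pause after every 30 requests, a 1 h pause after every 480) can be sketched as a simple schedule function. The function name and shape here are illustrative, not taken from any specific tool:

```python
def delay_after(n: int) -> float:
    """Seconds to sleep after the n-th request (1-indexed), per the
    schedule described above: 1 s normally, 25 s after every 30th
    request, and a full hour after every 480th."""
    if n % 480 == 0:
        return 3600.0
    if n % 30 == 0:
        return 25.0
    return 1.0
```

Note that this schedule is fully deterministic, which is exactly the kind of repeating pattern the replies in this thread warn against.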

We don’t have the same proxies / accounts / software / settings, so I really can’t help; whatever I say will be very inaccurate/false in your case.

The only way out is to test many different A/B/C/D templates and see which is best. If D wins, then run a D/E/F/G test, until you find the “costless/resultful” strategy.

Can you suggest some good templates?

I don’t use the same tools; it’s all home-made here, so once more, I can’t say. But when I scraped on Jarvee, I used to pay attention to what making an API call means, and what the results would be.

Just by reading this, I can tell that your accounts are doing the same thing over and over, without randomization. And it’s not only about the settings… there’s also the rest!

I can make the delay mixed, e.g. between 1-4 min; can that type of mixing help?
And what do you mean by “there’s also the rest”?

Definitely. And whatever you can randomize, do so.

As i said, you can tweak “proxies / accounts / softwares / settings” :slight_smile:
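A minimal sketch of the kind of randomization being suggested, assuming a 1-4 min gap between calls (the range the question mentions) plus an occasional longer rest. Every number here is an illustrative guess, not a known-good setting:

```python
import random


def jittered_delays(rng: random.Random,
                    short=(60.0, 240.0),    # 1-4 min between calls
                    rest=(1800.0, 5400.0),  # occasional 30-90 min rest
                    batch=(25, 40)):        # calls before each rest
    """Yield delays with no fixed pattern: uniform short gaps, and a
    long rest after a randomly sized batch of calls."""
    while True:
        for _ in range(rng.randint(*batch)):
            yield rng.uniform(*short)
        yield rng.uniform(*rest)
```

Unlike a fixed 30/480 schedule, neither the gap lengths nor the batch boundaries repeat, so the traffic shape varies from run to run and account to account.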

Hey Bruce

Correct me if I am wrong, but the scrapers aren’t actually lasting any longer: you just have more of them, so they take longer to hit the 500-call ceiling (the calls are distributed among more scrapers). Having 5:1 is great as it saves time, but from our testing it doesn’t make the scrapers live longer; it just delays their death because they take longer to hit the 500 cap.
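The “delayed death” point above is just division. A tiny sketch, assuming calls are spread evenly across the fleet and the ~500-call ceiling mentioned in the post:

```python
def days_to_cap(daily_calls_per_main: float, scrapers: int,
                cap: float = 500.0) -> float:
    """Days until each scraper hits the call ceiling, assuming the
    main account's daily calls are split evenly across the fleet."""
    return cap / (daily_calls_per_main / scrapers)


# With 500 calls/day per main: 1 scraper lasts ~1 day, 5 scrapers ~5 days.
```

So a bigger fleet stretches the calendar time per scraper without changing the total number of calls each one can absorb.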


Missed ya brotha! Yeah we are rebuilding our scraper right now, will keep you posted and get feedback from you


I use specific users instead of sources, and I scrape with an external service that also connects to JV and analyzes the results.

So I need roughly 20 times less scraping power.

What is the external service you use that connects to Jarvee?

Forgive my ignorance because I am still learning, but I can give some suggestions here. I have never had any problems with scrapers from the start of my IG journey (1-2 years). Scrapers might occasionally get disabled, but this rarely happens, because I use the traditional method, where the scrapers scrape and send to the main account to do actions such as follow, like, etc.

I reckon you want the main to have all the filtering and such so you can get the follow-back ratio and other data, but in my humble opinion, even though that’s important, you’re going to have headaches in the long run with dying scrapers (which, to this day, I’m not sure what you mean by; I guess the accounts get disabled). I can have 1 scraper supporting 10 main accounts with no problem at all.

So I am really scratching my head as to why people are so persistent in following the new method of scraping, where you need to spend so much time, money, and energy on scraper accounts. It’s entirely up to you to decide which is worth your effort: the old method of scraping or the new method. If I missed something, please do let me know :slight_smile:


You mean you are using the “Specific Users” for sources?

@cyberminator

By the “traditional method” you are describing, do you mean using “Specific Sources” and have the scrapers send to main accounts?

Yes guys… you are right… sending it to specific users for action… LoL. I kept seeing posts like “scrapers dying”, “need to buy scrapers”, “which providers to choose”, and I was like, am I doing anything wrong? Why have I never needed to search for scraper providers?

That’s nice for you. How many scrapers do you currently have? And do you use scraping delays?

I never use any, to be honest; hence sometimes in a week I get a PV and captcha. But if you want to add a delay, you can. I’m just telling you what works for me… :slight_smile:

Haha nice to see ya too! :heart: yea, different times now lol

Denis1 already explained what you need to do, and you can find a variety of settings/scraper setup guides here. The key takeaway is: no matter how hard you try, you won’t make them stretch further than 1 week, at least in my experience.

What I explained earlier is very simple: you can keep twisting your head trying to find the best settings and run segmented tests to figure out the best option. Maybe you’ll gain an extra day or two on your scrapers. BUT do not expect them to last more than 7 days, period.

Or do what I did: find the cheapest seller with the most decent cheap throwaway accounts and just replace them at the first checkpoint. If you pay below $0.30 per account and it lasts 1-7 days, replacing it at the first checkpoint isn’t that expensive, especially if you keep that 1:10+ ratio, since then your ban rate is reduced quite a bit. IG has always been a numbers game. You can’t outsmart them; just manage your cost and price accordingly.

Well, purely from the technical/analytical point of view you’re right. However, the way I deal with it is: I want to hit my 200 actions per day with each main acc, no matter what. So if I have 5 or fewer scrapers per main, I’ll burn them in 1 day most of the time; very rarely do they last 2 days. If I have 10+ (usually I do 1:20 at the moment), that often lasts me up to 7 days before I burn them all out and need to top up.

So, whilst I haven’t tracked the exact API calls (I am lazy :smiley:), if I run below 10 scrapers I burn 5 a day on average; that’s 30-35 per week. If I put in 20 from the get-go, they tend to last 7 days-ish; that’s 20 per week. So the more I have on rotation randomly doing stuff, the less I get them banned.

Being the lazy guy that I am, I plan to have 30-40 scrapers per main soon (to reach the maximum one mobile proxy can handle). If I can run 30-40 accs, losing 20 per week on average and topping up every 7 days or so, it’s so much easier than doing it daily with 5 accs, never reaching limits, etc.
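The burn-rate arithmetic in the last few posts can be sanity-checked with a quick sketch. The loss rates and the ~$0.30 account price are the rough figures quoted above, not measured constants:

```python
def weekly_replacement_cost(lost_per_week: int,
                            price_per_account: float = 0.30) -> float:
    """Weekly cost of topping the scraper fleet back up."""
    return round(lost_per_week * price_per_account, 2)


# Rough comparison from the figures quoted above:
small_fleet = weekly_replacement_cost(35)  # <10 scrapers, ~5 burned/day
large_fleet = weekly_replacement_cost(20)  # 20+ scrapers from the get-go
```

At throwaway prices the larger fleet is both cheaper per week and far less work, which is the poster’s point about topping up weekly instead of daily.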

So what’s your secret then? :slight_smile: I use the same method, sending to the main acc using specific users for actions, as does, I assume, most of the forum. Saying you never have scrapers disabled sounds, in my humble opinion, like a ‘look at my big pp that nobody has’ moment. I hope you can back it up with proof and examples though; reducing the ban rate by even 30-50% would go a long way here.