Scrapers burning to hell

Same thing here.

We have since changed our packages so that the higher priced plans include better targeting.

The cheapest plan gets virtually no filters, the mid-level plan has some targeting, and the top plan has the targeting we used to use for everyone.

Unfortunately this is the reality now for IG. If the client wants more accuracy, then they have to pay more :frowning:

Yes, Instagram automation isn’t as cheap as it was 2 years ago. It requires more dedication/costs/trials, and most of the clients have understood that, though.

Filters will definitely increase API calls, but I would recommend doing that with cheap slave accounts you don’t care about ($0.10 ones), and then, once you have very good sources, use them on the main account.

Coming back to sadly report that this did not extend the accounts’ life. In fact, I got fewer API calls in total than I would have if I had pushed them as hard as possible in the 2 days before they got banned :frowning:

When I wrote that comment I set 25 scraper accounts to do 100d/30h API calls with a max of 20 errors per day. It has now been 9 days and I have lost 95% of those accounts. The majority didn’t reach 500 API calls in total before they died. See the screenshot below of the last 5 I still have on MP.
[screenshot]

So on the surface, it looks like they lasted 9 days, but in reality they achieved half the results I would have gotten if I had pushed them to die in 2 days.

There’s one more thing to mention, though: I was getting 20 errors per day on each scraper account. But I think it was false reporting, as my scrapers were hitting the ‘Send to Like Sources’ cap of 800
[screenshot]

So I assume the errors were false reports: the accounts tried searching, realised the cap was hit, then stopped, reported an error and stayed delayed until the next day, when they tried again. Resulting in almost no actions? Or those errors were actual actions, in which case we could count an additional 20 API calls a day.

So. The search goes on xD

Did you notice any difference in using 4G with scrapers compared to datacenter?

I think datacenter proxies should not be used for scrapers anymore. I personally only use SR proxies, 4G rotating and 4G static proxies.


What is an SR proxy?

Wouldn’t 4G proxies make the scrapers super expensive to run if you run hundreds of them?

Let’s debunk the myth.
DC proxies work just fine.
4g mobile works too.

Either can work; you just have to pick whichever gives you the best result for the expense.

I use DC proxies for my scrapers, running 800+ of them.


Agreed. Different tools for different situations. There is more than one way to reach an outcome.

If you can run:
- 50-100 per 4G proxy ($20-50), you could do $0.20-$1 per scraper
- 2-5 per DC proxy ($1-2), you are averaging $0.20-$0.60 per scraper
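A quick back-of-the-envelope sketch of that per-scraper cost math, using the price and account ranges quoted above purely as assumptions:

```python
# Hypothetical back-of-the-envelope math using the price/account ranges quoted above.

def cost_per_scraper(proxy_cost_usd: float, scrapers_per_proxy: float) -> float:
    """Monthly proxy cost split across the scrapers sharing it."""
    return proxy_cost_usd / scrapers_per_proxy

# 4G proxy: $20-50/month shared by 50-100 scrapers
print(f"4G: ${cost_per_scraper(20, 100):.2f} - ${cost_per_scraper(50, 50):.2f} per scraper")  # $0.20 - $1.00

# DC proxy: $1-2/month shared by 2-5 scrapers (a typical mid case lands around $0.40-$0.60)
print(f"DC: ${cost_per_scraper(1, 5):.2f} - ${cost_per_scraper(1.5, 2.5):.2f} per scraper")   # $0.20 - $0.60
```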


Oh true. I had heard of people running only 10 scrapers per 4G, hence I thought it was expensive.

I use IPv4 proxies myself, but my scrapers die quite fast. So I’m not sure if it’s the scrapers or the proxies. It seems most people are having issues.

Yes, big difference. I agree with many people above that DC proxies work. But why make your life harder?

You can safely run 50 accounts with the right settings, never needing to worry about maybe getting banned because of the proxy.

I already have enough to worry about. Fk fixing proxies, adding and removing them all the time, buying them separately, testing out suppliers, thinking about cooldown times, losing accounts because a proxy got too many accounts disabled. I mean, if you love spreadsheets and are really good at tracking this stuff, then good for you.

Funnily, most people don’t consider 4G because it’s more expensive. But for me personally, one 4G proxy is at least 3x cheaper in money terms, not to mention the time sunk into managing multiple hundreds of DC proxies lol.


bruce, how much does your 4G cost and how many scrapers do you run on each? Thanks!

You can get them for around $20 in the marketplace :slight_smile:

#public-marketplace


I’m London-based and create my own proxies using Three UK, with an RPi + Huawei dongle. With a bit of effort anyone can do it. But in case you want to buy instead, then this:

Or simply google mobile proxy sellers. I can’t speak for other countries, but with UK Three, when it changes IP it bounces all across the country, meaning it doesn’t matter where you’re from, you’ll ‘appear’ to be anywhere in the country at random.

I suspect there’s some connection to the nearest tower itself, so a bigger city’s density may have a benefit over a smaller city, but I was not able to figure this out, so I can’t confirm or deny it. Maybe someone who has an entire network of proxies can clear it up.
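If you want to see how often your own mobile proxy actually changes its exit IP, a minimal sketch like the one below can log it over time. The proxy address and credentials are placeholders, and it uses a public IP-echo service; it isn’t tied to Three UK or any particular provider:

```python
# Minimal sketch for checking how often a rotating mobile proxy changes its exit IP.
# The proxy endpoint/credentials below are placeholders, not real values.
import time
import requests

PROXY = "http://user:pass@192.0.2.10:8080"   # hypothetical proxy endpoint
proxies = {"http": PROXY, "https": PROXY}

seen = []
for _ in range(12):                           # sample for ~1 hour at 5-minute intervals
    try:
        ip = requests.get("https://api.ipify.org", proxies=proxies, timeout=15).text
    except requests.RequestException as exc:
        ip = f"error: {exc}"
    seen.append(ip)
    print(time.strftime("%H:%M:%S"), ip)
    time.sleep(300)

print("unique exit IPs observed:", len(set(seen)))
```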

Also, a few people asked for my settings. There’s no secret here. 300-700 (500 on average) API calls before they die is not a lot, sadly; I wish I could reach higher numbers. That’s why I’ve been trying the 100d/30h test, the no-tag-name method test and the push-to-the-max-within-a-day test. All of them so far result in roughly the same number of API calls before the account is burned.

Just run your scrapers in 3 shifts, make sure they don’t overlap, and spread out the actions a lot. If you’re doing 100-300 API calls per day, your scraper won’t have more than 2-4 active sessions per day, meaning you could potentially run a lot more than 50 scrapers if your proxy rotates every 5 minutes.

Think about it this way: 1440 min (1 day) / 5 = 288 unique IPs, locations & sessions.

Each IP session should safely handle 5 concurrent connections (actually more, but let’s say 5 to be safe). So, in theory, you could run 288 * 5 = 1440 sessions per day safely.

So finally, 1440 sessions / 4 (average sessions per account before your API ceiling is reached) = 360 potential accounts running safely on 1 mobile proxy.
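For what it’s worth, here is that arithmetic spelled out as a tiny sketch; the 5-minute rotation, 5 connections per IP and ~4 sessions per account per day are the assumptions from the post above, not hard limits:

```python
# Theoretical capacity of one rotating mobile proxy, using the assumptions above.

ROTATION_MINUTES = 5          # proxy gets a fresh IP every 5 minutes
CONNECTIONS_PER_IP = 5        # conservative concurrent connections per IP session
SESSIONS_PER_ACCOUNT = 4      # average daily sessions per scraper at 100-300 API calls/day

ip_sessions_per_day = 24 * 60 // ROTATION_MINUTES                    # 1440 / 5 = 288 unique IPs
total_sessions_per_day = ip_sessions_per_day * CONNECTIONS_PER_IP    # 288 * 5 = 1440 sessions
accounts_per_proxy = total_sessions_per_day // SESSIONS_PER_ACCOUNT  # 1440 / 4 = 360 accounts

print(ip_sessions_per_day, total_sessions_per_day, accounts_per_proxy)  # 288 1440 360
```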

Yes, these are crazy numbers lol. I wouldn’t even go past 100 accounts, but 50 is safe imo as long as you spread out your settings enough.

Ofc don’t be stupid: don’t log them all in at once, and make sure you don’t have all the sessions overlapping at the same time, which will burn the accounts. This bit takes a bit of practice and effort, but it’s 100% achievable with MP settings.

Thanks bruce - yes, I have an RPi too, but I’m trying to set it up on my work office router with a dongle and having trouble… but thanks anyway!

Ideally you would want to get a static IP through your ISP. They usually charge a little extra, but it’s 100% worth it as you won’t need to reroute your local network each time. Each router is unique as well, so local IP routing is the key here.

Once you figure out how to work the router and the manual connection, it’s easy from there on. However, if you’re stuck with a rotating residential IP like the majority of normal networks, it’s best to extend the reset time as long as possible and re-adjust it via the router settings each time. This kinda sucks though, as you might end up doing this every 3-5 days, so a static IP is much better.


What do you mean by making sure the scrapers are logged into the EB? You’re not connecting them with full API emulation and making them connect USING EB ONLY?