It’s your choice. Some do, some don’t.
If you want bot activities to appear more human-like, some form of rest
would make sense. If it’s a churn-and-burn scraper, it probably doesn’t matter.
The delay is way too long. I would recommend Night mode on both main and scraper accounts; the scraper will live longer and the main will look human and avoid blocks and verifications.
I’m also using a mobile proxy, pretty stoked to give this a go. Will report back on whether it improves the results. The tagname method has been easy to set up, but last week I lost like 10 scrapers per day per main on average, which has been too painful xD And I don’t exactly want to loosen up the filters either, so it’s not really sustainable.
When you say you lose accounts … does that mean disabled? Or something else?
Well, give this classic method a try then … no tagname etc. … just scrape with filters and send the results to the accounts that need to do the actions …
We’re looking forward to your report; keep us posted.
Looking forward to your results!
I added 25 accs for testing on 2 mains and lost 2 scrapers already xD But that was my fault, as I turned off all the limits and did 500 API calls in 1h. So no API limits is a no-go for sure.
However, now that I’ve set it up with decent API limits per hour/day and also spread out the follow actions on each scraper account throughout the day, I’m starting to see how this could be safer.
Before, I would throw in 10 accs and they would be randomly selected for use, sometimes getting hammered. Whereas with this method you can choose, say, 10 posts to be scraped every 1-2h by each acc, using follow filters on each account, which ultimately gives us more control and should mean it’s safer.
Time will tell, but it does make sense; often more control = better results.
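To make the idea concrete, here’s a minimal Python sketch of what I mean by spreading the load (the account names, the 40/hour cap and the batch size are made-up numbers for illustration, not settings from the tool): deal posts out evenly across the scrapers, each with its own hourly cap, instead of letting a few random accounts get hammered.

```python
import time
from dataclasses import dataclass, field

@dataclass
class Scraper:
    """Hypothetical scraper account with its own hourly API budget."""
    name: str
    max_calls_per_hour: int = 40          # illustrative cap, not a tool setting
    calls_this_hour: int = 0
    hour_started: float = field(default_factory=time.time)

    def can_call(self) -> bool:
        # reset the counter once a full hour has passed
        if time.time() - self.hour_started >= 3600:
            self.hour_started = time.time()
            self.calls_this_hour = 0
        return self.calls_this_hour < self.max_calls_per_hour

    def record_call(self) -> None:
        self.calls_this_hour += 1

def assign_posts(scrapers, posts, batch_size=10):
    """Deal posts out round-robin so every scraper gets a small, even batch.
    Capped scrapers simply skip their turn in this sketch."""
    plan = {s.name: [] for s in scrapers}
    for i, post in enumerate(posts):
        s = scrapers[i % len(scrapers)]
        if s.can_call() and len(plan[s.name]) < batch_size:
            plan[s.name].append(post)
            s.record_call()
    return plan

if __name__ == "__main__":
    scrapers = [Scraper(f"scraper_{n:02d}") for n in range(1, 11)]
    posts = [f"post_{n}" for n in range(1, 101)]
    for name, batch in assign_posts(scrapers, posts).items():
        print(name, "->", len(batch), "posts this cycle")
```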
So you’re using the “classic method”, not the “tagname method”, now? I’m thinking of switching to “classic” as well; however, I currently use DC proxies, while you’re using 4G proxies, I think.
Sadly, it seems it’s the same. More effort/work to set it up and the results don’t seem to be any better, even with spread-out actions. I’ll continue to mess around with it since I’ve already set up the settings, but out of 25 scrapers for 2 main accounts, 3 were already removed from MP in this list and 6 more got into captcha/EV/PV the first day, so 9 accounts lost out of 25 in total. Each main did ~200 actions.
This is the “Classic method” with 4G mobile proxies, right?
Is this your result from the 1st day of testing?
Did you warm up the 25 scrapers or hit them hard from a cold start?
What are the API call limits per hour/day?
How many scrapers per IP?
Were these scrapers built on 4G or DC?
Were they built with phone numbers or just emails?
Do the scrapers have profiles and images? Or blank?
Are the scrapers “API Only” or “Scrape with EB when possible”?
Are the scrapers running 24 hours, or do they have a cool-down overnight?
I know it’s a lot of questions, but many things can influence it. I’m very interested, if you’d like to reply. Thanks!
You’re overthinking it, mate.
Which delay do you suggest?
Thanks
Test to find the delay that works for you. It depends on your goals and your configuration.
If you’re looking for a maximum of 200 API calls per day, you can adjust the per-hour delay (e.g. 30 to 50 API calls) and then set a daily (200) trigger that puts in a long delay so the account rests until the next day. There are also millisecond delays you can insert based on API limits, to slow them down.
Someone else’s settings may not work for you. You have to try your own and adjust. My delays are long, some would say “excessive”, but they suit me.
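If it helps to see the shape of it, here’s a rough Python sketch of that kind of throttling; the 30-50 calls/hour, 200/day cap and the millisecond pauses are the figures from this post, but the code itself is only an illustration, not anything pulled from the tool.

```python
import random
import time

CALLS_PER_HOUR = 40              # aim for roughly 30-50 API calls per hour
CALLS_PER_DAY = 200              # daily trigger: stop and rest after this
PER_CALL_DELAY_MS = (800, 2500)  # small randomized pause between calls

def run_day(do_api_call):
    """Throttle one scraper for a day: hourly batches, a hard daily cap,
    then a long rest until the next day."""
    calls_today = 0
    while calls_today < CALLS_PER_DAY:
        hour_start = time.time()
        calls_this_hour = 0
        while calls_this_hour < CALLS_PER_HOUR and calls_today < CALLS_PER_DAY:
            do_api_call()                            # the actual scrape/API action
            calls_today += 1
            calls_this_hour += 1
            time.sleep(random.uniform(*PER_CALL_DELAY_MS) / 1000.0)
        # wait out the remainder of the hour before the next batch
        leftover = 3600 - (time.time() - hour_start)
        if leftover > 0 and calls_today < CALLS_PER_DAY:
            time.sleep(leftover)
    # daily cap hit: long delay so the account rests until the next day
    time.sleep(12 * 3600)

# usage: run_day(my_scrape_call), where my_scrape_call fires one API request
```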
I applied even stricter API limits since I lost 6 accs the same day, so all in all this method is no better than the tagname method; it just takes a little more effort to set up. I’ll continue to mess around with the settings and see if I can reduce the ban rate ever so slightly.
I appreciate you trying to help the community, but I’m sorry, I find it hard to believe. You say you use 2 scrapers per 10 mains and get EV/PV every week or so. And no API limits? No way I can believe that.
Let’s say I’m completely useless with my settings here and made every mistake I could (which is unlikely); I still lost 25 scraper accs in 2-3 days per 2 main accounts. And I had very strict API limits from the 2nd day onwards.
Edit: In fact, now that I’m looking at the stats and comparing them to previous weeks when I used the tagname method, this method seems to be worse in terms of account life. With the tagname method I could stretch 25 accs to 7 days before needing to replace all of them, so I will circle back to it going forward. It was a fun little debunking experience tho lol.
You do not need to believe it lol, I am telling you what I am experiencing now… 1-2 scrapers supporting 10 main accounts… I am only asking people to try the classic method rather than the tagname… I never use any API limit or delay on scrapers, and I only get EV or PV every week, that’s all… Rarely get disabled, but if it happens, no big deal… Make them again.
How many follows a day is each of your 10 main accounts doing?
It sounds fantastic. We should document your setup in detail. I’d love to know the difference between your setup and others’. You are doing something right; what is it?
Do you use “enable partial api emulation” in the social platforms settings?
Happy for you if it truly works. But I could also say I have 1 scraper for 100 main accounts and get 1 PV per week. Is it true? Is it not? Without anything to back up what we say, it’s just words.
I would love not to need to replace them as often as I do. I also use a rotating 4G mobile proxy with fresh IPs, so it’s not a proxy problem.
If you can share something that you think is making such a big difference, I’m all ears. But for now, switching to the old method didn’t improve my results much; in fact, it ended up being worse.
I also tested the “classic method” of sending to specific users, but it was no different for me from the tagname method. Scrapers still got captcha/banned etc. around the same time as with the tagname method.