Scrapers VALID but...not working


I use a so-called "strategy" involving some automation (JV adds to my main account's saved posts the posts from people who liked targeted posts by competitors) and some manual work (I then do F/UF on my main account manually every day, +100/-100). The results vary from 20 to 35% of people following back, which is a pretty OK increase for my needs.

Recently my main account encounters a lot of errors when trying to use the like tool to extract posts from a list of usernames: "The search took more than 20 minutes and was aborted." I don't understand, because my scrapers are all VALID and the proxy is active too. A JV support guy told me that if I can't edit the profile info from JV, that means my scraper isn't working properly. I do manage to retrieve the profile info with JV, but the software shows a warning pop-up telling me there was a problem accessing the information. I have no clue what the problem could be or how to fix it. Did something similar happen to any of you? How did you manage to get past this problem?
Thanks a lot !


What's your ratio of scrapers to main accounts?
Does anything happen inside the main account's browser when you try the action?
What error code does it show?
Have you logged into each scraper to check they are still alive?
Have you set the scrapers as 'scraper tags' and enabled the option to only use those accounts?

Are you using strict filters? I would try unchecking all filters to see whether they are the reason that error message appears.

Can you view the account's profile picture, bio, name, etc. in the Edit Profile tab? I don't get how you see that warning pop-up yet still manage to retrieve the profile info.

That warning means your scraper is not able to access the API. The profile info should load without any issue, and without that warning.

Try adding a new scraper account on a different proxy and see if you get the same issue.

The issue is related either to strict filters or to scraping API blocks. One of the solutions suggested above should resolve it.