I'm a very amateurish Python user who is scraping FB comments for my PhD. I'm very grateful for your code, which I've been using with essentially zero issues for the past year or so. But suddenly I've started getting a KeyError: 'comments_full'. I saw another thread on this from Oct 2022 which said it hadn't been fixed, but I'd used the scraper at that time and it worked fine. I've also followed the other advice I found, such as making sure to use full URLs with pfbid. Is there anything else you'd recommend trying? For reference, this is the code I've been using, pretty much exactly as originally posted:
import facebook_scraper as fs

POST_ID = "[full URL, just ID didn't work for me]"
MAX_COMMENTS = True  # True = retrieve as many comments as the scraper can

gen = fs.get_posts(
    post_urls=[POST_ID],
    options={"comments": MAX_COMMENTS, "progress": True},
)

post = next(gen)
comments = post['comments_full']  # --> triggers KeyError: 'comments_full'
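In case it helps with diagnosis, here is a minimal defensive sketch of the same call that prints which fields the scraper actually returned instead of crashing on the missing key. The URL is just an illustrative placeholder, and the .get() fallback is my own addition, not part of the original snippet:

import facebook_scraper as fs

POST_URL = "https://www.facebook.com/[your full pfbid URL here]"  # hypothetical placeholder

gen = fs.get_posts(
    post_urls=[POST_URL],
    options={"comments": True, "progress": True},
)

post = next(gen)

# Show every field the scraper managed to extract; if 'comments_full' is not
# in this list, the comment extraction failed even though the post was fetched.
print(sorted(post.keys()))

# .get() avoids the KeyError and falls back to an empty list when the field is absent.
comments = post.get("comments_full") or []
print(f"Retrieved {len(comments)} comments")

A run that previously died on the KeyError will at least show whether the post itself came back and which keys it contains, which narrows the problem to the comment extraction rather than the request as a whole.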