My theory is they’ve determined you to be a good human, so they want you to do more work for them. The more you “fail,” the more work they get out of you.
I’ll do the first one and sometimes fail. I’ll do the 2nd (because I can’t rule out a mistake; sometimes my attention wanders). If I get a 3rd, I grab the URL of the site I’m trying to visit and go to the Wayback Machine (or just say fuckit and close the tab).
I get them a lot because I stay on a VPN, and I know bots and script kiddies use VPNs and trigger server defense systems, so I don’t mind doing it every now and then. Lately I’ve been noticing it’s just a checkbox most of the time. Check it, it spins for about 2 seconds, says ‘congrats on being a meatbag,’ and loads the page. I may be paraphrasing.
Gaaah! Wait! What’s the answer?? Sometimes I click that square, but other random times I don’t.
The answer is the timing and decision-making, not the correct choices.
There are 5 lights.
You’re determining the answer by training Google’s AI.
If we’re supposed to be telling the AI what’s right, why do we so often get it “wrong”?