cross-posted from: https://lemmy.zip/post/1386796
Archived version: https://archive.ph/F9saW
Archived version: https://web.archive.org/web/20230812233105/https://www.bbc.co.uk/news/technology-66472938
They didn’t figure anything out. There’s no sentience in the algorithm, only in its creators. It only chose content based on input, so it all comes down to the choices of the article’s author.
Same thing with the woman who was pregnant: the algorithm made suggestions based on the user’s purchase history. It made the connection that product A was also bought by pregnant mothers, so the shopper might be interested in product B, something an expecting mother would buy.
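That “people who bought A also bought B” logic is really just co-occurrence counting over purchase histories. A toy sketch of the idea in Python (made-up products and data, nothing like the retailer’s actual model):

```python
from collections import Counter
from itertools import combinations

# Toy purchase baskets; product names are invented for illustration.
baskets = [
    {"unscented lotion", "prenatal vitamins", "diapers"},
    {"unscented lotion", "prenatal vitamins", "baby wipes"},
    {"unscented lotion", "soda", "chips"},
]

# Count how often each pair of products shows up in the same basket.
pair_counts = Counter()
for basket in baskets:
    for a, b in combinations(sorted(basket), 2):
        pair_counts[(a, b)] += 1

def recommend(product, top_n=3):
    """Suggest the products that most often co-occur with `product`."""
    scores = Counter()
    for (a, b), count in pair_counts.items():
        if a == product:
            scores[b] += count
        elif b == product:
            scores[a] += count
    return [item for item, _ in scores.most_common(top_n)]

# A shopper who buys unscented lotion gets shown pregnancy-adjacent items,
# purely because other baskets paired them, not because anything "knows" anything.
print(recommend("unscented lotion"))
```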
Ugh, I was agreeing with you, and you go pedant. Come on, you should know “figure out” doesn’t necessarily imply sentience. It can also be used synonymously with “determine.”
Sorry, I misunderstood your tone. Apologies for going all pedantic… it’s a character flaw.
I believe in the case of the pregnant woman, she was offered diapers and stuff based on the food she bought. So it’s not simply “you bought Diet Coke, maybe try diet chocolate?” In the case of Netflix, there’s no “a show only gay people watch” category, so her complaints are silly.