ooli@lemmy.world to science@lemmy.world · English · 2 days ago
Scientists Have Confirmed the Existence of a Third Form of Magnetism (www.popularmechanics.com)
Num10ck@lemmy.world · 2 days ago:
chatgpt can ELI5 summarize. Who knows if it's accurate enough.
Windex007@lemmy.world · 2 days ago:
What value is a summary when you fully acknowledge that you cannot trust it for accuracy?
Num10ck@lemmy.world · 2 days ago:
I agree, but what can you trust for accuracy in these times?
Windex007@lemmy.world · 2 days ago:
People who are experts in the subject. Propagandists thrive by trying to convince people that they can't trust anyone, because it makes foolish people believe that every voice carries equal merit.
7toed@midwest.social · 2 days ago:
"How many Rs are in orange?" "There are no Rs in orange." Yeah, I'll be using ChatGPT for education, what could go wrong.
snooggums@lemmy.world · 2 days ago (edited):
ChatGPT can bullshit about anything, but odds are anything complex will be wrong. Even simple things are probably wrong.
DarkThoughts@fedia.io · 2 days ago:
I would not even trust it to summarize nuanced details in a lengthy article, let alone something science-related (especially about new discoveries).