Drew@sopuli.xyz to LinkedinLunatics@sh.itjust.works · 7 months ago
You need to be in the ER to get the vibes right (sopuli.xyz)
33 comments
shalafi@lemmy.world · 7 months ago
Do people really do that?! I’ve just used it as a starting point for something totally unfamiliar, reworked it to suit, and made sure I understood everything it spit out. I cannot imagine ChatGPT spitting out working code.
HackerJoe@sh.itjust.works · 7 months ago
They do. The result is usually as expected: either full of security holes, or the recipe site is advertising cyanide ice cream.