With regard to research/data, you’re fact-checking the robot, right? They often say untrue things because they’re not capable of knowing whether or not something is true.
Valid concern, and yes, I'm fully aware of AI hallucinations; I fact-check everything. Thankfully most data points come with a citation link, so they're at least easy to verify.
When I create a GPT for a specific task or subject, I give it guidelines on where it can pull data from.