Hacker News

This isn't that surprising given how overworked and, I suspect, underappreciated doctors are.

A concern with something like this, though, is to what extent ChatGPT is just telling patients what they want to hear, as opposed to what they need to hear.



Responses to a patient would still need to be gated by a medical professional.

I recall a chat where someone, rather exasperated with a coworker, used ChatGPT to generate a friendly, business-professional version of "I don't have time for this right now."

GPT could be used in a similar manner: "Here is the information that needs to be sent to the patient. Generate an email describing the following course of treatment: 1. ... Stress that the prescription needs to be taken twice a day."

The result would likely read as more personal than the clinical (we even use that word as an adjective) response a doctor tends to give.
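The workflow could be sketched like this: the clinician supplies the facts, and the model is only asked to phrase them warmly. This is a minimal, hypothetical example; the function name and prompt template are my own, and the assembled prompt would then be handed to a chat API, with the clinician reviewing the draft before anything reaches the patient.

```python
def build_patient_email_prompt(treatment_steps, emphasis=None):
    """Assemble a draft-email prompt from clinician-supplied instructions.

    treatment_steps: list of plain-language treatment instructions.
    emphasis: an optional point the email should stress.
    """
    lines = [
        "Here is the information that needs to be sent to the patient.",
        "Generate a friendly email describing the following course of treatment:",
    ]
    # Number each clinician-provided step, as in the comment above.
    for i, step in enumerate(treatment_steps, start=1):
        lines.append(f"{i}. {step}")
    if emphasis:
        lines.append(f"Stress that {emphasis}.")
    return "\n".join(lines)


prompt = build_patient_email_prompt(
    ["Take the prescribed antibiotic with food",
     "Return for a follow-up visit in two weeks"],
    emphasis="the prescription needs to be taken twice a day",
)
print(prompt)
```

The point of keeping the template on the clinician's side is that the model never decides *what* to say, only *how* to say it.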



