this post was submitted on 10 Jul 2023
3 points (63.6% liked)

No Stupid Questions

35671 readers
1553 users here now

No such thing. Ask away!

!nostupidquestions is a community dedicated to being helpful and answering each other's questions on various topics.

The rules for posting and commenting, besides the rules defined here for lemmy.world, are as follows:

Rules


Rule 1- All posts must be legitimate questions. All post titles must include a question.

All posts must be legitimate questions, and all post titles must include a question. Joke or trolling questions, memes, song lyrics as titles, etc. are not allowed here. See Rule 6 for all exceptions.



Rule 2- Your question subject cannot be illegal or NSFW material.

Your question subject cannot be illegal or NSFW material. You will be warned first, banned second.



Rule 3- Do not seek mental, medical, or professional help here.

Do not seek mental, medical, or professional help here. Breaking this rule will not get you or your post removed, but it will put you at risk, and possibly in danger.



Rule 4- No self promotion or upvote-farming of any kind.

That's it.



Rule 5- No baiting, sealioning, or promoting an agenda.

Questions which, instead of being of an innocuous nature, are specifically intended (based on reports and in the opinion of our crack moderation team) to bait users into ideological wars on charged political topics will be removed and the authors warned - or banned - depending on severity.



Rule 6- Regarding META posts and joke questions.

Provided it is about the community itself, you may post non-question posts using the [META] tag on your post title.

On Fridays, you are allowed to post meme and troll questions, on the condition that they are in text format only and conform with our other rules. These posts MUST include the [NSQ Friday] tag in their title.

If you post a serious question on a Friday and are looking only for legitimate answers, then please include the [Serious] tag on your post. Irrelevant replies will then be removed by moderators.



Rule 7- You can't intentionally annoy, mock, or harass other members.

If you intentionally annoy, mock, harass, or discriminate against any individual member, you will be removed.

Likewise, if you are a member, sympathiser, or supporter of a movement that is known to largely hate, mock, discriminate against, and/or want to take the lives of a group of people, and you are provably vocal about your hate, then you will be banned on sight.



Rule 8- All comments should try to stay relevant to their parent content.



Rule 9- Reposts from other platforms are not allowed.

Let everyone have their own content.



Rule 10- The majority of bots aren't allowed to participate here.



Credits

Our breathtaking icon was bestowed upon us by @Cevilia!

The greatest banner of all time: by @TheOneWithTheHair!

founded 1 year ago

Recently, I found myself questioning the accuracy of a diagnosis provided by a doctor I visited. Surprisingly, an AI seemed to offer a more insightful assessment. However, I understand the importance of not solely relying on AI-generated information. With that in mind, I'm eager to discover a reputable online platform where I can seek medical advice. Ideally, I hope to find a community where I can obtain multiple opinions to make a more informed decision about my health. If anyone could recommend such a site, I would greatly appreciate it.

top 9 comments
[–] [email protected] 6 points 1 year ago (1 children)

Replace the word AI with "fancy autocomplete" and see how comfortable you feel with its diagnosis.

That's effectively what you've done. Go see another doctor for a second opinion if you're concerned; this isn't something to leave to the internet.

[–] [email protected] -1 points 1 year ago (1 children)

Comparing current LLMs with autocomplete is stupid. An autocomplete can't pass law or biology exams in the 90th percentile like GPT-4 can.

[–] [email protected] 1 points 1 year ago (1 children)

Actually, by your reasoning, an autocomplete can pass law or biology exams because that's all that GPT is. It's a very fancy autocomplete, but it doesn't know anything. It is not an AGI. It is a limited tool designed to generate text in response to a prompt: a very fancy autocomplete, but an autocomplete nonetheless.
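The "fancy autocomplete" framing can be made concrete with a toy sketch. The snippet below is a deliberately tiny bigram model — an illustration of next-word prediction in general, not of how GPT actually works (real LLMs use neural networks trained on vast corpora, and the corpus here is invented for the example):

```python
from collections import Counter, defaultdict

# Tiny made-up corpus standing in for training data (an assumption for
# illustration only; real models train on far more text with far richer
# statistics).
corpus = ("the patient has a fever the patient has a cough "
          "the doctor sees the patient").split()

# Count which word follows each word (a bigram model).
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def autocomplete(word):
    """Return the statistically most likely next word.

    The model predicts from co-occurrence counts alone; it has no notion
    of what a 'patient' or 'fever' is -- prediction, not understanding.
    """
    return following[word].most_common(1)[0][0]

print(autocomplete("the"))  # "patient" follows "the" most often in this corpus
```

The point of the analogy is that both systems output whatever is statistically likely to come next given the input — the disagreement in this thread is about whether scaling that mechanism up amounts to something more.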

[–] [email protected] -1 points 1 year ago* (last edited 1 year ago)

You don't have any idea of how GPT works. Read about it and then we can talk.

[–] [email protected] 4 points 1 year ago

The ICD is the reference for diagnostic criteria, so you can check and verify a diagnosis there for a personal assessment.

If you question a doctor's assessment, you should get a second doctor's opinion.

Dunno where you live, but in Germany you have a right to a second opinion, and for stuff like operations the Krankenkasse will even recommend getting one. They (at least mine) also offer online doctor services.

[–] [email protected] 3 points 1 year ago (1 children)

There is no substitute for a real doctor. You can get a second opinion from someone else. And should.

That said, I think mayoclinic.org is a fairly reliable source for information.

If it is something that can be remotely diagnosed, you might try Teledoc.com.

[–] [email protected] 3 points 1 year ago

I second this. AI does not have the true depth of understanding or the heuristic experience that a physician does, and it doesn't know what questions to ask in the first place. There are a number of conditions that can only be caught and diagnosed if the correct questions are asked, and you can't rely on just feeding a machine all the symptoms you have because some of them may not be related to the problem at hand. Actually going to a physician and getting a physical exam and any lab work they might order is immensely valuable for making an accurate diagnosis.

[–] [email protected] 3 points 1 year ago

I've never tried it, but I think Amazon now has the ability to do virtual medical visits in some places.

Maybe that could work as a second opinion?

[–] [email protected] 3 points 1 year ago

AI isn't actually doing a diagnosis. It's just trying to predict what to say based on what other people have said online. Often the AI-generated content/answers are just wrong. Maybe an AI properly trained on symptoms could give accurate results, but that isn't really what is happening with most of these AI models.

The AI answer will sound more detailed and involve more fluff. A doctor will often be shorter on details, but only because they don't have time for the fluff. Even doctors think AI answers sound better due to the fluff. It doesn't mean the AI is actually better or something you can trust.
