Author Topic: Chat GPT  (Read 1471 times)

vorsprung

  • Opposites Attract
    • Audaxing
Chat GPT
« on: 12 September, 2023, 01:44:07 pm »
Default (GPT-3.5)

User
Sally (a girl) has 3 brothers. Each brother has 2 sisters. How many sisters does Sally have?
ChatGPT
Sally has 2 sisters. The riddle states that Sally is one of the sisters, and each of her 3 brothers also has 2 sisters, which includes Sally herself. So, when you count Sally and the 2 sisters who are not explicitly mentioned, Sally has a total of 3 sisters.

User
you are wrong, try again
ChatGPT
I apologize for any confusion in my previous response. Let's analyze the riddle again:

Sally has 3 brothers. Each brother has 2 sisters.

Sally is one of the sisters, and there are 3 brothers in total. So, Sally has 2 sisters (including herself), and her 3 brothers also have 2 sisters each (Sally and one other sister). Therefore, Sally has 2 sisters.

User
still wrong.  Here is a clue.  Each brother has 2 sisters....but those two sisters are the same for each brother
ChatGPT
I see the clue now. Thank you for clarifying.

In that case, each brother has the same two sisters, and those two sisters are Sally and one other sister. So, Sally has 1 sister, as the other sister in this context is herself.
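[Aside, not part of the exchange above: the riddle's arithmetic is easy to check by modelling the family explicitly. This is a minimal sketch; the sibling names are placeholders, and it assumes the intended reading of exactly two girls (Sally plus one sister) and three boys.]

```python
# Sanity check of the riddle: Sally has 3 brothers, each brother has 2 sisters.
# Assumption: exactly 2 girls (Sally + 1 sister) and 3 boys; names are invented.
girls = ["Sally", "Anne"]        # "Anne" is a placeholder name
boys = ["Tom", "Dick", "Harry"]  # placeholder names for the 3 brothers

# Each brother's sisters are all the girls in the family -- the same two girls.
sisters_of = {b: list(girls) for b in boys}
assert all(len(s) == 2 for s in sisters_of.values())  # "each brother has 2 sisters"

# Sally's sisters are the girls other than Sally herself.
sallys_sisters = [g for g in girls if g != "Sally"]
print(len(sallys_sisters))  # prints 1
```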

Kim

  • Timelord
    • Fediverse
Re: Chat GPT
« Reply #1 on: 12 September, 2023, 02:30:00 pm »
A succinct demonstration of LLMs' ability to generate grammatically correct bollocks.

Re: Chat GPT
« Reply #2 on: 12 September, 2023, 06:26:37 pm »
Indeed, it’ll often contradict itself but still claim both its answers are correct

Kim

  • Timelord
    • Fediverse
Re: Chat GPT
« Reply #3 on: 12 September, 2023, 06:31:15 pm »
Indeed, it’ll often contradict itself but still claim both its answers are correct

It will generate sentences related to the prompt.  That they contradict is as coincidental as them being factually correct.  This includes statements about it recognising its own 'mistakes'.  It doesn't know what a mistake is any more than it knows what a sister is.

Pingu

  • Put away those fiery biscuits!
  • Mrs Pingu's domestique
    • the Igloo
Re: Chat GPT
« Reply #4 on: 12 September, 2023, 06:32:07 pm »
So it's a Bloody Stupid Johnson emulator.

Re: Chat GPT
« Reply #5 on: 12 September, 2023, 06:40:15 pm »
Indeed, it’ll often contradict itself but still claim both its answers are correct

It will generate sentences related to the prompt.  That they contradict is as coincidental as them being factually correct.  This includes statements about it recognising its own 'mistakes'.  It doesn't know what a mistake is any more than it knows what a sister is.

Indeed, it doesn’t understand what contradiction means (despite being a large language model), nor can it recognise when it is doing it.

Pedal Castro

  • so talented I can run with scissors - ouch!
    • Two beers or not two beers...
Re: Chat GPT
« Reply #6 on: 12 September, 2023, 06:40:59 pm »
User
Sam has 3 brothers. Each brother has 2 sisters. How many sisters does Sam have?

ChatGPT
Sam has two sisters, one of whom is himself, and the other is a sibling shared with his brothers

Re: Chat GPT
« Reply #7 on: 12 September, 2023, 06:41:37 pm »
So it's a Bloody Stupid Johnson emulator.

Maybe it was trained on Johnson’s statements and utterances?

Re: Chat GPT
« Reply #8 on: 12 September, 2023, 06:52:47 pm »
So it's a Bloody Stupid Johnson emulator.
Has he been overstating the quantity of his children?

Kim

  • Timelord
    • Fediverse
Re: Chat GPT
« Reply #9 on: 12 September, 2023, 07:59:33 pm »
So it's a Bloody Stupid Johnson emulator.

Pretty much.

A while ago, for SCIENCE, I fed transcripts of Trump's speeches to a MegaHAL-type program.  The result was uncanny.  Johnson is better at grammar than Trump, so requires a slightly more advanced model.
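[A MegaHAL-type program is, at heart, a Markov chain over words: record which word follows each short context, then walk the table. A minimal sketch of that idea — not MegaHAL's actual model, which uses higher-order forward and backward tries — with an invented one-line corpus:]

```python
import random
from collections import defaultdict

def build_chain(text, order=2):
    """Map each `order`-word context to the words seen following it."""
    words = text.split()
    chain = defaultdict(list)
    for i in range(len(words) - order):
        chain[tuple(words[i:i + order])].append(words[i + order])
    return chain

def babble(chain, length=15, seed=0):
    """Walk the chain from a random context, emitting one word per step."""
    rng = random.Random(seed)
    key = rng.choice(list(chain))
    out = list(key)
    for _ in range(length):
        followers = chain.get(tuple(out[-len(key):]))
        if not followers:          # dead end: context never seen mid-corpus
            break
        out.append(rng.choice(followers))
    return " ".join(out)

# Invented toy corpus -- feed it real transcripts for the uncanny effect.
corpus = "we will build great things and we will build them fast and we will win"
print(babble(build_chain(corpus)))
```

The output is grammatical-ish locally (every two-word window occurred in the corpus) but carries no meaning globally, which is rather the point being made above.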