I thought I could stump ChatGPT by asking.......But...

Discussion in 'Science' started by ryobi, Aug 12, 2023.

  1. fmw

    fmw Well-Known Member

    Joined:
    Aug 21, 2009
    Messages:
    38,358
    Likes Received:
    14,781
    Trophy Points:
    113
    I didn't say technology can't do things. I said that computers aren't intelligent and never will be despite the utility of the software driving them. The key term surrounding intelligence is sentience. It doesn't exist in machines and will never exist in machines no matter how amazing the things are that they can do. It is a matter of science. I'm not against the technology. I am only against calling it intelligence and for good reason.
     
  2. Patricio Da Silva

    Patricio Da Silva Well-Known Member Donor

    Joined:
    Apr 26, 2020
    Messages:
    32,008
    Likes Received:
    17,317
    Trophy Points:
    113
    Gender:
    Male
    You have to fact-check ChatGPT on the 10%. But once fact-checked, it's a great tool.
     
    WillReadmore and Josh77 like this.
  3. Patricio Da Silva

    Patricio Da Silva Well-Known Member Donor

    Joined:
    Apr 26, 2020
    Messages:
    32,008
    Likes Received:
    17,317
    Trophy Points:
    113
    Gender:
    Male
    I think AI should be renamed SI, for simulated intelligence. You are correct: a machine is a dead thing and will never have intellect. Actually, 'intellect' is the wrong word; the right word is 'consciousness' or 'self-awareness'.
     
  4. bringiton

    bringiton Well-Known Member

    Joined:
    Mar 11, 2016
    Messages:
    11,866
    Likes Received:
    3,117
    Trophy Points:
    113
    I think you are deceiving yourself. ChatGPT is not intelligent, and was not designed to be intelligent; it was designed to look intelligent, and it does that very well. But there is no reason a computer system could not be designed to actually be intelligent.
    Never is a long time. The self-driving software is getting close to sentience.
    It is a matter of definitions. What are sentience, consciousness, and intelligence? Why can't they be instantiated in software?
     
    LiveUninhibited likes this.
  5. WillReadmore

    WillReadmore Well-Known Member

    Joined:
    Nov 21, 2013
    Messages:
    59,935
    Likes Received:
    16,455
    Trophy Points:
    113
    It's really important to identify what sources the AI is using and to otherwise verify what is produced. Some AIs do identify what sources they are using.

    A podcaster recently asked an AI about the current activities of a Russian dissident reporter.

    The AI said she was dead (which was very much incorrect, as she was highly active).

    On inspection, the AI was using a single source for the answer: TASS.

    I think this is one of the serious issues with AI joining our information sources.

    If I were a political party desperate to mislead people, I might create an AI that seems perfectly fine, except that it develops its information only from sites in political agreement with me.
     
  6. WillReadmore

    WillReadmore Well-Known Member

    Joined:
    Nov 21, 2013
    Messages:
    59,935
    Likes Received:
    16,455
    Trophy Points:
    113
    Well, the point with AI is that it is programmed to learn. What is learned isn't from the programmer. AIs are doing an increasingly good job of learning.

    Over time, it's going to be harder and harder to tell if what you are getting is from a sentient being or an AI. And, that includes emotion and all.
     
    bringiton likes this.
  7. LiveUninhibited

    LiveUninhibited Well-Known Member

    Joined:
    Sep 26, 2008
    Messages:
    9,684
    Likes Received:
    2,991
    Trophy Points:
    113
    Philosophically, the only intelligence and emotion we know from first-hand experience is our own. We can assume it's the same for other humans because they are very similar to ourselves, with the same parts and functions. If we encounter an alien or artificial intelligence that claims to have emotions, it would be hard to be sure one way or the other without a detailed understanding of them. And claiming a detailed understanding of future technology or other unknowns is illogical. We can assess today's AI and conclude it is a machine without real emotions. But we cannot be certain of the next step, that machines that feel are impossible. We are basically machines, albeit incredibly complex biological ones. As far as we can tell, it's all just chemistry either way, so I see no logical reason non-biological machines could never have emotions.
     
    Last edited: Aug 29, 2023
  8. robini123

    robini123 Well-Known Member

    Joined:
    May 8, 2004
    Messages:
    13,701
    Likes Received:
    1,583
    Trophy Points:
    113
    Gender:
    Male
    I asked ChatGPT whether ideological homogeneity is compatible with democracy. The answer it gave was actually good. It pointed out the obvious conflict between democracy and ideologically homogeneous societies. It said the closest example would be Iceland, a relatively homogeneous society with a functioning democracy, or more specifically a parliamentary republic.
     
  9. WillReadmore

    WillReadmore Well-Known Member

    Joined:
    Nov 21, 2013
    Messages:
    59,935
    Likes Received:
    16,455
    Trophy Points:
    113
    Our own biological emotions are an evolutionary development implemented by our meat computers.

    We did better, evolutionarily, with emotions than we would have without them.

    I really don't know why an artificial brain couldn't do the same.
     
    LiveUninhibited likes this.
  10. WillReadmore

    WillReadmore Well-Known Member

    Joined:
    Nov 21, 2013
    Messages:
    59,935
    Likes Received:
    16,455
    Trophy Points:
    113
    Yes, a parliamentary republic.

    I like that direction, as it precludes national leadership by someone who has no evident experience of success of ANY kind in government.

    It also provides a method of correction in leadership that doesn't require waiting years for an election or the catastrophe of impeachment.
     
    LiveUninhibited and robini123 like this.
  11. dairyair

    dairyair Well-Known Member

    Joined:
    Dec 20, 2010
    Messages:
    78,957
    Likes Received:
    19,952
    Trophy Points:
    113
    Gender:
    Male
    So it's informative and educational.
     
  12. dairyair

    dairyair Well-Known Member

    Joined:
    Dec 20, 2010
    Messages:
    78,957
    Likes Received:
    19,952
    Trophy Points:
    113
    Gender:
    Male
    Why is sentience required for intelligence?
     
  13. gfm7175

    gfm7175 Well-Known Member

    Joined:
    Oct 2, 2018
    Messages:
    9,503
    Likes Received:
    4,833
    Trophy Points:
    113
    Gender:
    Male
    The exact opposite of that, yes.
     
  14. dairyair

    dairyair Well-Known Member

    Joined:
    Dec 20, 2010
    Messages:
    78,957
    Likes Received:
    19,952
    Trophy Points:
    113
    Gender:
    Male
    That's republican education then.
     
  15. bringiton

    bringiton Well-Known Member

    Joined:
    Mar 11, 2016
    Messages:
    11,866
    Likes Received:
    3,117
    Trophy Points:
    113
    That describes ChatGPT and some other systems. It does not describe all possible AI systems. Self-driving systems are not designed to simulate intelligence but to actually figure things out in the real world. They are not conscious yet, but it is still early days.
    I see no reason a machine could not be conscious and self-aware. You are using the word "never" without seeming to know what it means. Think about how much AI has advanced in the last 30 years. What will it be like in another 30 years? How about 300 years? 3,000 years? Think about how much aviation advanced between Kitty Hawk in 1903 and the space program of 1983. Only a fool would claim they can predict the limits of technology 100 years in the future, let alone 1,000 or 10,000 years.
     
  16. fmw

    fmw Well-Known Member

    Joined:
    Aug 21, 2009
    Messages:
    38,358
    Likes Received:
    14,781
    Trophy Points:
    113
    Because self-awareness is a basic property of thinking intelligently.
     
  17. fmw

    fmw Well-Known Member

    Joined:
    Aug 21, 2009
    Messages:
    38,358
    Likes Received:
    14,781
    Trophy Points:
    113
    Machines are not alive. They cannot think. They can process.
     
  18. bringiton

    bringiton Well-Known Member

    Joined:
    Mar 11, 2016
    Messages:
    11,866
    Likes Received:
    3,117
    Trophy Points:
    113
    I see no evidence that machines are inherently incapable of self-awareness, nor do I see any evidence that self-awareness is necessary to intelligence. Intelligence is the ability to understand. To understand a phenomenon is to have an accurate internal model or representation of it that is simpler than the phenomenon itself. An internal model or representation is accurate if it can reliably predict the phenomenon in question. None of that involves self-awareness, and none of it seems to be inherently impossible for machines. Just because ChatGPT was designed to simulate intelligence rather than implement it doesn't mean systems can't be designed to implement it.
     
    LiveUninhibited likes this.
  19. bringiton

    bringiton Well-Known Member

    Joined:
    Mar 11, 2016
    Messages:
    11,866
    Likes Received:
    3,117
    Trophy Points:
    113
    Please provide a definition of "alive" that includes all possible natural life forms but not computer viruses.
    What is thinking but processing information?
     
  20. fmw

    fmw Well-Known Member

    Joined:
    Aug 21, 2009
    Messages:
    38,358
    Likes Received:
    14,781
    Trophy Points:
    113
    It is programmed to expand its database. The clever part is the software's ability to put the right things together from its database searches. That is not learning.

    I didn't suggest that AI is not clever. It is. But it is not intelligence. The programmer has the intelligence.
     
  21. LiveUninhibited

    LiveUninhibited Well-Known Member

    Joined:
    Sep 26, 2008
    Messages:
    9,684
    Likes Received:
    2,991
    Trophy Points:
    113
    With current technology, I'd agree. That just doesn't mean it's inevitably forever true. Today's machines are still simple compared to human minds, but that doesn't mean the same materials, with much more advanced technology, couldn't fully replicate what happens in the human mind. If not, why not? Aren't we just biological machines that operate according to chemistry? It seems you'd have to rely on faith to claim otherwise.
     
    Last edited: Aug 30, 2023
  22. Patricio Da Silva

    Patricio Da Silva Well-Known Member Donor

    Joined:
    Apr 26, 2020
    Messages:
    32,008
    Likes Received:
    17,317
    Trophy Points:
    113
    Gender:
    Male
    I get your logic; it's basically the logic behind Moore's law, that if you keep walking toward a goalpost you will eventually reach it. But consciousness, self-awareness, has a spiritual basis, and that basis does not exist in time and space, which is to say there is no linear relationship between life/organic matter and non-life/inorganic matter. One cannot reach the other because they do not exist in the same dimension.

    No machine can ever become 'conscious'. Failure to understand this is a failure to understand the essence of what consciousness is. If you can't understand this, take LSD and it will hit you like a ton of bricks that life has a spiritual basis no machine can ever reach.

    There is no linear relationship between simulated consciousness and 'life'. Life is self-awareness, though it doesn't necessarily require it. Animals are alive but have very little self-awareness, if any at all. A machine will never be 'alive' in the sense of life, let alone be 'conscious'; therefore, Moore's law is inapplicable.

    Now, can machines interact with organic matter? Cyborgs are possible. But it will be the organic side of the equation that harbors consciousness, not the machine part.

    I know this mainly because of personal out-of-body experiences. Should that ever happen to you, then you will know that the essence of your beingness is something separate from the physical body, that the physical body is a 'vessel'. Inorganic matter will never attract a spiritual being to inhabit it; there must be sufficient organics present for that to happen.

    Machines can, indeed, become very sophisticated in their powers of simulation, and that will fool many, but it will always be 'simulation'.
     
    Last edited: Aug 30, 2023
  23. WillReadmore

    WillReadmore Well-Known Member

    Joined:
    Nov 21, 2013
    Messages:
    59,935
    Likes Received:
    16,455
    Trophy Points:
    113
    AIs figure out strategy that humans have not figured out. The programmer's contribution was to create an AI that can learn new strategy on its own.

    For example, AIs can beat the very best humans at the game of Go. And there is not anywhere near enough memory to store all possible games or take some other brute-force approach; the very nature of the game precludes that. Furthermore, when humans watch a game played by an AI, the best humans can't figure out what the AI's strategy is. AIs have figured out strategies that the best human experts don't understand.

    The AI created strategy by playing itself many millions of times and figuring out how to win.

    This learning ability is one of the scary parts of AI.
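
    To make the "playing itself" idea concrete, here is a minimal sketch of a self-play learning loop, using tic-tac-toe and a simple value table rather than the deep networks and tree search a real Go engine uses. Every name in it is illustrative, and it is nothing like AlphaGo's actual architecture:

```python
# A toy illustration of learning by self-play: tabular value learning on
# tic-tac-toe.  Purely illustrative -- Go programs like AlphaGo use deep
# neural networks plus tree search, not a lookup table like this.
import random
from collections import defaultdict

LINES = [(0, 1, 2), (3, 4, 5), (6, 7, 8), (0, 3, 6),
         (1, 4, 7), (2, 5, 8), (0, 4, 8), (2, 4, 6)]

def winner(board):
    """Return 'X' or 'O' if someone has three in a row, else None."""
    for a, b, c in LINES:
        if board[a] != ' ' and board[a] == board[b] == board[c]:
            return board[a]
    return None

def legal_moves(board):
    return [i for i, cell in enumerate(board) if cell == ' ']

Q = defaultdict(float)      # (board string, move index) -> learned value
ALPHA, EPSILON = 0.3, 0.1   # learning rate, exploration rate

def choose(board):
    """Mostly pick the best-known move, occasionally explore a random one."""
    options = legal_moves(board)
    if random.random() < EPSILON:
        return random.choice(options)
    return max(options, key=lambda m: Q[(board, m)])

def train(games=50_000):
    for _ in range(games):
        board, player = ' ' * 9, 'X'
        history = {'X': [], 'O': []}
        while True:
            move = choose(board)
            history[player].append((board, move))
            board = board[:move] + player + board[move + 1:]
            win = winner(board)
            if win or not legal_moves(board):
                # Credit every move the winner made, punish the loser's;
                # a draw nudges both players' values toward zero.
                if win:
                    outcomes = [(win, 1.0), ('O' if win == 'X' else 'X', -1.0)]
                else:
                    outcomes = [('X', 0.0), ('O', 0.0)]
                for p, reward in outcomes:
                    for state, m in history[p]:
                        Q[(state, m)] += ALPHA * (reward - Q[(state, m)])
                break
            player = 'O' if player == 'X' else 'X'

train()
print("state-action values learned purely from self-play:", len(Q))
```

    The only things the programmer supplies are the rules of the game and the update rule; the move preferences the table ends up with come entirely from the games the program plays against itself.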
     
    bringiton likes this.
  24. Patricio Da Silva

    Patricio Da Silva Well-Known Member Donor

    Joined:
    Apr 26, 2020
    Messages:
    32,008
    Likes Received:
    17,317
    Trophy Points:
    113
    Gender:
    Male
    No, we are not, actually. Unfortunately, no scientific instrument in available technology can prove it.

    I elaborate on this more in depth, here:

    http://politicalforum.com/index.php...t-by-asking-but.612728/page-2#post-1074404600
     
  25. WillReadmore

    WillReadmore Well-Known Member

    Joined:
    Nov 21, 2013
    Messages:
    59,935
    Likes Received:
    16,455
    Trophy Points:
    113
    We need to recognize that our brain is a machine. That machine calculates emotions, knows how to make guesses on incomplete information, has great powers of recognition, knows that it is ephemeral, etc., etc.

    The complexity of the brain IS extreme, of course. So duplicating a brain is WAY beyond today's hardware and technology. But surely that is a technology question that will gradually be approached in various ways over time. Will the technology include bio matter? Who knows? Bio matter is incredible, so maybe so.

    Postulating that brains are getting help from some other dimension in order to function strikes me as similar to postulating that "god did it" when something happens that we don't understand.

    I agree with your last sentence: a progression that will be harder and harder to falsify.

    As for "life", a key aspect of that is capability of procreation. That is way more than the brain and I don't believe it's the issue here.

    As for drugs, our brains are made of biochemistry. It shouldn't be surprising that chemicals can cause various weird results. Frankly, I'm more surprised that the brain can fix itself after such assaults.
     
    Last edited: Aug 30, 2023
