the-podcast guy recently linked this essay. It’s old, but I don’t think it’s significantly wrong (despite the GPT evangelists). Also read Weizenbaum, libs, for the other side of the coin.

  • Tomorrow_Farewell [any, they/them]@hexbear.net

    If I asked you what the 300th word of the poem is, you cannot do it. A computer can

    I’m sorry, but this is a silly argument. Somebody might very well be able to tell you what the 300th word of a poem is, while a computer that stored that poem as a .bmp file wouldn’t be able to (without tools beyond the basic ones that let it display .bmp images). In different contexts we remember different things about stuff.

    • plinky [he/him]@hexbear.netOP

      Generally you can’t, though. Of course there are people who remember in different ways, or who can recite pi to untold digits. That doesn’t mean there are tiny byte-like engravings in their brains, or that they could remember it perfectly some time from now. A computer can tell you what the 300th pixel of an image is, while you don’t even have pixels, or a fixed shape to your visual memory. Maybe it’s a wide shot of nature, or maybe it’s a reflection of the light in the eyes of your loved one.

      • Frank [he/him, he/him]@hexbear.net

        People don’t think that brains are silicon chips running code through logic gates. At least, the vast majority of people don’t.

        The point we’re making here is that both computers and human minds follow a process to arrive at a given conclusion, whether that’s counting to 300 or determining where the 300th pixel of an image is. A computer doesn’t do that magically; there’s a program that runs that counts to 300. A human would have to dig out a magnifying glass and count to three hundred. The details are different, but both are counting to 300.
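
        To make that concrete, here’s a minimal sketch of the word-counting version, assuming the poem is stored as plain text (the filename is hypothetical):

        ```python
        # Walk the words in order and return the n-th one (1-indexed).
        # "poem.txt" is a hypothetical filename; assumes the poem is plain text.
        def nth_word(text: str, n: int) -> str:
            words = text.split()
            if n > len(words):
                raise ValueError("the poem has fewer than n words")
            return words[n - 1]

        with open("poem.txt") as f:
            print(nth_word(f.read(), 300))
        ```

        The details differ from a human with a magnifying glass, but the structure is the same: step through the items one by one until you reach 300.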

        • plinky [he/him]@hexbear.netOP

          Because that’s a task for a computer. My second example: give you two words (and ask what follows them), and it would be slower for a computer than arriving at the 300th word, while for you it would be significantly faster than counting.

          Fundamentally, the question is: is the brain a Turing machine? I rather think not, but it could be simulated as one with some untold complexity.

          • bumpusoot [none/use name]@hexbear.net

            Firstly, I want to say it’s cool you’re positively engaging and stimulating a lot of conversation around this.

            As far as Turing machines go, it’s only a concept that’s meant to show a fundamental “level” of computing (“Turing completeness”): what a computing device can or cannot achieve. Since you agree a Turing machine could ‘simulate’ a brain (and we know brains can simulate a Turing machine; we invented them!), then conceptually, yes, the brain is computationally equivalent. It is ‘Turing complete’, albeit with some randomness thrown in.

            • Frank [he/him, he/him]@hexbear.net

              some randomness thrown in.

              I remain extremely mad at the Quantum jerks for demonstrating that the universe is almost certainly not deterministic. I refuse to be cool about it.

            • plinky [he/him]@hexbear.netOP

              We can simulate a water molecule; does that make it a Turing machine, then? What about a single protein? A whole cell? A thousand cells in some invertebrate?

              Simulation doesn’t work backwards; it doesn’t imply an equivalence of Turing completeness in both directions. If the brain is a Turing machine, we can map its whole function one-to-one onto any existing Turing machine, not merely simulate it to some degree of accuracy.

          • Tomorrow_Farewell [any, they/them]@hexbear.net

            Because that’s a task for a computer. My second example: give you two words (and ask what follows them), and it would be slower for a computer than arriving at the 300th word, while for you it would be significantly faster than counting

            If your thesis is that human brains do not work in exactly the same way as computers, and not that the analogy with computers in general is wrong, then sure, but nobody disagrees with that thesis. I don’t think that any adult alive has proposed that a human brain is just a conventional binary computer.

            However, this argument fails against the thesis that the analogy with computers in general is wrong. I’m not sure how it is even supposed to address it.

            Fundamentally, the question is: is the brain a Turing machine? I rather think not

            Well, firstly, a Turing machine is an idea, not an actual device or a type of device.
            Secondly, if your thesis is that a Turing machine does not model how brains work, then what’s your argument here?

            • plinky [he/him]@hexbear.netOP

              Of course I can’t prove that the brain is not a Turing machine; I would be world famous if I could. Computers are Turing machines, yes? They cannot do non-Turing-machine operations (decisions, or whatever that’s called).

              What does comparing the computer with the brain give to science? I’m asking for the third time in this thread. What insight does it provide, aside from mechanizing us to the world? That short-term memory exists? A stone-age child could tell you that. That information goes from the eyes as bits, like a camera? That’s already significantly wrong. That you recall like a photograph read out from your computer? Also very likely wrong.

              • Tomorrow_Farewell [any, they/them]@hexbear.net

                Of course I can’t prove that the brain is not a Turing machine; I would be world famous if I could

                Okay, so, what is your basis for thinking that, if a brain were given some set of rules such as ‘if you are given the symbol “A”, think of number 1 and go to the next symbol’ and ‘if you are given the symbol “B” and are thinking of number 1, think of number 2 and go back by two symbols’, along with some sequence of symbols, that brain wouldn’t be capable of working with those rules?
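
                Those two rules are mechanical enough to write down directly. Here is a toy sketch in Python; the tape contents and the halting behaviour are my own assumptions for illustration:

                ```python
                # A toy machine following exactly the two rules quoted above.
                # The tape and the halting condition are assumptions for illustration.
                def run(tape: list[str], max_steps: int = 50) -> int:
                    state = 0  # the number the machine is "thinking of"
                    pos = 0    # index of the current symbol
                    for _ in range(max_steps):
                        if pos < 0 or pos >= len(tape):
                            break  # ran off the tape: halt
                        if tape[pos] == "A":
                            state, pos = 1, pos + 1  # think of 1, go to the next symbol
                        elif tape[pos] == "B" and state == 1:
                            state, pos = 2, pos - 2  # think of 2, go back two symbols
                        else:
                            break  # no rule applies: halt
                    return state

                print(run(["A", "B", "A"]))  # prints 2
                ```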

                Computers are Turing machines, yes?

                As in, they are modelled by Turing machines sufficiently well in some sense? Sure.

                They cannot do non-Turing-machine operations (decisions, or whatever that’s called)

                What? What are ‘non-Turing-machine operations’? The term ‘Turing machine’ refers to generalisations of finite automata. In this context, what they are doing is receiving input and reacting to it depending on their current state. I can provide some examples of finite automata implementations in Python code, if you want me to.
                The word ‘decision’ doesn’t carry any meaning in this context.
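
                For example, here is a minimal finite automaton in Python; the alphabet and the accepting condition are made up for illustration. It reads a string of ‘0’s and ‘1’s and tracks whether it has seen an even number of ‘1’s:

                ```python
                # Two states ("even", "odd"), tracking the parity of 1s seen so far.
                # The alphabet and the accepting state are made up for illustration.
                def accepts(string: str) -> bool:
                    state = "even"
                    for ch in string:
                        if ch == "1":
                            state = "odd" if state == "even" else "even"
                        # on "0" the state is left unchanged
                    return state == "even"

                print(accepts("1011"))  # False: three 1s
                print(accepts("1001"))  # True: two 1s
                ```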

                What does comparing the computer with the brain give to science? I’m asking for the third time in this thread

                I don’t recall you asking this question before, and I do not have an answer. I also don’t see the question as relevant to the exchange so far.

                That information goes from the eyes as bits, like a camera? That’s already significantly wrong

                A bit is a unit of information. If we treat the signal that the eyes send to the brain as carrying any sort of information, you can’t argue that the brain doesn’t receive the information in bits. If you claim otherwise, you don’t understand what information is and/or what bits are.
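
                To make the unit concrete, a quick sketch (the probabilities are invented for illustration): the information content of an event with probability p is log2(1/p) bits, regardless of what physical signal carries it.

                ```python
                import math

                # Information content, in bits, of observing an event of probability p.
                # The example probabilities are invented for illustration.
                def bits(p: float) -> float:
                    return math.log2(1 / p)

                print(bits(0.5))   # 1.0 bit: a fair coin flip
                print(bits(0.25))  # 2.0 bits: one of four equally likely outcomes
                ```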

                That you recall like a photograph read out from your computer? Also very likely wrong

                Nobody is claiming, however, that your brain pulls up an analogue of a .bmp when you recall an image. You likely remember some details of an image, and ‘subconsciously’ reconstruct the ‘gaps’. Computers can handle such tasks just fine, as well.

              • Frank [he/him, he/him]@hexbear.net

                That information goes from the eyes as bits, like a camera?

                Information goes into the optic nerve as electrical signals, which is why we can glue wires to the optic nerve and use a camera to send visual information to the brain. I think we’ve been able to do that for twenty years. We just need a computer to change the bits from the camera into the correct electric impulses.

                • plinky [he/him]@hexbear.netOP

                  It’s not by chance that I didn’t use the words ‘optic nerve’: you can find out that it’s retina excitation, not the nerve itself, because the eye already does preprocessing. But that reminded me that an informational understanding actually helped here: they couldn’t understand how the nerve could send that much data (turns out it doesn’t). So one win for information theory, by showing something couldn’t be 🥰

      • Tomorrow_Farewell [any, they/them]@hexbear.net

        Again, though, this simply works to reinforce the computer analogy, considering stuff like file formats. You also have to concede that a conventional computer that stores the poem as a .bmp file isn’t going to tell you what the 300th word in it is (again, without tools like text recognition), just like a human is generally not going to be able to. (In the sort of timespan that you have in mind, that is; it’s perfectly possible, and probable, that a person who has memorised the poem could tell you what the 300th word is, it would just take a bit of time.)

        Again, we can also remember different things about different objects, just like conventional computers can store files of different formats.
        A software engineer might see something like ‘O(x)’ and immediately think ‘oh, this is likely a fast algorithm’, remembering the connection between the time complexity of algorithms and big-O notation. Meanwhile, what immediately comes to mind for me is ‘what filter base are we talking about?’, as I am going to remember that classes of finally relatively bounded functions differ between filter bases. Or, a better example: we can have two people who have both played, say, Starcraft. One of them might tell you that some building costs such-and-such amount of resources, while the other one won’t be able to tell you that, but will be able to tell you that they can usually afford it by such-and-such point in time.

        Also, if you are going to point out that a computer can’t tell if a particular image is of a ‘wide shot of nature’ or of a ‘reflection of the light in the eyes of one’s loved one’, you will have to contend with the fact that image recognition software exists and it can, in fact, be trained to tell such things in a lot of cases, while many people are going to have issues with telling you relevant information. In particular, a person with severe face blindness might not be able to tell you what person a particular image is supposed to depict.

        • plinky [he/him]@hexbear.netOP

          I’m talking about visual memory, what you see when you recall it, not about image recognition. Computers could recognize faces 30 years ago.

          I’m suggesting that it’s not linked lists, or images, or sounds, or bytes in some way, but rather closer to persistent hallucinations of self-referential neural networks upon specified input (whether cognitive or otherwise), which also mutate in place, both by themselves and by recall, yet not completely wildly. That’s a picture rather far away from memory as an engraving on a stone tablet/leather/magnetic tape/optical storage/logic gates in RAM. ‘Memory is like a growing tree or an old house’ is not exactly the most helpful metaphor, but it’s probably closer to what memory does than a linked list.

          • Tomorrow_Farewell [any, they/them]@hexbear.net

            I’m talking about visual memory, what you see when you recall it, not about image recognition

            What is ‘visual memory’, then?
            Also, on what grounds are you going to claim that a computer can’t have ‘visual memory’?
            And why is image recognition suddenly irrelevant here?

            So far, this seems rather arbitrary.
            Also, people usually do not keep a memory of an image of a poem if they are memorising it, as far as I can tell, so this pivot to ‘visual memory’ seems irrelevant to what you were saying previously.

            I’m suggesting that it’s not linked lists, or images, or sounds, or bytes in some way, but rather closer to persistent hallucinations of self-referential neural networks upon specified input

            So, what’s the difference?

            which also mutate in place, both by themselves and by recall, yet not completely wildly

            And? I can just as well point out the fact that hard drives and SSDs do suffer from memory corruption with time, and there is also the fact that a computer can be designed in a way that its memory gets changed every time it is accessed. Now what?
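
            A toy sketch of such a design, entirely invented for illustration (real read-disturb effects happen at the hardware level):

            ```python
            import random

            # A toy memory cell whose stored value drifts a little on every read.
            # Entirely invented for illustration.
            class DriftingCell:
                def __init__(self, value: float):
                    self._value = value

                def read(self) -> float:
                    self._value += random.gauss(0, 0.01)  # each access perturbs the value
                    return self._value

            cell = DriftingCell(1.0)
            print(cell.read())  # close to 1.0, but never exactly the same twice
            print(cell.read())
            ```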

            ‘Memory is like a growing tree or an old house’ is not exactly the most helpful metaphor, but it’s probably closer to what memory does than a linked list

            Things that are literally called ‘biological computers’ exist. While not all of them feature the ability to ‘grow’ memory, it should be pretty clear that computers have this capability.

            • plinky [he/him]@hexbear.netOP

              What is visual memory, then, in the informational analogy? Do tell me. Does it have a consistent or persistent size, shape, or anything resembling a .bmp file?

              The difference is that neural networks are bolted onto structures, not information.

              • Tomorrow_Farewell [any, they/them]@hexbear.net

                What is visual memory, then, in the informational analogy? Do tell me.

                It’s not considered as some special type of memory in this context. Unless you have a case for the opposite, this stuff is irrelevant.

                Does it have a consistent or persistent size, shape, or anything resembling a .bmp file?

                That depends on the particular analogy.
                In any case, this question seems irrelevant and rather silly. Is the force of a gravitational pull in models of Newtonian physics constant? Does it have a shape? Is it a real number, or a vector in R^2, or a vector in R^3, or a vector in R^4, or some other sort of tensor? Obviously, that depends on the relevant context regarding those models.

                Also, in what sense would a memory have a ‘shape’ in any relevant analogy?

                The difference is that neural networks are bolted onto structures, not information

                Obviously, this sentence makes no sense if it is considered literally. So, you have to explain what you mean by that.