• 0 Posts
  • 17 Comments
Joined 17 days ago
Cake day: July 27th, 2025

  • Intelligence and consciousness are not related in the way you seem to think.

    We’ve always known that you can have consciousness without a high level of intelligence (think of children, people with certain types of brain damage), and now for the first time, LLMs show us that you can have intelligence without consciousness.

    It’s naive to think that as we continue to develop intelligent machines, one of them will suddenly become conscious once it reaches a particular level of intelligence. Did you suddenly become conscious once you hit the age of 14 or whatever, having finally developed a deep enough understanding of trigonometry or a solid enough grasp of the works of Mark Twain? No, of course not: you became conscious at a very early age, when even a basic computer program could outsmart you, and you developed intelligence quite independently.


  • I’m going to repeat myself as your last paragraph seems to indicate you missed it: I’m *not* of the view that LLMs are capable of AGI, and I think it’s clear to every objective observer with an interest that no LLM has yet reached AGI. All I said is that like cats and rabbits and lizards and birds, LLMs do exhibit some degree of intelligence.

    I have been enjoying talking with you, as it’s actually quite refreshing to discuss this with someone who doesn’t confuse consciousness and intelligence, as they are clearly not related. One of the things that LLMs do give us, for the first time, is a system which has intelligence - it has some kind of model of the universe, however primitive, to which it can apply logical rules, yet clearly it has zero consciousness.

    You are making some big assumptions though - in particular, when you said an AGI would “have a subjective sense of self” as soon as it can “move, learn, predict, and update”. That’s a huge leap, and it feels a bit to me like you are close to making that schoolboy error of mixing up intelligence and consciousness.



  • I think current LLMs are already intelligent. I’d also say cats, mice, fish, birds are intelligent - to varying degrees of course.

    I’d like to see examples of LLMs paired with sensorimotor systems, if you know of any.

    If you’re referring to my comment about hobbyist projects, I was just thinking of the sorts of things you’ll find on a search of sites like YouTube, perhaps this one is a good example (but I haven’t watched it as I’m avoiding YouTube). I don’t know if anyone has tried to incorporate a “learning to walk” type of stage into LLM training, but my point is that it would be perfectly possible, if there were reason to think it would give the LLM an edge.

    The matter of how intelligent humans are is another question, and it’s relevant because AFAIK when people talk about AGI now, they’re talking about an AI that can do better on average than a typical human at any arbitrary task. That’s not a particularly high bar; we’re not talking about super-intelligence, I don’t think.


  • Thanks for this very yummy response. I’m having to read up about the technicalities you’re touching on, so bear with me!

    According to Wikipedia, the neocortex is only present in mammals, but as I’m sure you’re aware, mammals are not the only creatures to exhibit intelligence. Are you arguing that only mammals are capable of “general intelligence”? I can get on board with what you’re saying as *one way* to develop AGI - work out how brains do it and then copy that - but I don’t think it’s a given that that is the *only* way to AGI, even if we were to agree that only animals with a neocortex can have “general intelligence”. Hence, the fact that a given class of machine architecture does not replicate a neocortex would not, in my mind, make that architecture incapable of ever achieving AGI.

    As for your point about the importance of sensorimotor integration, I don’t see that being problematic for any kind of modern computer software - we can easily hook up any number of sensors to a computer, and likewise we can hook the computer up to electric motors, servos and so on. We could easily “install” an LLM inside a robot and allow it to control the robot’s movement based on the sensor data (a rough sketch of what I mean follows at the end of this comment). Hobbyists have done this already, many times, and it would not be hard to add a sensorimotor stage to an LLM’s training.

    I do like what you’re saying and find it interesting and thought-provoking. It’s just that what you’ve said hasn’t convinced me that LLMs are incapable of ever achieving AGI for those reasons. I’m not of the view that LLMs *are* capable of AGI, though; it’s more that I don’t personally feel well enough informed to have a firm view. It does seem unlikely to me that we’ve currently reached the limits of what LLMs are capable of, but who knows.
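
    A very rough, purely illustrative sketch of the kind of loop I mean is below - the sensor, motor and LLM helpers are placeholder stubs I’ve made up for the example, not any particular robot’s or model’s API:

    ```python
    # Illustrative sketch only: an LLM "installed" in a robot, reading sensors
    # and issuing motor commands. All helpers here are placeholder stubs.

    import json
    import time


    def read_sensors():
        """Placeholder: return whatever the robot's sensors currently report."""
        return {"distance_cm": 42.0, "heading_deg": 90.0, "bump": False}


    def set_motor_speeds(left, right):
        """Placeholder: send speed commands (-1.0 to 1.0) to the drive motors."""
        print(f"motors -> left={left:.2f} right={right:.2f}")


    def query_llm(prompt):
        """Placeholder for a call to whichever LLM you have access to.
        Expected to reply with JSON like {"left": 0.5, "right": 0.5}."""
        return '{"left": 0.3, "right": 0.3}'


    def control_loop(steps=10, period_s=0.5):
        for _ in range(steps):
            readings = read_sensors()
            prompt = (
                "You control a two-wheeled robot. Sensor readings: "
                + json.dumps(readings)
                + '. Reply with JSON {"left": <-1..1>, "right": <-1..1>} only.'
            )
            try:
                command = json.loads(query_llm(prompt))
                set_motor_speeds(float(command["left"]), float(command["right"]))
            except (ValueError, KeyError):
                set_motor_speeds(0.0, 0.0)  # stop if the reply wasn't parseable
            time.sleep(period_s)


    if __name__ == "__main__":
        control_loop()
    ```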



  • That’s good, thanks for that info. It does say it’s a “goodwill” gesture from the various train companies, but it seems to be a firm offer - it would be good to have it enshrined in law nevertheless.

    From what I read, you need to get a Delay/Cancellation Confirmation somehow. You would then need to book a seat on a later train - on some routes, most trains are full by the time of travel, and in the event of a delay like this, you’re probably not the only one needing to reschedule. You may well need to get a hotel overnight and travel the next day. All possible, but potentially stressful.

    I will still avoid long-distance trains with connections - but that said, none of the several international train journeys I’ve made so far have been delayed by more than 15 minutes.

    Missing a connecting flight is no less stressful - and I also always avoid those as much as possible, although because long-distance flights are shorter than the equivalent train journeys, the knock-on effects of missing a connection tend to be smaller (you’re more likely to find a seat on a replacement flight the same day, for instance).







  • Archive link - https://archive.ph/6rgTS

    Definitely good news to hear. Over the last couple of years, I’ve made several long journeys by train that normally would have been by plane.

    One rule of thumb I’ve been using is to only take direct trains. I find travel a bit stressful in any case, and I don’t want to be worrying about a delay to the first train causing me to miss the second. If the two trains are run by the same operator, you would at least be able to reschedule onto a later train - as long as there is one with free space, which isn’t guaranteed at zero notice. If the trains are run by different operators, then it’s presumably down to whatever agreements exist between them. So I skip all that uncertainty by only doing it when there’s a single direct train available. Hopefully the EU will come up with a good solution for this as well.