AI Project - Suggestions plz
  • Couldn't find it in the charity shop.
  • What does that mean?

    Enforcement is not in the scope of the app. The brief was only to match people.
  • The brief was to determine a lifelong partner. Always read the docs.
    "Plus he wore shorts like a total cunt" - Bob
  • b0r1s
    What does that mean?
    Enforcement is not in the scope of the app. The brief was only to match people.

    Sounds like you need the scrum master to have a stand up huddle about that.
  • The brief was to determine a lifelong partner. Always read the docs.

    Who they should be. Always read the docs.
  • True, but that renders it rather pointless.
    "Plus he wore shorts like a total cunt" - Bob
  • No it doesn't!

    It'll be a boon to the human race.
  • Unlikely wrote:
    Dunno. I went to a biology vs AI lecture and there are common features. Yes, they're different (the brain doesn't use convolution and all that), but there are striking similarities in the unknowns. When you chain things together it gets complicated real quick, and that's why the brain and AI are so hard to understand.
    I mean that reads like you're saying "we don't really understand either so ergo plato they're probably very similar".
    Well yes, it does, but we don't understand them for the same reasons: multiple node/neuron connections are insanely complicated and the complexity increases exponentially. The similarities are such that they're now using lab-grown (human) neurons to build AI models. Pop posted a link about it somewhere. In brain cells there are dendrites that receive the electric signal (the weights), a cell body thing that processes it (the node value) and an axon that sends the signal on to more brain cells. Now I'm not saying they're exactly the same, but you can use these electrical aspects of brain cells to make a dumb ML model (there's a rough sketch of that node-vs-neuron mapping just after this post).

    The mathematical complexity of these exponential connections is broadly similar, but the brain has something like 70 trillion connections, and it doesn't need backpropagation or differentiation because it just works out of the box. It's been trained, grown and refined for a few hundred million years. The important bit of real and artificial is still the complexity of electricity moving through a complicated network. It explains why the brain is the most powerful computer in the world, and it is a computer. Its power efficiency is rather staggering compared to AI networks, but again that's just evolution.

    AI obvs doesn't have sentience, but the next Turing Test should be whether you can prove it doesn't have sentience. If it answers exactly like it's sentient, what's the difference? Bit of a ramble but there you go.

    I do kind of get where you're coming from, but it's awfully reductive to boil the brain down to its electrical properties. Yes, you could use neurons that way, but you're never going to get close to the way the brain actually functions.
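
    Since the dendrites/cell body/axon mapping above is doing a lot of work, here's a minimal sketch of the "dumb ML model" version of a single node, in plain Python. The function name, weights and numbers are made up purely for illustration, not taken from anything in the thread: weighted inputs stand in for dendrites, a summation for the cell body, and an activation value passed on in place of the axon.

    import math

    def artificial_neuron(inputs, weights, bias):
        # Dendrite analogue: each incoming signal is scaled by a weight.
        # Cell-body analogue: the scaled signals are summed into one value.
        z = sum(x * w for x, w in zip(inputs, weights)) + bias
        # Axon analogue: a squashed activation is passed on to the next node.
        return 1.0 / (1.0 + math.exp(-z))

    # Hand-picked weights, no training: unlike the brain, a real network of
    # these would still need backpropagation to set the weights.
    print(artificial_neuron([0.5, 0.2, 0.9], [1.5, -2.0, 0.7], bias=0.1))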
  • Skerret
    Unlikely wrote:
    I do kind of get where you're coming from, but it's awfully reductive to boil the brain down to its electrical properties. Yes, you could use neurons that way, but you're never going to get close to the way the brain actually functions.
    We'd definitely get close to the way some brains function. Maybe brains of people on this very forum.
  • Related: I was listening to Joscha Bach talk about brains vs computers and human vs machine intelligence the other day. Here's a bit of the transcript -
    ... I guess that a better metaphor for what the brain is doing is not circuitry, but it's something like an ether through which activation waves are propagating. And the medium of the propagation are neurons and adjacent cell types that are taking the signal and reaching it forward to other cells while modulating it.

    And all the computations are taking place in these activation fronts, and these activation fronts, for that thing to work, need to be periodic. So there are basically cyclical waves that are passing through the neural substrate and producing behaviour, and this spreading of these activation fronts is so slow that it is roughly at the speed of sound, so phonons are not a bad metaphor.

    And very often the neurons are not deterministic, which means that given the same environmental configuration the neuron is not going to go into a single particular state but one out of multiple states, because they're not completely deterministic. And this means that if you want to guarantee getting a particular kind of result from this you need a bunch of neurons, so that statistically most of them are going to get into that state. But what the others are going to do is that they sample the space of functions that is adjacent to the result that you want. And this gives you sometimes more power, because instead of having to train your neurons to perform one function only, you can constrain them to compute a bunch of functions simultaneously and vote on the outcome of the results.

    It's a slightly different paradigm in thinking about how this computation works in the brain compared to our digital computers, which I think is responsible for the fact that our brains are so efficient despite being so abysmally slow and unreliable. If you look at graphics cards, they have so much larger memory than our brains and are so much faster, so why are they so much less efficient than our brain is?

    Despite this, he doesn't think we would need new neuromorphic hardware to achieve brain-like computation; this kind of stuff can be emulated on our current deterministic architectures. You can also get to human-like intelligence using different methods and current hardware, and the most interesting and impressive results so far from current models are at least a partial demonstration of this. It's more about understanding and replicating the functions than the substrate (there's a toy sketch of the noisy-voting idea at the end of this post).

    Creating a being with very human-like creativity, imagination, fallibility etc. will probably require a kind of architecture that works like the above description, or at least some approximation of it. Thoughts materialising out of an ether seems like an apt metaphor.

    Also worth remembering that human level intelligence isn't going to be the pinnacle, and our squishy monkey brains won't necessarily be the optimal solution.
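
    To make the "bunch of noisy neurons voting" idea a bit more concrete, here's a toy sketch in Python. Everything in it (the names, the threshold, the noise level, the population size) is made up for illustration rather than taken from Bach: each "neuron" gives a noisy answer to the same question, and reliability comes from a population of them voting rather than from any single unit being deterministic. Seeding the random generator also shows the emulation point above, since the non-determinism runs perfectly happily on ordinary deterministic hardware.

    import random

    def noisy_unit(x, threshold=0.5, noise=0.15, rng=random):
        # One non-deterministic "neuron": same input, not always the same answer.
        return 1 if x + rng.gauss(0.0, noise) > threshold else 0

    def voting_population(x, n_units=101, seed=42):
        # Ask many noisy units the same question and take the majority vote.
        # The seeded PRNG makes the whole run reproducible on deterministic hardware.
        rng = random.Random(seed)
        votes = sum(noisy_unit(x, rng=rng) for _ in range(n_units))
        return 1 if votes > n_units // 2 else 0

    # Individual units disagree near the threshold, but the population is stable.
    print([noisy_unit(0.55) for _ in range(5)])   # varies from run to run
    print(voting_population(0.55))                # majority answer, reproducible

    The population answer isn't guaranteed to be right, it's just far more stable than any single unit, which is roughly the trade being described.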
