Hacker News

I think what he means is that the brain is multi-threaded with optimistic locking and hardcoded timing constants.

Try running old MS-DOS games inside an emulator. Many of them will act quite funny when you turn up the clock-speed.
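A toy illustration of why that happens (my own sketch, not code from any real game): many DOS-era games paced themselves with delay loops calibrated in CPU iterations rather than wall-clock time, so running the same loop on a faster (emulated) CPU speeds up the gameplay itself.

```python
def frame_duration(iters_per_frame, cpu_speed_hz):
    # The game busy-waits a fixed number of instructions per frame,
    # so the real-time length of a frame depends on the CPU clock.
    return iters_per_frame / cpu_speed_hz

# Tuned for the original IBM PC's 4.77 MHz clock: ~21 ms per frame.
original = frame_duration(100_000, 4_770_000)

# Emulate a CPU 10x faster and the same loop finishes 10x sooner,
# so the whole game runs 10x too fast.
emulated = frame_duration(100_000, 47_700_000)
```

The iteration count is illustrative; the point is that the timing constant is hardcoded against a specific clock, exactly like the "hardcoded timing constants" in the parent comment.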



That's not exactly turning up the speed; that's emulating a faster CPU.

Try taking a Game Boy emulator and putting it on fast forward. Works perfectly. If you have full emulation of a system, you can go arbitrarily fast.
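The reason fast forward works: an emulator advances the guest by a fixed number of cycles per frame and only *optionally* sleeps to pin that to real time. A minimal sketch (function names are mine, not from any particular emulator):

```python
import time

def run(emulate_frame, num_frames, fps=60, fast_forward=False):
    """Drive an emulator core. emulate_frame() advances the guest
    by exactly one frame's worth of cycles, regardless of host speed."""
    frame_budget = 1.0 / fps
    for _ in range(num_frames):
        start = time.perf_counter()
        emulate_frame()
        if not fast_forward:
            # Throttle: sleep off whatever real time is left in the frame.
            elapsed = time.perf_counter() - start
            time.sleep(max(0.0, frame_budget - elapsed))
```

The guest's notion of time is its own cycle count, so skipping the sleep changes nothing it can observe; it just runs as fast as the host allows.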


Yes, it was a flawed analogy (aren't they all...).

However, these DOS games usually fail in sped-up emulation because they make assumptions about external inputs such as the real-time clock (RTC).

I think OP's point was that we can't reasonably speed up the "RTC" in a brain emulation if we want it to interact with the real world, because that would break all sorts of hardwired assumptions.

For a simple example, if you ran your brain at 4x speed, it would perceive everything in super-slow-motion. At that speed it would already have difficulty understanding you when you speak to it (at the very least it would have to be a very patient brain).

At higher speeds pretty much all cognitive functions would probably break down - unless you feed it recorded inputs that have been accelerated to match the brain-speed.
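Feeding in accelerated recorded inputs amounts to rescaling timestamps. A trivial sketch (my own toy model of a "recording" as timestamped events):

```python
def accelerate(events, speedup):
    """Rescale recorded (timestamp_s, payload) events so a brain
    running at `speedup`x receives them at its own subjective rate."""
    return [(t / speedup, payload) for t, payload in events]

recording = [(0.0, "hello"), (0.5, "world")]
# At 4x, half a second of real speech arrives in an eighth of a second,
# which is normal-speed speech from the sped-up brain's point of view.
faster = accelerate(recording, 4)
```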


That is an issue, but there is a huge difference between having to slow the brain back down in certain situations and being unable to speed it up at all.

I don't expect to play an entire game on fast forward, after all.

Edit, on the new line about breakdown: I just assumed that whatever input was given would be sped up too. That part of the project seems far less complicated than the brain simulation itself.


My knowledge of brains is limited, but I'd think the issue remains the same even if you cut off all external inputs.

Basics like memory decay are also tied to the system clock. So if you run your brain at 1000x speed then it would probably simply forget everything almost immediately.
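To make that concrete with a toy model (my own construction, not neuroscience): if decay processes are wired to the brain's own ticks, then speeding the whole emulation up shrinks the real-world lifetime of every memory by the same factor.

```python
def wall_clock_lifetime(subjective_lifetime_s, speedup):
    # Decay runs on the brain's own ticks, so a memory that lasts
    # subjective_lifetime_s of brain time survives only this much
    # real time when the whole emulation is sped up.
    return subjective_lifetime_s / speedup

# A memory good for a subjective day survives ~86 seconds of real
# time at 1000x -- "forgets everything almost immediately" as far
# as the real-time world is concerned.
day = 24 * 3600
lifetime = wall_clock_lifetime(day, 1000)
```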

And if you make a "simple" patch that prevents it from ever forgetting anything, it would be overwhelmed, because it is only wired to deal with a certain number of memories at a time.

In terms of the DOS game analogy: we may be able to patch a game that originally ran in 256 KB of RAM to run in 2 GB and actually fill that up (because we disabled the garbage collector). But the game probably uses algorithms that break down when faced with such a large dataset.

At this point we're down to having to actually understand the game (or brain) in detail, in order to make the changes required for running at higher capacity.


Actually having a higher capacity will be tricky, yes. But at least there won't be cell decay in the scientists working 4000-hour weeks to figure it out.


I agree, but you could always simulate the brain's environment as well, and then speed up the environment along with the brain. Bridging the gap might be annoying for the brains, having to wait through communication at the glacial pace of the squishy, rubbish real-world humans, but I'm sure they'd get over it.


It is also a very good point that even with modern hardware many orders of magnitude faster than the emulated hardware, most emulators still have to resort to timing hacks to make everything run smoothly. And because of the timing inconsistencies introduced by locking, parallelism is of limited use even when emulating multiple hardware components that originally ran in parallel. To emulate a human brain, we're probably either going to need far more fine-grained locking across multiple cores than is currently even imagined, or we're going to have to emulate the whole massively parallel thing on a single thread, on a CPU much, much more powerful than a brain.
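The single-thread option is roughly how cycle-accurate emulators already sidestep the locking problem: interleave every component on one host thread in a fixed order, so inter-component timing is exact by construction. A sketch (the component API here is hypothetical):

```python
class Component:
    """Anything with per-cycle state: a CPU, PPU, APU... or a neuron."""
    def __init__(self, name):
        self.name = name
        self.cycles = 0

    def step(self):
        self.cycles += 1  # advance exactly one emulated cycle

def run_lockstep(components, total_cycles):
    # One host thread steps every component once per emulated cycle.
    # Ordering is deterministic, so there are no locks and no timing
    # hacks -- the cost is that nothing runs in parallel on the host.
    for _ in range(total_cycles):
        for c in components:
            c.step()

parts = [Component("cpu"), Component("ppu"), Component("apu")]
run_lockstep(parts, 1000)
# every component has advanced exactly 1000 cycles, in perfect sync
```

The trade-off is exactly the one described above: perfect timing for the price of serializing a massively parallel system, which is why the host needs to be so much faster than the thing it emulates.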



