Random thought about computers

edited August 2007 in General
I don't know what class teaches stuff like this, but did you ever notice the similarities between the human brain and computers? The parts are all there, the mechanism appears to be similar, and it's dynamic. The biggest and most glaring difference is that computers are machines whereas the brain is organic matter. Brains and computers both "process" things... and what's a process? A connection of electrons, pieces of information, ideas. So the secret to improving computers would be to further study the brain and see which advantages it has. Therefore, I don't think artificial intelligence will be a software program as we envision it now, but an integrated set of processors, with fluid (water?) communication among them, keeping in mind heat/size (thermal) restrictions. Anyone following me?

Comments

  • edited July 2007
    I often think about computers too! I think about the limitations of binary code. All those '1's and '0's, all those 'on' and 'off' processes. My friends always laugh at me, and sometimes strangers too, when I bring up the idea of an analog computer. A computer where the system is not based on '1's and '0's.

    See, my reasoning behind this thought is that no matter how many '1's and '0's you put in a sequence, there will always be one more digit's worth of refining you can do. For example,

    let's pretend that '1' is black and '0' is white. So this means that 11 would be all black and 00 would be all white. Further, 01 would be grey, as would 10. Ok, easy. Now we need shades of grey: 110, 11111110, 000000000000101, blah blah blah, you get the point, you can seemingly create an unlimited spectrum of grey. However, no matter how many '1's and '0's you put into a sequence, you could always refine the shade further by adding more digits. So the argument is that you cannot be fully accurate (at least in terms of shades of grey). I am sure this has many other uses too, for instance calculating numbers with repeating decimal places.


    My proposal is to develop a computer that can fill in the gaps between the '1's and '0's, in order to create a machine that can determine the results to questions that a conventional computer could not: a highly accurate machine that can hypothetically calculate between numbers.

    I am no computer scientist, I am merely a geographer/historian, but my roommate is a computer scientist. He laughs at me hard when I explain this to him. So, laugh away, I am used to it. Just tell me why this is a dumb idea or why it will never work.
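The shades-of-grey argument above can be put into a few lines of Python (my own illustration, not anything from the thread): with n bits you can name 2**n distinct shades, and each extra bit halves the gap between neighbouring shades without ever closing it.

```python
# With n bits you get 2**n distinct "shades of grey" between white (0.0)
# and black (1.0). Adding one more bit always doubles the resolution,
# but the spacing between representable shades never reaches zero.
def grey_levels(n_bits):
    """Return the shades representable with n_bits, as values in [0, 1]."""
    steps = 2 ** n_bits
    return [i / (steps - 1) for i in range(steps)]

for n in (1, 2, 3, 8):
    levels = grey_levels(n)
    gap = levels[1] - levels[0]
    print(f"{n} bits: {2 ** n} shades, smallest step {gap:.6f}")
```

This is exactly the "one more digit of refining" point: the gap shrinks with every added bit, but a finite sequence of bits can never land on every shade in between.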
  • edited July 2007
    Actually, that is a very impressive theory, Konrad... I read once that someone was trying to develop the next version of binary code, which would not only have an "on" and "off" position, but a "half-way" position which would be determined randomly. Would this allow the computer to be creative? I don't know, but it definitely could add a whole new dimension to how information is processed.
  • edited July 2007
    The reason for the 1's and 0's is that it lets you determine the amount of space (bits) a given amount of information takes up. True, it doesn't allow for an unlimited amount of colours, but remember that if you wanted unlimited amounts of colours, you would need an unlimited amount of space to hold that information, and it would be impossible to encode/decode on the fly.

    With colour (using the same example), the reason for the finite amount of colours a computer can display is that after a certain number of colours (32-bit "true colour") the human eye cannot distinguish the colours in between, so there's no point in making the detail any higher. What I'm saying is that typically we don't need computers to be 100% accurate, because (1) it would take a lot of processing power, and (2) you end up with a lot of extraneous information. This just makes me think of a calculator. You're only given up to a certain number of digits after the decimal. On some calculators, you can go back pretty far, but eventually, for the purposes of calculations, the 100th digit after the decimal (and everything thereafter) becomes irrelevant and negligible. Why waste time with it?

    As for having an analog computer, in terms of efficiency, I think it wouldn't be practical at all, although I don't think I understand what you mean by a completely analog computer.
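The finite-precision point can be made concrete with a quick Python sketch (my own illustration): a fixed bit budget buys a fixed number of colours, and binary floating point keeps only as many digits of a repeating fraction as fit, just like the calculator display.

```python
# 8 bits per red/green/blue channel -- the 24 bits usually meant by
# "true colour" (so-called "32-bit" modes add an alpha channel on top).
colours = 2 ** 24
print(colours)  # 16777216 distinct colours

# Repeating (binary) fractions get cut off, like digits past the end of a
# calculator display: 0.1 and 0.2 have no exact binary representation,
# so their sum is not exactly 0.3.
print(0.1 + 0.2)         # 0.30000000000000004
print(0.1 + 0.2 == 0.3)  # False
```

Sixteen million-odd colours is already past what the eye can distinguish, which is the trade-off the comment describes: bounded storage and fast encoding in exchange for precision nobody would use.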
  • edited July 2007
    An analog computer would be able to handle a continuum of voltages instead of only discrete voltages. I'm not sure how useful they would be, but for reasons of at least theoretical curiosity they've been explored somewhat.
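The discrete-versus-continuous distinction can be sketched with a toy quantizer in Python (a made-up illustration, assuming voltages normalized to the range [0, 1]): a digital system snaps each voltage to the nearest of a finite set of levels, while an analog system would carry the exact value through.

```python
# Snap a continuous "voltage" to the nearest of a finite set of levels,
# the way a digital system must; an analog computer would keep the
# original value exactly.
def quantize(voltage, levels):
    """Round a voltage in [0, 1] to the nearest of `levels` discrete steps."""
    step = 1 / (levels - 1)
    return round(voltage / step) * step

analog_value = 0.6180339887  # an arbitrary point on the continuum
for levels in (2, 4, 16, 256):
    q = quantize(analog_value, levels)
    print(f"{levels:>4} levels -> {q:.6f} (error {abs(q - analog_value):.6f})")
```

More levels shrink the quantization error but never eliminate it, which is the same "one more digit" limitation raised earlier in the thread.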
  • edited August 2007
    As a computing science student I have argued this topic many times, and once again I will quickly give my side of the argument. A computer will never be able to be the same as a human brain. A computer could replicate the actions of a human by being programmed to react in a certain way for every type of situation it is in. However, because it is limited to being a series of switches, a computer has no intentionality: when a computer does a certain action, it does it because it is programmed to react that way, unlike a human, who intends for the results of that action to happen.