A couple of (very basic) questions about computer code

Kyriakos

I have some very basic questions about the binary code (sequences of 0s and 1s) used in computers. If anyone can help, it would be great for me :)

-Is binary code the only code that computers are currently based on? (ie is there any other code used, now or in the past, and if so which of the different codes is used the most?)

-Why was binary code invented and used in computers?

It goes without saying that i am looking for the most general answers (which are still true) to those two questions which are possible. I (possibly) will read more by myself on this. I just would be (potentially) interested in setting the periphery of a short story on a computer and its code, next to its human user.

(I only read a very introductory article, on wiki, about the subject: http://en.wikipedia.org/wiki/Binary_code, since i am not sure if i will end up using it at all in the end).

The wiki binary code article linked above said:
Binary numbers were first described in Chandashutram written by Pingala around 300 B.C. Binary code was first introduced by the German mathematician and philosopher Gottfried Leibniz during the 17th century.[citation needed] Leibniz was trying to find a system that converts logic's verbal statements into a pure mathematical one. After his ideas were ignored, he came across a classic Chinese text called I Ching or Book of Changes, which used a type of binary code. The book had confirmed his theory that life could be simplified or reduced down to a series of straightforward propositions. He created a system consisting of rows of zeros and ones. During this time period, Leibniz had not yet found a use for this system.

Another mathematician and philosopher by the name of George Boole published a paper in 1847 called 'The Mathematical Analysis of Logic' that describes an algebraic system of logic, now known as Boolean algebra. Boole's system was based on binary, a yes-no, on-off approach that consisted of the three most basic operations: AND, OR, and NOT.[1] This system was not put into use until a graduate student from the Massachusetts Institute of Technology by the name of Claude Shannon noticed that the Boolean algebra he had learned was similar to an electric circuit. Shannon wrote his thesis in 1937, which implemented his findings. Shannon's thesis became a starting point for the use of binary code in practical applications such as computers, electric circuits, and more.[2]

A bit of an edit:

Also, from a purely fictional perspective, would there be any reason to assign one of the two values (0, 1) to the computer, and the other to the human? I repeat that the perspective of this last question is purely about a fictional work. In essence this question is about whether you know of any reason to favor likening the computer or the human to 0 or 1 :) (and obviously i mean that in the case that the binary keeps its current order, ie in
x AND y = 1 if x = y = 1,
x AND y = 0 in any other case).
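As a small illustration (a Python sketch, not anything from the thread), the AND rule described above covers exactly four input combinations:

```python
# Truth table for binary AND: the output is 1 only when both inputs are 1.
for x in (0, 1):
    for y in (0, 1):
        print(f"{x} AND {y} = {x & y}")
```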
 
At the most fundamental level (i.e. machine code), all computers use binary code. The reason is mainly that it makes constructing hardware to store information a lot easier: a transistor is either conductive or not conductive, there either is or isn't voltage, a bit on a hard disk is either magnetized or not magnetized, a bit on a CD is either indented or not indented, and so on.
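To make the point concrete (a minimal Python sketch, assuming standard ASCII encoding): any stored symbol ultimately reduces to a fixed-width pattern of on/off states.

```python
# Each character is stored as a fixed-width pattern of bits; here the
# letters of "Hi" become the eight on/off states a byte of memory would hold.
for ch in "Hi":
    print(ch, "->", format(ord(ch), "08b"))
```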

That makes storing and retrieving a simple yes/no operation. If you were using a ternary system, for instance, you would have to distinguish three possible states. This would entail measuring the degree of magnetization of a hard disk, for instance, which requires more precise and therefore expensive measuring devices and is much more prone to error. You would have more expensive and yet less precise hardware.
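For comparison, here is a small sketch (illustrative only) of the same number written in binary versus ternary digits; a ternary machine would need hardware that reliably distinguishes all three digit values at every storage cell:

```python
def to_base(n, base):
    """Return the digits of non-negative n in the given base, most significant first."""
    digits = []
    while n:
        digits.append(n % base)
        n //= base
    return digits[::-1] or [0]

print(to_base(10, 2))  # binary digits of 10: [1, 0, 1, 0]
print(to_base(10, 3))  # ternary digits of 10: [1, 0, 1]
```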

Computers based on other systems may have been built for experimental purposes, but were never feasible for practical use.

It might interest you that hypothetical quantum computers make use of quantum physics to store the superposition of many possible states instead of just a 0 or 1 in one bit.
 
Great info, thanks a lot... (particularly interesting was the bit about the ternary system - ie 0, 1, 2 and variations).

One final question, in this case, about your last sentence:

You mentioned "hypothetical quantum computers": is there any reason at all to think that they would be, relative to their cost, problems of use etc., only very marginally better than the simple binary computers we already have?

I ask because the story, in a way, is about someone perplexed by adding just one more function to a system, which then seemingly never expands at all, no matter if he adds 10, 100, 1000 more, and so on. The story is about the idea that endless decimals may at some point be roughly rounded up to a number without any decimals, but this in turn will not stay true for every calculation based on that number (anyway the idea is mostly philosophical and not set in math) :)

Later edit:

Basically, the story is about the (philosophical) idea that numbers which cannot be used practically in an easily defined system are used with utter ease in an infinitely more complicated number system - ie one which has as its basis a number with infinite digits (or something related to that) before its first loop that can realistically be noticed.

In the story, something negative happens, but inversely the results are not cataclysmic, due to the infinitesimal fame of the destructive discovery itself.

The story is, in its core, about the limits between what is conscious and what is not.
 
You might want to look at gates.

Every single computer program can essentially be written using a sequence of logic gates. The logic gates take a binary input and produce a binary output. It's basically what the circuits on your motherboard are built out of.

So I guess to answer one of your questions, computers are built to use binary code, because the hardware is built out of billions of logic gates, and the software eventually gets translated to a bunch of logic gate operations.
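The gate idea above can be sketched in a few lines of Python (an illustration of the principle, not how hardware is actually wired): every Boolean operation can be derived from NAND alone, which is why simple physical switches suffice to build arbitrary circuits.

```python
# NAND is "functionally complete": NOT, AND and OR can all be built from it,
# mirroring how hardware composes complex circuits out of one simple gate.
def nand(a, b):
    return 0 if (a and b) else 1

def not_(a):
    return nand(a, a)

def and_(a, b):
    return not_(nand(a, b))

def or_(a, b):
    return nand(not_(a), not_(b))

for a in (0, 1):
    for b in (0, 1):
        print(f"a={a} b={b}: AND={and_(a, b)} OR={or_(a, b)}")
```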

The story is about the idea that endless decimals may at some point be roughly rounded up to a number without any decimals

Read up on floating point accuracy issues here.

I don't really fully understand what you're saying, but that might help you.
 
Read up on floating point accuracy issues here.

Hm...from that link you posted:

wikifloatingpoint said:
While floating-point addition and multiplication are both commutative (a + b = b + a and a×b = b×a), they are not necessarily associative. That is, (a + b) + c is not necessarily equal to a + (b + c).
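The quoted non-associativity is easy to demonstrate (a minimal Python sketch, assuming IEEE 754 double precision):

```python
# Floating-point addition is not associative: each addition rounds its
# result, so the grouping of operands can change the answer.
a, b, c = 1e16, -1e16, 1.0
print((a + b) + c)  # 1.0: a and b cancel exactly, then c is added
print(a + (b + c))  # 0.0: c is lost when rounded into the huge value b
```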

This is quite close to the idea, although not the same. Still very interesting. I do not really wish to approach the story idea from a perspective which would allow for any direct link to the mathematical approach of the concept, though (i tried in the past, and have now decided to just leave it be and focus on the literary progression i have been building since i was 17, and not on math, which at times interested me heavily, but i am not a mathematician).

Basically, the idea in the story is again that a+(b+c) is not the same in a special, but 'progressed' case (or rather a lot of cases) as (a+b)+c is in those, but it is based in a more specific ('first'/'non-progressed') case:

0+(0+0) not being the same as any other sequence of this kind.

In general, the story would have been (essentially) about consciousness being "born" of nothing, ie from a computer which was a machine and not conscious. In the story the person made a discovery which seemed to allow that, and in the end had to terminate the machine just as it was starting to become visibly conscious...

(but this kind of story is not really my usual stuff. Up to now i am far more interested in examining the emotion of horror, and forms of thought in relation to it. Also complexity, as in allegories of other systems, maybe even themselves allegories. This new story would border on being something math-related and i do not really feel like going that route.) Thanks for the info though :)
 
I would stay away from the math and explaining how exactly the computer becomes conscious. Otherwise those with a math or computer science background will likely find it "silly". I'd make the reason for consciousness vague, and stay away from the binary/floating point accuracy stuff.
 
I would stay away from the math and explaining how exactly the computer becomes conscious. Otherwise those with a math or computer science background will likely find it "silly". I'd make the reason for consciousness vague, and stay away from the binary/floating point accuracy stuff.

Yes, if one leaves enough variables open, enough being related to the type of the problem, then virtually all cases in the identity presented in clear form would still work (in theory) ;)

But, sadly, i am sort of 'easy to focus' (not obsess, but it may lead to that) with such themes, so i would rather not be focused at all with something leading to "pure math", since that would create loads of problems for me.

So if you want to, feel free to pick up the story idea :D
 
At the most fundamental level (i.e. machine code), all computers are using binary code. The reason is mainly because it makes constructing hardware to store information a lot easier: a transistor is either conductive or not conductive, there either is or isn't voltage, a bit on a hard disk is either magnetized or not magnetized, a bit on a CD is either indented or not indented and so on.

May I add that it was actually attempted - by Babbage et al. - to develop computers based on decimal digits. That didn't work out so well.
 
There are also analogue computers. The Royal Navy used to use them for making gunnery calculations, I believe.

But that digresses a lot, I'd say.
 
You mentioned "hypothetical quantum computers", is there any reason at all to think that they would be, relative to their cost/problems of using etc, only in a very minor way better than the simple binary computers we already have?

They are not that "hypothetical" anymore. Very small quantum computers have been shown to work.
And yes, they have been shown to work better than a simple binary computer.
The question "how much better" is a hot topic in research and not much is known. Quite a lot of space for imagination. ;)
 
Didn't know that, but you're the physics expert. Did they really already construct a Turing-complete quantum computer? Or just single logic gates?

Kyriakos, I had hoped you'd gone off researching some stuff on your own because I'm not sure I can explain the differences in a both correct and understandable way ;) But the difference between electronic and quantum computers is more than just the scale of speed or something like that. The ability to store a set of superimposed states in one bit of data opens possibilities to realize algorithms that are completely impossible right now. I can give it a try later on.
 
Thanks :) But if you do, it should not be at all on my account, given that (as i posted) i decided not to pursue this story idea.

As for algorithms, they are one of the (many, obviously) parts of math which i know virtually nothing about. The existence of Euler's number (e) was one of the reasons i did not even want to examine what they are about. e^(iπ) + 1 = 0 is the last thing i want to deal with after abandoning my mathematical research these days :D
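For what it's worth, the identity mentioned can be checked numerically in a couple of lines; the tiny non-zero residue is only the floating-point rounding of π:

```python
import cmath

# Numerical check of Euler's identity: e^(i*pi) + 1 should be 0.
value = cmath.exp(1j * cmath.pi) + 1
print(abs(value))  # not exactly 0, only because pi is stored approximately
```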

For what it is worth, that small research which is now dead was about "i" viewed as some sort of calculation of not only four separate identities (i, -1, -i, 1) but a few more, up to 11 probably. Its aim was to define a relation between pi and phi (φ = 1.61 and countless decimals) through linking both with some sort of expanded notion of "i".

Like i said, though, this study is ended now and i am back to my literary development, which is what i know far better and have been expanding for half my life now (since i was 17). :)

(As for the actual study, i have to suppose it had great faults, given that i never got an answer from some people i sent part of it to. At any rate i won't pursue it anymore, and by now do not care even if it had some value. FWIW, finally, it only employed geometry, some algebra, mostly infinite series and the borders they form, and probability theory. The notion of negative roots, which appeared in algebra around 1539 as i read, and the development in 1908 of the idea of "adiabatic accessibility" by Carathéodory, would play some role too - the first for obvious reasons, since this was all about "i" in the first place, the second due to the special kind of hyperbola that idea could be presented by, along with its meaning of a progression which cannot be reversed. The epicenter was the development of the idea of "i". In essence, i guess, it was the same as trying to present an ultimately generalised way to account for virtually all types of infinite series, something which Gauss - who presented them in that form in the first place - famously failed to do. I failed too, i guess; probably i failed better in this math game than Gauss would at constructing an interesting short story though :D )
 
Didn't know that, but you're the physics expert. Did they really already construct a Turing-complete quantum computer? Or just single logic gates?

A quantum computer is not a Turing machine, but as far as I know they have all the logic gates they need. Quantum algorithms like Shor's for prime factorization have been demonstrated to work in experiments with <10 qubits.
 
Huh, that's impressive, I didn't know research has progressed this far. But I assume that's still under experimental conditions?

Also, could you elaborate why a quantum computer is not a Turing machine? I was under the impression that it could be used to simulate one.
 
Huh, that's impressive, I didn't know research has progressed this far. But I assume that's still under experimental conditions?

Yes, you need some sophisticated and expensive technology. Laser cooling and ion traps.

Also, could you elaborate why a quantum computer is not a Turing machine? I was under the impression that it could be used to simulate one.

It can be used as a Turing machine like any normal computer. But if you want a proof that your quantum computer works, you must show that it is more powerful than a Turing machine.
 
Okay, right.
 
It can be used as a Turing machine like any normal computer. But if you want a proof that your quantum computer works, you must show that it is more powerful than a Turing machine.

Does "more powerful than a Turing machine" refer to the fact that a quantum computer would be doing multiple calculations in parallel?
 
A discussion about quantum computers? Without me? I will have to clear up some things here:


And yes, they have been shown to work better than a simple binary computer.

Actually not. It has been shown that for certain problems there exist quantum algorithms that are exponentially better than the best known classical algorithms. However, that does not prove that there exists no unknown classical algorithm that is as good as the quantum algorithm. If I could prove that quantum algorithms are better than any classical algorithm for even one problem, I could solve at least one of the unsolved problems in computational complexity theory in 15 seconds.



A quantum computer is not a Turing machine, but as far as I know they have all the logic gates they need. Quantum algorithms like Shor's for the prime number factorization have been demonstrated to work in experiments with <10 Qubits.

For a universal quantum computer capable of implementing any quantum algorithm, you only need two gates, and one of those two can be quite trivial for some qubits. There are several different systems in which these gates have been implemented.

What has not been shown so far, but what would be essential to build a useful quantum computer, is quantum error correction. The algorithms are there and the first steps have been shown, but so far nobody has been able to implement error correction that actually helps.

As far as I know, the record is at 14 qubits at the moment.

Yes, you need some sophisticated and expensive technology. Laser cooling and ion traps.

The equipment is certainly sophisticated and expensive, but you don't necessarily need ion traps or lasers. One of the problems in research on quantum computers is that nobody knows in which system it would actually be possible to build a quantum computer, and thus which system deserves the most attention. In my opinion, the most promising system so far is superconducting qubits. But those are getting quite big, so the computer would fill a room and that room would have to be kept at 20 mK.

Does "more powerful than a Turing machine" refer to the fact that a quantum computer would be doing multiple calculations in parallel?

A Turing machine could also do multiple calculations in parallel. The advantage of a quantum computer (if it exists, which we believe but cannot conclusively prove) is that a single calculation can have an extremely complex input state. So you are doing one calculation that replaces multiple classical calculations by feeding it a superposition of many possible input states. So you are sort of processing them all at once.
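That superposition idea can be caricatured in plain Python (a toy state-vector sketch, not a real quantum library): a qubit is a pair of complex amplitudes, and a Hadamard gate turns the definite state |0> into an equal superposition of |0> and |1>.

```python
import math

def hadamard(state):
    """Apply a Hadamard gate to a single-qubit state (amp0, amp1)."""
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

zero = (1.0, 0.0)                 # the classical bit 0 as a quantum state
superposed = hadamard(zero)
probs = [abs(amp) ** 2 for amp in superposed]
print(probs)                      # each outcome now has probability ~0.5
```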
 
A Turing machine could also do multiple calculations in parallel. The advantage of a quantum computer (if it exists, which we believe but cannot conclusively prove) is that a single calculation can have an extremely complex input state. So you are doing one calculation that replaces multiple classical calculations by feeding it a superposition of many possible input states. So you are sort of processing them all at once.

Ok, shamelessly hijacking the thread for a moment :blush:

I recall reading about a DNA-based calculator that could be programmed with a gajillion different inputs, but the desired solution to the question would be a specific sequence in the output. Due to recent advances in DNA PCR and such, it turns out to be very easy to generate a huge batch of varying input strands and simply dump them into a test tube - shaky shaky, filter and clean, and then you can examine the resulting "answer" that clumped to your target answer.

Am I barking mad here, or does that ring a computational bell?
 