A couple of (very basic) questions about computer code

I am not speaking of cups and doughnuts though, or even (as I stressed already) about a binary-base system developed by us versus a quaternary-base system developed by us. The difference here is that the computer would be working with DNA, not some system simulating DNA. Do you think both are the same? I heavily doubt they are, given that DNA works by itself not because someone planted a quaternary system in it, but because it is itself the quaternary system it uses. The computer is not its own developer or user. We are not the computer. This is why I claimed this may not work well.

DNA is not an ideal quaternary system; you can have deviations from that ideal behavior. A DNA strand could break, for example, rendering the information useless. But if you want to use it to do computations in a quaternary system, you had better try to make it behave as close to the ideal system as possible.

The system, whether binary, quaternary, quantum binary, quantum quaternary or whatever, is always a model we try to emulate with the underlying physical system, no matter whether that is a semiconductor, DNA, an atom, a photon or anything else. The physical system matters only insofar as there might be deviations from the ideal model that have not been addressed properly by the constructors of the system.


Are these 'gates' that take in 4 inputs as powerful as binary gates, though, in the sense that you can build any function you can think of with them?

Cause that's another good thing about binary gates - you can build anything using just one type of gate: a NAND gate. That helps reduce costs, because you essentially have everything built out of one type of building block.

It's easy to prove that you can build any function you want with a 4-input gate: use a gate that ignores the two extra inputs and thus emulates a binary gate. Actually using 4-input gates that way would not be very efficient, but as you add true 4-input gates, the functionality can only grow, not shrink.
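A quick sketch of that argument in Python (the gate names are mine, purely illustrative):

```python
# A hypothetical 4-input gate that ignores two inputs can emulate any
# 2-input binary gate -- here it just applies NAND to the first two inputs.
def nand(a, b):
    return 0 if (a and b) else 1

def four_input_gate(a, b, c, d):
    # c and d are deliberately ignored; this wastes the extra inputs,
    # but it proves the 4-input family is at least as expressive.
    return nand(a, b)

# The emulated gate matches binary NAND on every input combination:
for a in (0, 1):
    for b in (0, 1):
        assert four_input_gate(a, b, 0, 0) == nand(a, b)
```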

And while you can build any function from NAND gates, actually doing so when designing a chip would be quite wasteful. If you need a NOR gate at some point, you could either use 4 NAND gates (requiring 16 transistors in CMOS logic) or just implement the NOR gate directly (only 4 transistors). The number of transistors on a chip is limited, and every additional transistor in a function increases that function's power consumption. So for advanced chips you do not have the luxury of building everything from NAND gates.
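For the curious, here is that 4-NAND construction of NOR spelled out in Python (a toy model of the logic, not CMOS-level detail):

```python
def nand(a, b):
    return 0 if (a and b) else 1

def nor_from_nands(a, b):
    # Four NAND gates: two inverters, an OR (De Morgan), a final inverter.
    not_a = nand(a, a)           # gate 1: NOT a
    not_b = nand(b, b)           # gate 2: NOT b
    a_or_b = nand(not_a, not_b)  # gate 3: a OR b
    return nand(a_or_b, a_or_b)  # gate 4: NOT (a OR b) = NOR

def nor_direct(a, b):
    return 0 if (a or b) else 1

# Both implementations agree on the full truth table:
for a in (0, 1):
    for b in (0, 1):
        assert nor_from_nands(a, b) == nor_direct(a, b)
```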
 
All this has got ideas in my head churning, but I can't help but cling to the idea that designing traditional computing machines using binary gates is incredibly efficient and would be hard to beat.

Mind you, I had to sit through theoretical computer science courses and learn how the whole computer works, from gates to operating systems to compilers to what have you... So the whole binary architecture, or whatever you want to call it, has been sort of pounded into my head.
 
@Uppi: So the question becomes whether the "deviations" between our own concept of a quaternary system and the actual system DNA works as (and with) are so crucial to how DNA works that trying to use it as part of a computer will ultimately have less-than-ideal results (more so from a practical viewpoint, i.e. the cost/efficiency of existing, binary-based computers).

I had assumed there would be major "deviations", not because I have studied DNA (only what I know from high-school biology, which is little at any rate) but because of what I noted: DNA is itself the system it uses. To us it may appear similar to our own concept of a quaternary system, but that is not a safe enough bet to make one assume DNA-reliant computers would be more efficient as things now stand.
 
All this has got ideas in my head churning, but I can't help but cling to the idea that designing traditional computing machines using binary gates is incredibly efficient and would be hard to beat.

Bolded the operative word.

Read that wiki I linked about Radix Economy - it mentions that ternary architecture is very well suited to database queries. So here's an example of a situation where a custom-built chip using 3-state gates (whatever the technical term is) would be more effective than programming traditional binary transistors to emulate base-3 logic. Imma gonna guess the NSA has looked into this. :scan:
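The radix-economy argument is easy to check numerically: in the continuous approximation, representing numbers up to N in base b takes about log_b(N) digits of b states each, for a cost proportional to b/ln(b), minimised at b = e ≈ 2.718. A rough sketch (the function name is mine):

```python
import math

# Radix economy, continuous approximation: cost ~ b * ln(N) / ln(b).
# N cancels out when comparing bases, so compare b / ln(b) directly.
def economy(b):
    return b / math.log(b)

for b in (2, 3, 4, 10):
    print(b, round(economy(b), 3))
# Base 3 edges out base 2: 3/ln(3) ~ 2.731 vs 2/ln(2) ~ 2.885.
# Base 4 ties base 2 exactly, since 4/ln(4) = 2/ln(2).
```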

Slightly off topic:
I came across a website that walks you through building a browser-based computer 1 virtual transistor at a time, all from NAND gates. I'm on my mobile, so I'm not going to look for it just now. But when my life settles down a bit I plan on pursuing it.
 
All this has got ideas in my head churning, but I can't help but cling to the idea that designing traditional computing machines using binary gates is incredibly efficient and would be hard to beat.

Mind you, I had to sit through theoretical computer science courses and learn how the whole computer works, from gates to operating systems to compilers to what have you... So the whole binary architecture, or whatever you want to call it, has been sort of pounded into my head.

It is extremely hard to beat the binary architecture implemented in silicon, but less for fundamental reasons than for technological reasons and innovation inertia. Binary gates are easy and cheap in silicon, and so much effort has been put into optimising binary gates and binary information processing that any alternative approach would have a very hard time even getting comparable results.

Silicon itself is an instructive example: it is only an average semiconductor for building electronics, and there are several other semiconductors with better properties that would allow for faster computers. But so much effort has been put into the silicon process that any alternative approach is far behind. Yet most of the development is still done with silicon, because that is where the money is. With other technologies there is not much profit until they catch up, so there is not much development going on, and most of them will most likely never catch up.



@Uppi: So the question becomes whether the "deviations" between our own concept of a quaternary system and the actual system DNA works as (and with) are so crucial to how DNA works that trying to use it as part of a computer will ultimately have less-than-ideal results (more so from a practical viewpoint, i.e. the cost/efficiency of existing, binary-based computers).

I had assumed there would be major "deviations", not because I have studied DNA (only what I know from high-school biology, which is little at any rate) but because of what I noted: DNA is itself the system it uses. To us it may appear similar to our own concept of a quaternary system, but that is not a safe enough bet to make one assume DNA-reliant computers would be more efficient as things now stand.

I am not familiar with the challenges that DNA-based computing would have to overcome, but I would think there are no fundamental reasons why it cannot work. As with other technologies, there is a huge gap to the binary-in-silicon technology, and I have no idea if DNA-based computing will ever be as efficient as conventional computers.
 
Found it:
http://blog.kevtris.org/?p=62

computer guy who has far too much time on his hands said:
Everything on the design is made out of NAND gates, even the 7 segment decoding. The last PCB though has a few non-NAND gate chips like an NES PPU and a serial chip and stuff, but it’s just a peripheral board and is not part of the NANDputer proper. (Eventually I want to make a NAND UART and replace that peripheral board).

The basic architecture of the computer is actually fairly conventional. There’s an accumulator, instruction skipping (like on PIC) for decision making, a full ALU (and, add, or, xor, subtract, add with carry, subtract with borrow, set all bits, clear all bits, shifting), 8 bit registers, separate RAM/ROM areas (harvard arch), and bit set/clearing. There’s a 3 level stack, and even an interrupt!

While the CPU architecture is fairly conventional, the way it is implemented isn’t. I went with a bit-serial setup on here to save gates. The ALU for example is only 1 bit, with a “latching” carry so operations are performed a bit at a time on the 8 bit registers/memory. The program counter is also bit-serial, and on the first youtube video you can see the carry propagating during the incrementing of it.

The downside of course is that this is much slower than a parallel architecture, but this way takes vastly fewer gates. It takes 96 clock cycles to run a single instruction: There’s 16 “T” states and 3 non-overlapping clocks generated using a 6 stage johnson counter with some NAND decoding. (The flipflops that form the johnson counter are made from NANDs too). Thus, it’s 16*6 or 96 cycles per instruction. The clock runs at 10MHz, so this is a bit over 100KIPs (thousands of instructions per second). This sounds really slow but it isn’t TOO slow. It’s faster than a TMS1000, and it’s only 2-3x slower than a Commodore 64 which I estimate at 250-300kips when it runs at 1MHz (3 and 4 cycle instructions being some of the more common ones).
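The bit-serial trick from that quote can be sketched in a few lines of Python (an illustration of the principle only, not his actual hardware):

```python
# Bit-serial addition: the ALU is only 1 bit wide, and an 8-bit add is
# performed one bit per step, with the carry held in a "latching" flip-flop.
def serial_add(x, y, width=8):
    carry = 0                                # the latched carry
    result = 0
    for i in range(width):                   # one ALU pass per bit, LSB first
        a = (x >> i) & 1
        b = (y >> i) & 1
        s = a ^ b ^ carry                    # full-adder sum bit
        carry = (a & b) | (carry & (a ^ b))  # full-adder carry out
        result |= s << i
    return result & ((1 << width) - 1)       # 8-bit wrap-around

assert serial_add(200, 100) == (200 + 100) % 256  # 44 after 8-bit overflow
```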

If you don't believe me that this guy is a little kicked, just read his blog post about his new logic analyzer:
http://blog.kevtris.org/?p=81
Since I am only going to be reverse engineering things like videogame systems with it, this should be more than plenty. After seeing what’s inside I feel kinda guilty too. This will be like swatting a fly with an ICBM...

I really liked the copper thieving on this board. This is the little rectangles on the board. These are used to equalize copper usage so that when the board is made it is less likely to warp or delaminate during soldering, because the copper area vs. etched areas are equalized. The other interesting thing is there are spark gaps built onto the board, too!..

it’s got a metric assload of SOIC RAM chips on here… 34 to be exact. There’s two HP ASICs with heatsinks, an Actel FPGA and four HP level comparator chips on the inputs to detect logic levels. The little QFP near the bottom middle is an octal voltage DAC, doubtlessly being used to set the threshold voltage. I looked it up and that’s a $50 part...

Last up is the power supply. This supply is an absolute thing of beauty. It’s also an insane power beast capable of up to 700W...
 
Thanks for the bump. I missed this thread the first time around. To borrow Uppi's comment, a discussion about computers? Without me?

Ternary computers have existed for quite some time now. The one below was constructed in the Soviet Union in 1858. Donald Knuth still thinks that they may eventually supplant binary ones.

[Image: the Setun ternary computer]
 
C'mon. That swivel stool could clearly be from the 1800s.
 
Seeing as how the first one was all wood in the 1840's, that may not be that far off. Perhaps by 2140, they will be in our holographic watches?
 
Here's a bit more information about Setun.

Here is a public domain ternary computer emulator whose 33 page documentation goes into even more detail.

And here is the mother lode: A 186 page PDF that documents the recent Ternary Computing Testbed effort. It describes troolean algebra, dyadic logic gates, flip-flap-flops, and more. May the trits, tribbles, and trytes be with you.
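To make the trit idea concrete, here is a small Python sketch (my own helper, hypothetical naming) that converts an integer to balanced ternary, the digit system Setun used, where each trit is -1, 0 or +1:

```python
# Balanced ternary: digits (trits) are -1, 0, +1, so negative numbers
# need no separate sign bit.
def to_balanced_ternary(n):
    if n == 0:
        return [0]
    trits = []
    while n != 0:
        r = n % 3
        if r == 2:      # digit 2 becomes -1, with a carry into the next trit
            r = -1
            n += 1
        trits.append(r)
        n //= 3
    return trits[::-1]  # most significant trit first

print(to_balanced_ternary(8))   # [1, 0, -1], since 9 - 1 = 8
print(to_balanced_ternary(-8))  # [-1, 0, 1], every trit flipped
```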
 
I would stay away from the math and from explaining exactly how the computer becomes conscious. Otherwise those with a math or computer science background will likely find it "silly". I'd make the reason for consciousness vague and stay away from the binary/floating-point accuracy stuff.

Well, at some point there's enough data and processing to simulate consciousness so effectively that it makes it moot whether or not it's real.

Or it doesn't.
 
The discussion about ternary computers interests me, but I haven't had enough free time to read any of the links. One day!

Well, at some point there's enough data and processing to simulate consciousness so effectively that it makes it moot whether or not it's real.

Or it doesn't.

But you need a lot more than just data and processing power. You need to design something that will be conscious.

People like Kurzweil seem to think that all you need is processing power. That to me seems a highly flawed way of thinking... but mind you, I don't think that's what you believe; it just made me think of him.
 
The discussion about ternary computers interests me, but I haven't had enough free time to read any of the links. One day!



But you need a lot more than just data and processing power. You need to design something that will be conscious.

People like Kurzweil seem to think that all you need is processing power. That to me seems a highly flawed way of thinking... but mind you, I don't think that's what you believe; it just made me think of him.

I'm not sure if there is or isn't a magic switch outside of size and complexity that creates a thinking brain.
 
I'm not sure if there is or isn't a magic switch outside of size and complexity that creates a thinking brain.

But stuff doesn't just arise out of nothing because of complexity; that has never happened anywhere. You need some sort of design first. In the case of our brain, it has been evolving and getting fine-tuned over millions (billions?) of years.

I agree that if you had a computer as complex as our brain and ran it through a simulated evolutionary path similar to what our brain went through, you might just get sentience. Probably not the first time, and probably not in the first decade of trying - you'd really need to fine-tune everything just the right way.

But there's no way you can just sit it down and expect anything to happen. You either need to duplicate the way our brain was "designed" or try a design of your own.
 
At the most basic level, computers operate by performing mathematical and logical operations on various inputs. Everything else stems from that. As multiple others have posted, using binary is convenient given the laws of physics and chemistry that govern silicon (and other types of non-quantum computing devices). But in theory you can operate on the inputs (which are just numbers) in any base. It is just impractical to design logic-gate structures to do so, so by convention we use binary.
 
As the articles on ternary computers above show, it isn't impractical at all to design computers with three logic states instead of two.
 
As the articles on ternary computers above show, it isn't impractical at all to design computers with three logic states instead of two.

Totally true. Especially if you use the "third" state for negative values, which can even replace signed integers and floating points.
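A tiny Python illustration of why that third state handles signed numbers so neatly in balanced ternary (helper names are mine):

```python
# In balanced ternary (trits -1/0/+1), negating a number is just flipping
# the sign of every trit -- no sign bit or two's complement needed.
def trits_to_int(trits):
    value = 0
    for t in trits:      # most significant trit first
        value = value * 3 + t
    return value

def negate(trits):
    return [-t for t in trits]

five = [1, -1, -1]       # 9 - 3 - 1 = 5
assert trits_to_int(five) == 5
assert trits_to_int(negate(five)) == -5
```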

The problem - much like the problem of the x86 architecture - is that no one is really interested in rewriting software for a new architecture. Essentially, it is a case of rational irrationality: we know it's wrong, yet it would be wrong to switch from it.
 