A couple of (very basic) questions about computer code

A Deepness in the Sky's technology regime is looking increasingly likely. When I first read it in like 2003 or so I was like "naw that won't happen, they won't just keep tacking code onto code. they'll do super rewrites, right?"
:scared:
 
Totally true. Especially if you use the "third" state for negative values: balanced ternary can even replace signed integers and floating point.
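For the curious, here's a minimal Python sketch of balanced ternary (digits -1, 0, +1), which is the usual scheme people mean when they say the third state handles negatives. The function names are just mine for illustration:

```python
def to_balanced_ternary(n):
    """Encode an integer as balanced-ternary digits (-1, 0, +1),
    most significant digit first. Negatives need no sign bit."""
    if n == 0:
        return [0]
    digits = []
    while n != 0:
        r = n % 3          # remainder in {0, 1, 2}
        if r == 2:         # digit 2 becomes -1 with a carry into the next place
            r = -1
        n = (n - r) // 3
        digits.append(r)
    return digits[::-1]

def from_balanced_ternary(digits):
    """Decode a balanced-ternary digit list back to an integer."""
    value = 0
    for d in digits:
        value = value * 3 + d
    return value
```

So 5 encodes as [1, -1, -1] (9 - 3 - 1), and -5 is just every digit negated; no separate sign representation needed.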

The problem, much like the problem with the x86 architecture, is that no one is really interested in rewriting software for this architecture. Essentially, it is a case of rational irrationality: we know it's wrong, yet it would be wrong to switch away from it.

Actually it goes more like this: We know that parts of it are wrong, but almost every single piece of software written in the past 20 years runs on it and most of that software will never be rewritten. Ever.
 
So sad I missed this thread the first time around. :(

I think if ternary computers could be proven to be more efficient than binary computers, then they would eventually dominate, once we stop finding ways to make binary computers more efficient. And with Moore's law due to expire within our lifetimes, that might happen sooner rather than later.

I'm skeptical that ternary computers would be more efficient, though. A major limit on the speed of a computer is how fast it can switch a bit from off to on. And one way to optimize that is to decrease the voltage difference between on and off. However, if you add a third state, then your limit becomes how fast you can switch from +x to -x. Switching from +x to 0 would be faster, but you wouldn't be able to capitalize on that, because you don't know whether the faster switch to 0 is really part of a switch to -x. That inability to capitalize on the faster transitions suggests that a binary computer would be faster.
 
We know ternary is better and binary is wrong? Can someone actually show a practical physical implementation that is more efficient? Or are you guys all talk and no transistors?
 
a way to optimize that is to decrease the voltage difference between on and off. However, if you add a third state, then your limit becomes how fast you can switch from +x to -x. Switching from +x to 0 would be faster, but you wouldn't be able to capitalize on that, because you don't know if the faster switch to 0 is really part of a switch to -x. So that inability to capitalize on efficiency suggests that a binary computer would be faster.
I think you're making an assumption here that may not be justified. Namely, that a switch from -x to +x must pass through 0, as it does on the one-dimensional number line.

Is it possible to switch directly, transistor-wise, between
+x -> -x
-x -> 0
0 -> +x

You know, polar coordinates instead of linear, clockwise and counter clockwise?

I'm as far from an EE as a shepherd, so perhaps this question is naive.
 
I am not familiar with the challenges that DNA-based computing would have to overcome. But I would think that there are no fundamental reasons why it cannot work. As with other technologies, there is a huge gap between it and binary-in-silicon technology, and I have no idea if DNA-based computing will ever be as efficient as conventional computers.

I always wonder if there's any value in teaching specific types of math in order to 'prime' people into being able to program within certain types of computer design. Would a 3D computer chip need different programming from a flat wafer? Likely! And DNA computing would likely need a different programming vibe too. I'm sure raw shoe-horning could get us pretty far, don't get me wrong.

What I'm thinking is like how Physics becomes vastly easier once you have a grasp of calculus. Previously 'memorized' relationships become intuitive. Same thing with matrices and statistics.
 
I think you're making an assumption here that may not be justified. Namely, that a switch from -x to +x must pass through 0, as it does on the one-dimensional number line.

Is it possible to switch directly, transistor-wise, between
+x -> -x
-x -> 0
0 -> +x

You know, polar coordinates instead of linear, clockwise and counter clockwise?

I'm as far from an EE as a shepherd, so perhaps this question is naive.
It's not that one couldn't design a circuit that would switch directly between drawing current from the negative to the positive voltage. It's that the circuit would have to drive a capacitive load which requires it to go through all the voltages in between.

Think of it like filling a bucket: you can't go from an empty bucket (-V) to a full bucket (+V) without it first being half full (0 V).
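The bucket analogy matches the textbook first-order RC step response. A tiny Python sketch (component values made up for illustration, not real process numbers) shows the node sitting at exactly 0 V partway through a -1 V to +1 V swing:

```python
import math

def node_voltage(t, v_start, v_end, rc):
    """First-order RC step response of a capacitive node:
    V(t) = v_end + (v_start - v_end) * exp(-t / RC)."""
    return v_end + (v_start - v_end) * math.exp(-t / rc)

rc = 1e-9                     # 1 ns time constant (illustrative only)
t_mid = rc * math.log(2)      # solving node_voltage(t) = 0 for a -1 V -> +1 V swing
mid = node_voltage(t_mid, -1.0, +1.0, rc)   # the node really does pass through 0 V
```

At t = RC·ln(2) the node is exactly at 0 V, so any circuit watching the wire mid-transition sees the "middle" level whether you want it to or not.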
 
I think you're making an assumption here that may not be justified. Namely, that a switch from -x to +x must pass through 0, as it does on the one-dimensional number line.

Is it possible to switch directly, transistor-wise, between
+x -> -x
-x -> 0
0 -> +x

You know, polar coordinates instead of linear, clockwise and counter clockwise?

I'm as far from an EE as a shepherd, so perhaps this question is naive.
You can switch directly insofar as you can make the clock period long enough that the intermediate 0 state is never sampled. In fact, you'd have to do this. But you cannot swing from +x volts to -x volts without passing through 0.
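Same RC picture, from the clock's point of view: the full -x to +x swing takes ln(2)·RC longer to settle than a half swing, so the clock period has to be budgeted for the worst case. The numbers below are invented for illustration:

```python
import math

def settling_time(v_step, margin, rc):
    """Time for an RC node to get within `margin` volts of its target
    after a voltage step of size `v_step`:
    solve v_step * exp(-t / RC) = margin for t."""
    return rc * math.log(v_step / margin)

rc = 1e-9            # illustrative time constant
margin = 0.1         # made-up noise margin for counting a node as "settled"
t_full = settling_time(2.0, margin, rc)   # -1 V -> +1 V (full swing)
t_half = settling_time(1.0, margin, rc)   # -1 V -> 0 V (half swing)
# The full swing needs an extra ln(2)*RC, so ternary's faster half-swings
# can't shorten the clock: the period must cover the slowest transition.
```

Which is the earlier poster's point in numbers: the fast 0-transitions exist, but the clock can never take advantage of them.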
 
I always wonder if there's any value in teaching specific types of math in order to 'prime' people into being able to program within certain types of computer design. Would a 3D computer chip need different programming from a flat wafer? Likely! And DNA computing would likely need a different programming vibe too. I'm sure raw shoe-horning could get us pretty far, don't get me wrong.

That would depend on the technology. Some computing paradigms require a totally different mindset, for example programming an FPGA, but still use the same discrete binary math. Other types, however, will indeed require different math: with a quantum computer, you are not going to get anywhere on discrete math alone.

For high-level computing this won't matter much, anyway. Most of those optimization details are hidden in libraries, so it is possible to program without having to understand binary math.
 