Cumulative Computer Quiz #1

Originally posted by starlifter

BTW, an "acronym" means what the abbreviation stands for. Examples:

BTW = By The Way
USA = United States of America
MS = Microsoft
ROTFL = Rolling On The Floor Laughing


starlifter, you are very knowledgeable about computers. But actually an "acronym" is a special kind of abbreviation. RADAR is an acronym because you pronounce it like it looks. IBM is an abbreviation, not an acronym. (Although there is a kind of IBM acronym used in a joke I heard in Spanish in Mexico. In Spanish the letters IBM sound like "Y ve me," as in "Y ve me a traer una cerveza..." — roughly, "And go bring me a beer...").

I have heard the word "acronym" used very often in the way you do. Maybe that will become a meaning of the word if enough people use it that way...

;) ;) ;)
 
Originally posted by Lucky

"Get OP-code from memory" is always the first cycle, the M1-cycle. Correct!

I don't think it's quite that simple any more. Modern CPUs prefetch chunks of memory containing instructions and try to decode some things in advance, and can decode multiple possible instruction paths in parallel.

Of course, you do have to get the instructions from memory, but they don't always process instructions sequentially. That is, they might be processing the opcode on one instruction, fetching a memory operand for another, and performing an immediate operation like shifting on another, all simultaneously.
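To make that overlap concrete, here is a toy sketch (purely illustrative; the three-stage split and stage names are assumptions — real pipelines have many more stages, stalls, and speculation) of how several instructions can be in flight at once:

```python
# Toy illustration of instruction pipelining: three instructions move through
# fetch / decode / execute stages. Once the pipeline fills, one instruction
# finishes per cycle even though each one takes three stages end-to-end.
STAGES = ["fetch", "decode", "execute"]

def pipeline_schedule(instructions):
    """Return {cycle: [(instruction, stage), ...]} for an ideal 3-stage pipeline."""
    schedule = {}
    for i, ins in enumerate(instructions):
        for s, stage in enumerate(STAGES):
            schedule.setdefault(i + s, []).append((ins, stage))
    return schedule

if __name__ == "__main__":
    for cycle, work in sorted(pipeline_schedule(["ADD", "SHL", "MOV"]).items()):
        print(cycle, work)
```

At cycle 2, all three hypothetical instructions are active at once: ADD executing, SHL decoding, MOV being fetched — exactly the simultaneity described above.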
 
Originally posted by Serutan
Ok, this shouldn't be horrible, 3 parter.

1. Who came up with the first design for what we would call
a computer?

2. Who built the first working computer?

3. When was it built?

1. I guess the conventional answer would be Babbage. But I could throw out names like Jacquard and Von Neumann too.

2. The government

3. early 1940s
 
Originally posted by sumthinelse


starlifter, you are very knowledgeable about computers. But actually an "acronym" is a special kind of abbreviation. RADAR is an acronym because you pronounce it like it looks. IBM is an abbreviation, not an acronym. (Although there is a kind of IBM acronym used in a joke I heard in Spanish in Mexico. In Spanish the letters IBM sound like "Y ve me," as in "Y ve me a traer una cerveza..." — roughly, "And go bring me a beer...").

I have heard the word "acronym" used very often in the way you do. Maybe that will become a meaning of the word if enough people use it that way...

;) ;) ;)

This is a bit OT ;)

But I must disagree - somewhat! IBM is an acronym! An abbreviation is a word that has been shortened by removing letters. E.g., "doesn't" is an abbreviated version of "does not". An acronym is where the first (usually) letters of several words are used to represent the other words. IBM is an acronym for International Business Machines. Because letters have been removed from the whole, it is an abbreviation, but it is still an acronym as well. :)
 
ainwood, I looked in Webster's dictionary and it agrees with you about "acronym." My Oxford dictionary agrees with me, so I'll say we are both right. And starlifter too. You are probably more right than I am, though. Oxford is for British English. I'll shut up about this now. :)
 
1. Who came up with the first design for what we would call
a computer?

2. Who built the first working computer?

3. When was it built?
1. A dude named Babbage. He developed the idea of the Difference Engine, but it was his Analytical Engine (a general-purpose, steam-powered computer) that is actually the first design of what most would call a modern computer.

2. The first actual working "computer" is much debated, but most people settle on ENIAC, though the case can be made for the Harvard Mark I or the English Colossus as well. I don't remember who the actual designer of ENIAC was, but a dude named Eckert was part of the effort.

3. ENIAC came online in about 1945, but the Brits started using Colossus in 1943 to help crack the German Lorenz ("Tunny") cipher messages, as part of the ULTRA effort. The world did not know about Colossus until decades later, and in fact the British used the Colossus until about 1964 or so against the Soviets, then destroyed the last machine and most records of it.

Personally, I give the nod to the Brits. But 60 years later, the computer seems to have a hundred fathers, LOL.
 
Just for completeness about "acronym"....

From the American Heritage Dictionary:
ac·ro·nym
n.

A word formed from the initial letters of a name, such as WAC for Women's Army Corps, or by combining initial letters or parts of a series of words, such as radar for radio detecting and ranging.

Source: The American Heritage® Dictionary of the English Language, Fourth Edition
Copyright © 2000 by Houghton Mifflin Company.

from Merriam-Webster's Collegiate Dictionary:
Main Entry: ac·ro·nym
Pronunciation: \ˈa-krə-ˌnim\
Function: noun
Etymology: acr- + -onym
Date: 1943
: a word (as NATO, radar, or snafu) formed from the initial letter or letters of each of the successive parts or major parts of a compound term

© 2002 by Merriam-Webster, Incorporated
Merriam-Webster Privacy Policy

Ability to "pronounce" the acronym is not a requirement.... Side note: the US Navy likes to pronounce them by taking more than just the 1st letter (e.g., COMSUBPAC); the US Air Force does not care (e.g., ACM).


;)
 
starlifter, I apologize for the digression. All the examples you have given agree with my initial premise (WAC and snafu are pronounceable, IBM isn't, at least in English) but more than one definition refutes it. I thought I was right but admit I am wrong.

Your answers about the first computer are better than mine too, but I hope the author accepts mine anyway, because I have a good question to pose. :) ;) ;)
 
Your answers about the first computer are better than mine too, but I hope the author accepts mine anyway, because I have a good question to pose.
His question was kind of too general, as there is a great deal of "debate" about what is the best/first/actual etc. computer. Depending on how one "defines" computer (e.g., electrical, electronic, mechanical, programmable, electrically programmable, etc.), we can both easily be way off what he's looking for. But if he takes my answer, I'll pass it on to you, and you can ask your question. The real fun for me is reading the questions and answers you guys come up with!! :)
 
Originally posted by starlifter

1. A dude named Babbage. He developed the idea of the Difference Engine, but it was his Analytical Engine (a general-purpose, steam-powered computer) that is actually the first design of what most would call a modern computer.

Yep. I noticed a mention of the Jacquard loom; but while I think it should be considered part of the evolutionary tree that led to computers, I don't think it should be considered a computer itself.

Originally posted by starlifter


2. The first actual working "computer" is much debated, but most people settle on ENIAC, though the case can be made for the Harvard Mark I or the English Colossus as well. I don't remember who the actual designer of ENIAC was, but a dude named Eckert was part of the effort.

3. ENIAC came online in about 1945, but the Brits started using Colossus in 1943 to help crack the German Lorenz ("Tunny") cipher messages, as part of the ULTRA effort. The world did not know about Colossus until decades later, and in fact the British used the Colossus until about 1964 or so against the Soviets, then destroyed the last machine and most records of it.

Personally, I give the nod to the Brits. But 60 years later, the computer seems to have a hundred fathers, LOL.

Not what I had in mind. The one I have in mind wasn't really practical because it was too slow. But IIRC it was quite reliable, and did see service during WWII.

I guess there's no satisfying some people. Lucky was
crucified for being too esoteric, and I get hammered for being too vague. :D
 
There's nothing wrong with yours or Lucky's questions :).... we all learn something new from them, heheh...

But you'll have to give us a hint, narrowing down what you're looking for. As I recall, the Harvard Mark I was about 500 times slower than ENIAC. But many people consider the Mark I a calculator, and not a computer, and I'm in league with them.

I personally consider Babbage's Analytical Engine the first true computer ever designed, though obviously it is neither electronic nor electromechanical. In fact, it was never fully built.
As far as I'm personally concerned, even though it was very specialized, I consider the Brits' electronic marvel Colossus to be the 1st modern working computer, despite the storage and transport medium.

Sooo...... maybe you can describe what you view as the definition of computer, e.g., mechanical, electronic, electro-mechanical, programmable, etc.

:)
 
OK, in addition to the hint already given, the computer was electromechanical in nature. The person whose name I seek
tried to get funding for a fully electronic computer, but was turned down.

As to Babbage, that's why I said "designed" rather than "built" in my question, as the Analytical Engine never left paper. Also the reason I didn't use words like "electronic", etc. You did note that I said Babbage was correct, yes?

If there's no answer by lunch tomorrow (in the GMT-7 zone), I'll post the answer, and turn it over to Sumthinelse, since he got Babbage first.
 
starlifter, I sent you email (I am peebles@texas.net) with important info about the next question! Please check your email if you can!

I didn't get an e-mail yet. Maybe CFC is messed up with the e-mail part. No PM from you, either. But the BBS has been cantankerous for over 7 hours now. I think I'm almost the only one who can post at the moment, albeit with mucho difficulty.

:)

If there's no answer by lunch tomorrow (in the GMT-7 zone), I'll post the answer, and turn it over to Sumthinelse, since he got Babbage first.
No worries :) :)

:goodjob:
 
Originally posted by starlifter


I didn't get an e-mail yet. Maybe CFC is messed up with the e-mail part. No PM from you, either. But the BBS has been cantankerous for over 7 hours now. I think I'm almost the only one who can post at the moment, albeit with mucho difficulty.


Yup. I think you were. The quoted post appeared about half
an hour after my dismal failure to get in.

Anyway...

The name I was looking for was Konrad Zuse. In 1939, he built a computer using binary arithmetic. But instead of vacuum tubes, he used telephone switches for bits. The switches were, as stated previously, reliable, but slow (~3 sec. for multiplications). He and a friend attempted to get funding from the German government for a fully electronic computer, but were turned down on the grounds that the war wasn't going to last long enough to justify such a long-term investment (IIRC late 1940).

Sumthinelse is up.
 
The name I was looking for was Konrad Zuse. In 1939, he built a computer using binary arithmetic.
I would not have gotten that one :). The Colossus was largely built with telephone exchange parts, too.

Spycatcher34 was also able to post, as were a couple others during the morning, until about 2pm PST, when the whole BBS went kaput, LOL.
 
Greetings from the Mekong River! Since this connection is so awful here, I will just make it brief.

COBOL, PASCAL, C, JAVA, etc. are called computer "languages" even though they just tell the computer what to do.

NETBIOS, IPX, AppleTalk, TCP/IP, and SMB are examples of something that is more like a human "language" but we use a different term for these things.

>>> Question: What term do we use instead of "language" for these things?

I'll be back early Monday. Sorry but I can't connect sooner. If you answer this correctly before then and you are sure of your answer, go ahead and post the next question.
 
Originally posted by starlifter
Protocols?

Correct. This was my intended post but the connection on the Mekong was slow and I didn't have my computer there:

There are 3 questions here. The first should be the easiest, the second a little harder, and the 3rd very difficult. In fact, you can ignore the third question entirely, and answer/not answer any of the questions. The 2nd and third questions are tie breakers.

  • 1. To ease the effort to program computers, what we call computer "languages" appeared. Examples are assembly languages (Intel, 68000, etc.), C, FORTRAN, JAVA, etc. But these are not analogous to human languages. What we call a computer "language" is a collection of mnemonics (words) used to control a computer. Human languages, on the other hand, are 2-way and have many more uses than just telling a person what to do.

    There are "languages" by which computers can "talk" to each other. Examples are TCP/IP, NETBIOS, Appletalk, IPX, and Bill G's favorite, SMB/CIFS.

    Question: What is the common term for all these "languages"?
  • 2. What is a "Sniffer"?
  • 3. (Warning, this is a tough one and it's not related to the first 2!) The Intel 386 (80386) processor had a bug in one of its instructions. The "pop flags" instruction (popf) sometimes allowed interrupts to be enabled during its own execution, even if interrupts were disabled before and after the instruction executed. This was a very nasty bug which caused "protected" blocks of code to be non-atomic. I don't know who it was, but somebody found a way to restore flags without enabling interrupts. This instruction sequence replaced only the popf instruction.

    Question: What was the instruction sequence used as a workaround instead of the popf instruction?

    starlifter, you have already answered #1 correctly. You can either post a new question or try to answer #2 and/or #3. Up to you.
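As a tiny illustration of a protocol in action (a hedged sketch, not any poster's code: two endpoints on the loopback interface exchanging bytes under a made-up request/reply convention over TCP/IP):

```python
# Two endpoints "talking" over TCP/IP on the loopback interface.
# The toy protocol: the client sends a greeting, the server prefixes
# it with "pong: " and sends it back.
import socket
import threading

def server(listener):
    conn, _ = listener.accept()
    with conn:
        data = conn.recv(1024)          # read the client's request
        conn.sendall(b"pong: " + data)  # answer per our toy convention

listener = socket.socket()
listener.bind(("127.0.0.1", 0))         # let the OS pick a free port
listener.listen(1)
threading.Thread(target=server, args=(listener,), daemon=True).start()

client = socket.create_connection(listener.getsockname())
client.sendall(b"ping")
reply = client.recv(1024)
client.close()
listener.close()
print(reply)  # b'pong: ping'
```

The point is that both sides agree in advance on who speaks first and what the bytes mean — that agreement, not the bytes themselves, is the protocol.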
 
[*] 2. What is a "Sniffer"?
A sniffer is software that monitors network packets and identifies certain patterns, typically addresses, as they pass.
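To sketch what that pattern-matching looks like (a minimal illustration only — real sniffers capture live traffic with raw sockets or libpcap, which needs elevated privileges; the packet bytes here are hand-made, not captured):

```python
# Decode the fixed 20-byte IPv4 header and pull out the addresses,
# which is the core of what a sniffer does to each captured packet.
import socket
import struct

def parse_ipv4_header(packet: bytes):
    """Return (src_addr, dst_addr, protocol) from a raw IPv4 header."""
    (ver_ihl, tos, total_len, ident, flags_frag,
     ttl, proto, checksum, src, dst) = struct.unpack("!BBHHHBBH4s4s", packet[:20])
    return socket.inet_ntoa(src), socket.inet_ntoa(dst), proto

# A hand-crafted example header: protocol 6 (TCP), 10.0.0.1 -> 192.168.1.5
example = struct.pack("!BBHHHBBH4s4s", 0x45, 0, 40, 1, 0, 64, 6, 0,
                      socket.inet_aton("10.0.0.1"),
                      socket.inet_aton("192.168.1.5"))
print(parse_ipv4_header(example))  # ('10.0.0.1', '192.168.1.5', 6)
```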

Question: What was the instruction sequence used as a workaround instead of the popf instruction?
Don't know, but a guess would be a NOP. I still have an original 486 that was defective. I never had it replaced following the 486 defect mess.



My question:

You've heard of an atomic clock, but what is a dot clock? That is, what does the term mean, and what affects it? And what 2 pieces of hardware are most affected by the dot clock? Why is it important to your eyes?
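For a feel for the magnitudes involved, here is a rough dot-clock calculation (the 1024x768 @ 85 Hz timing figures below are typical illustrative values, assumed rather than taken from any particular card's datasheet):

```python
# Back-of-the-envelope dot clock: the rate at which pixels must be emitted.
# Total pixels per frame (visible area plus blanking) times the refresh rate.
def dot_clock_hz(h_total, v_total, refresh_hz):
    return h_total * v_total * refresh_hz

# 1024x768 visible, with assumed blanking overhead bringing the totals
# to 1376 x 808, refreshed 85 times per second.
clock = dot_clock_hz(h_total=1376, v_total=808, refresh_hz=85)
print(f"{clock / 1e6:.1f} MHz")  # 94.5 MHz
```

The arithmetic shows why higher resolutions and refresh rates demand so much more of the hardware: the pixel rate grows with the product of all three factors.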
 