Cumulative 48-hour Computer Trivia

Originally posted by Lucky
Ok, here goes again:

Describe the OSI network reference model (all layers)! Is it the correct model for the networks (e.g. the Internet) of today? If not, which model is, and what are the differences?

:D

The layers (from the ISO site) are (descriptions are
fuzzy memories of networking class):

1. Physical layer
-- The actual hardware.

2. Data link layer
-- More or less the driver software for the HW.

3. Network layer
-- The protocol S/W.

4. Transport layer
-- Interface between O/S and higher levels.

5. Session layer
-- More or less a UNIX socket.

6. Presentation layer
-- Interface between socket & user S/W.

7. Application layer
-- User application.
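
To picture where the socket sits in that stack, here is a rough Python sketch (the host and port are just placeholders): the script only touches the upper layers, while the OS and the hardware take care of everything below.

[code]
# The application only sees the socket (roughly the session level) and the
# bytes it sends and receives (presentation/application level); transport,
# network, data link and physical are handled by the OS and the hardware.
# example.com:80 is only a placeholder endpoint.
import socket

with socket.create_connection(("example.com", 80), timeout=5) as sock:
    sock.sendall(b"HEAD / HTTP/1.0\r\nHost: example.com\r\n\r\n")
    reply = sock.recv(200)
    print(reply.decode(errors="replace"))
[/code]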

OSI is not the current Internet model. Although a nice abstraction, in practice a literal implementation would be too cumbersome (too many layers, too much overhead).

The current model is 1973's TCP/IP. It has (IIRC) 4 layers,
which (again IIRC) map the following way:

TCP/IP level 1 maps to OSI level 1.
TCP/IP level 2 maps to OSI levels 2-4.
TCP/IP level 3 maps to OSI level 5-6.
TCP/IP level 4 maps to OSI level 7.
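
If it helps, here is the same mapping written out as a little Python table (layer names taken from the list above, nothing authoritative):

[code]
# The TCP/IP-to-OSI mapping exactly as described above.
OSI_NAMES = {
    1: "Physical", 2: "Data link", 3: "Network", 4: "Transport",
    5: "Session", 6: "Presentation", 7: "Application",
}

TCPIP_TO_OSI = {
    1: [1],         # TCP/IP level 1 -> OSI level 1
    2: [2, 3, 4],   # TCP/IP level 2 -> OSI levels 2-4
    3: [5, 6],      # TCP/IP level 3 -> OSI levels 5-6
    4: [7],         # TCP/IP level 4 -> OSI level 7
}

for tcpip_level, osi_levels in TCPIP_TO_OSI.items():
    names = ", ".join(OSI_NAMES[n] for n in osi_levels)
    print(f"TCP/IP level {tcpip_level} covers OSI: {names}")
[/code]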
 
Correct and complete! :goodjob:

Continue please.
:D
 
Why are computer errors called "bugs"?
 
A long time ago, an early computer was malfunctioning. When they took it apart to figure out what went wrong, it turned out that a moth was causing the problem... it was removed, but the discussion about the insect in the machine stuck... and instead of moth or insect, they referred to it as a bug in the machine. It came to mean any general glitch in the hardware of the era (the 1940s). As hardware became more reliable, more and more glitches originated in the analogue wiring, and later in the digital software... and today, it usually refers to a glitch (or unintended error) in software.

:)
 
As I remember it, the computer was ENIAC, and the moth
had shorted out a rack of vacuum tubes.

Tag, you're It!
 
You are probably right... I was not 100% sure about it, as I didn't take time to look it up :eek: ....


Hmmm.... What is the difference between USB 1.xx and the fairly new USB 2.xx? What are the approximate data speeds of each?
 
HINT: USB 2.xx is much faster than USB 1.xx. PS: You can look things up in this thread (i.e., you don't have to know it from memory ;) ).
 
OK, so no one wants to talk about USB 2.0, and since 48 hours is up, it's time for an answer and a new question ;).

Answer:

USB 2.0 runs at speeds up to 40 times faster than USB 1.1. USB 2.0 includes everything that USB 1.1 offers and adds a high-speed mode, which runs at 480 Mbps. USB 1.1 supports two speed modes, 1.5 and 12 Mbps, whereas USB 2.0 has three of them: 1.5, 12 and 480 Mbps. USB 2.0 also uses the same USB 1.1 compliant cables to connect high-speed devices. However, classic USB hubs will slow down USB 2.0 devices. In addition, a USB 2.0 host controller is required to enable the high-speed connection with a USB 2.0 device. Plugging a USB 1.1 device into a USB 2.0 hub is okay, but connecting a USB 2.0 device to a USB 1.1 hub is prohibited. USB 1.1 devices still operate at 12 Mbps at full-speed and 1.5 Mbps at low-speed on a USB 2.0 bus. Even though USB 1.1 devices won't run any faster, they can work alongside USB 2.0 devices on the same bus.

SOURCE: http://www.everythingusb.com/usb2/faq.htm
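
To get a feel for what those numbers mean, here is a quick back-of-the-envelope calculation in Python (the 100 MB file size is my own example, not from the FAQ, and real-world throughput is lower because of protocol overhead):

[code]
# Rough transfer-time comparison for the bus speeds quoted above.
speeds_mbps = {
    "USB 1.1 low-speed": 1.5,
    "USB 1.1 full-speed": 12,
    "USB 2.0 high-speed": 480,
}
file_size_bits = 100 * 8 * 10**6  # 100 MB (decimal megabytes) in bits

for name, mbps in speeds_mbps.items():
    seconds = file_size_bits / (mbps * 10**6)
    print(f"{name}: ~{seconds:.1f} s for 100 MB")
[/code]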



New Question:

What is the basic data transfer speed of a 1X CD ROM? (either KB or Kb)
 
I think it's 150 kBytes per second. :yeah:

At least my old 4.4X SCSI CD-ROM makes 660 kB/s, so that must be it.
:D
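
Quick sanity check of those numbers in Python:

[code]
# A 1X CD-ROM moves 150 kB/s, so a 4.4X drive should manage roughly
# 4.4 * 150 = 660 kB/s, which matches the figure above.
base_kb_per_s = 150
multiplier = 4.4
print(f"{multiplier}X drive: ~{multiplier * base_kb_per_s:.0f} kB/s")
[/code]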
 
Hmm, again, let me think. :enlighten

How about this:
What was the reason for the crash of the first flight of the Ariane V rocket?

It was a computer error, of course, but what exactly?
:D
 
Originally posted by Lucky
Hmm, again, let me think. :enlighten

How about this:
What was the reason for the crash of the first flight of the Ariane V rocket?

It was a computer error, of course, but what exactly?
:D

The Ariane V computers were loaded with Ariane IV software!
Not even the great and glorious :rolleyes: Ada language could
deal with that!

Strictly speaking, this was human error, even though the
error manifested itself in the computer system.
Remember GIGO!!!
 
Hmm, yes and no. They used the older IV software but reworked it. They forgot to take out a few unnecessary steps. :eek:

BUT what was the computer error that led to the disaster?

It was something every programmer has experienced at least once.
:D
 
Originally posted by Lucky
Hmm, yes and no. They used the older IV software but reworked it. They forgot to take out a few unnecessary steps. :eek:

It was something every programmer has experienced at least once.
:D

Must've been a cut-and-paste error then. The problem (IIRC) was that the S/W was reacting to sensor readings as if it were in
an Ariane IV instead of a V, and in IV terms, the data indicated
a situation that required self-destruct or some such.
 
Yes, that is correct, BUT not the actual computer program error! :p

The actual mistake was an overflow error when converting the altitude data from float to integer. It worked on the Ariane IV because that rocket did not reach such high values at the same point in the flight.

So OVERFLOW ERROR, was what I wanted to hear. :yeah:

Go on anyway!
:D
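
For anyone who wants to see that failure mode, here is a minimal Python sketch (the real flight software was written in Ada, and the value 40000.0 below is purely made up so that it overflows a signed 16-bit integer):

[code]
# Converting a floating-point flight value into a 16-bit signed integer
# that cannot hold it. On Ariane V the equivalent unhandled conversion
# exception shut the inertial reference system down.
import struct

flight_value = 40000.0  # bigger than the 32767 a signed 16-bit int can hold

try:
    struct.pack(">h", int(flight_value))  # ">h" = 16-bit signed integer
except struct.error as exc:
    print("Overflow on conversion:", exc)
[/code]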
 
OK, continuing in a jugular vein:

Why did Mariner 2 (First probe to Venus) fail?
 
Mariner 1 failed because it strayed off the safe flight corridor on its approach to Venus. Mariner 2's problems focused mainly on overheating issues, but then contact was eventually lost. (Hey, back then computers normally required air conditioning, so it should have been expected.)
 
Yes, but there was a specific error that caused Mariner 2 to lose contact. Hint: it was software related, and it can't happen today.
 
Well, they probably did use paper tape, but the error I am referring to was in actual code. Again, it wouldn't happen
today (and *not* because the language doesn't exist
anymore).
 