Does using a computer at a young age make you more proficient?

aimeeandbeatles

I got curious and tried to google this and got stuff about computer education in school, which wasn't what I wanted. So I'll post it here:

At a young age (I'd say 5-10), does using the computer informally (messing around with it on your own rather than formal instruction in school) make you more proficient later on?

This may be an interesting discussion. Someone suggested to me that the reason I'm fairly good at computers is that I started using them at 5. And I was typing even before that.
 
Don't think so. I only had access to a PC in my university years, and yet today I count myself as being reasonably competent at using one. It just takes effort to learn, like everything else.
 
I think it all depends. I used computers at a young age, including some rudimentary programming. That led to an early-adulthood pursuit of computer languages, but I switched degrees, and as an adult I have no programming skill left, just memories of hacking in assembly and Pascal.

For instance, I grew up with computers, but learning to Google for information was a culture shock for me, since Google arrived just after my school years (it was too new), and I was largely computer illiterate for years until computer use became important at a job.

So maybe it helps, but like anything, it's a perishable skill. And some detailed topics are only tangentially related to others, so knowing one is not much of an advantage with another. On the other hand, I'd say I took to DIY computer building because I understood computer architecture from ASM programming.
 
No, not at all is probably the right conclusion. I doubt it really even helps people as typists unless a kid is formally trained. However, I could see the argument that learning programming skills and related mathematics and computer science at a younger age might be very helpful, especially concepts like recursion, but that's very much different from what the OP describes.
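To make the recursion point concrete, here's the classic textbook sketch in C++ (purely an illustration of the concept, not tied to any particular curriculum or age group):

```cpp
#include <iostream>

// Classic illustration of recursion: a function defined in terms of itself,
// with a base case that stops the self-reference.
unsigned long long factorial(unsigned int n) {
    if (n == 0) return 1;           // base case: 0! is 1
    return n * factorial(n - 1);    // recursive step: n! = n * (n-1)!
}

int main() {
    std::cout << "5! = " << factorial(5) << '\n';  // prints 5! = 120
}
```

A function calling itself until it hits a stopping condition is the kind of idea that transfers to whatever language you end up using later, which is why learning it early could pay off.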
 
Interesting thing: Learning languages is easier at a younger age. Do programming languages count?
 
Getting a familiarity with using computers at a young age certainly cannot hurt, but I don't think it automatically makes you more proficient later on in life.

I don't really know what "better at computers" means, though. A computer is a tool, not a skill in and of itself - you don't get "better at hammers" or "better at screwdrivers," but you might improve your carpentry skills (which involve those tools) through practice and experience.
 
Yes, obviously. Just like playing an instrument at a young age makes you more proficient. There are two reasons: 1) If you start off early you get more practice, and 2) it's generally easier to learn things when you're young than when you're older. Even if you don't believe the second argument, the first argument is trivially true and obvious.
 
Sure. Depends on what you're doing with it, though. I was messing around with a bunch of different things on old PC clones (286, 386, 486) as I grew up, and I became proficient with DOS. Did rudimentary C++ and some 3D modeling, back when you wrote lines of code to make 3D images rather than using Photoshop... ah, those were the days. (Actually they weren't, because they sucked and were super slow and complicated and made no sense, but you get the point...)

Then Windows 95 came along and DOS proficiency (knowing how to mess with autoexec and other startup files, batch processes, virtual memory, yada yada) went out the window, or at least was relegated in importance, since this was the first Windows (IIRC) that integrated the two, i.e. you did not boot into DOS and then execute a command to run Windows. And "the registry" became the important thing to know how to tinker with. I had to learn it all over again, but this was around the time that I started high school and other things, like being outside and trying to make girls let me touch their boobies, became more important to me.

But I would say these early experiences made me more proficient with, or perhaps less intimidated by, computers. Nowadays, though, the difference seems to be more: 1) people who know nothing; 2) people who know a little bit; 3) people who are smart enough to google it and figure it out on their own; 4) people who know enough to manage 90% of the issues they come across and google the ones they do not know; and 5) experts who exist on another level than us plebeians. My prior experience puts me in the 3 to 4 range.
 
Well, having used every kind of system from VMS to Windows 7, I think that there are two things worth mentioning:
- from an operator's point of view, each system is different, and often so different that experience with one doesn't translate to another.
- from a computer scientist's (and, to some degree, a software developer's) point of view every system and piece of software relies on the same underlying concepts and experience with one will be of at least some use with others.

What most people understand by "using a computer" is the first. I don't think it is worthwhile to go about acquiring experience with any particular system just for its own sake. Such experience is only worth acquiring on the job, or if a use for it is at least already planned. And if one intends to become a software developer but has no specific project planned, it is far better to learn mathematics and algorithms than any specific programming language. Donald Knuth's books are usually highly recommended, though probably too "heavyweight" for most developers. I never had enough interest to read them all.

There's one additional thing worth mentioning: the interesting phenomenon of the same solutions appearing over and over in increasingly smaller devices. The tendency is to start out programming very close to the hardware but, over the years, to do most of the work through more and more abstraction layers (which can be conceptually very different). At the same time, smaller devices go through the same development a few years "delayed": mainframes were the first to offer virtual machines, for example, then desktops got them, and now smartphones are getting them too. Someone used to programming "close to the hardware" for personal computers in the 1990s might have put the same skills to use in mobile phones in the 2000s, and now in some household appliances in the 2010s. This phenomenon, though, may be reaching its limits now.
 
There's a sort of computer science thing called WIMP -- Windows, Icons, Menus, Pointer, I think it is. It dates back to the 1970s or 1980s, and it's still very useful today even though I don't think it's called that anymore. Interesting, I think.
 
Yeah I agree with what aimee just said - there are some important concepts that persist. What you can click on in Windows 3.1 is basically the same as what you can click on on MacOS or Win 7 or any Linux interface. When you start using websites, the same concepts naturally persist, even though they don't really need to -- you still have a "menu" bar, dropdowns, etc that replicate what the OS is doing.

When my parents use a computer, they have literally no idea what they can click on and what they can't click on. It's completely alien to them, that this rounded thing here with "next" on it is called a "button" and you can click on it, but the word that says "installing program" is a "label" that you can't click on. They literally just see a flat canvas of colours and words, without any of the contours that define what "stands out" as something we more experienced users identify as "things I can click on". They would look at the word "quick reply" just above and to the left of this box that I'm typing in now, and click on that, wait 15 seconds, then wonder why on Earth nothing is happening. It's not intuitive for them to see those as just a "label", but this box here as a "text box" that I can type into.
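To put that button/label difference in code terms (just a made-up sketch, not any real toolkit's API), the distinction my parents can't see is perfectly explicit on the software side:

```cpp
#include <iostream>
#include <string>
#include <vector>

// Hypothetical sketch of how a GUI might separate "things you can click"
// from "things you can't" -- not any real toolkit's actual API.
struct Widget {
    std::string text;
    bool clickable;   // a button responds to clicks, a label does not
};

int main() {
    std::vector<Widget> screen = {
        {"Next", true},                    // a button
        {"Installing program...", false},  // a label that merely reports status
        {"Quick Reply", false}             // looks important, still just a label
    };

    for (const auto& w : screen) {
        std::cout << '"' << w.text << "\" is "
                  << (w.clickable ? "clickable" : "just a label") << '\n';
    }
}
```

Experienced users read that flag off the visual contours without thinking about it; new users have to build the intuition from scratch.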

Our generation is growing up having to develop new intuitions that are entirely separate to our evolved intuitions regarding the physical world. Other domestic technological breakthroughs in our parents' or grandparents' time, such as the internal combustion engine or washing machines, depended on physical interactions and could be understood by appealing to our innate physical intuition. I know that the chair I'm sitting on is different to the carpet it's standing on (and I'd know this with one eye shut or in a painting). I intuitively know that a bird flying is "alive" whereas a rock getting flung from a tree is "not alive". But without practice, I wouldn't intuitively know that the little things that say "quote" and "multi" to the bottom right of each post are "buttons" whereas the thing that says "Quick Reply" above this box I'm typing in is "not a button". You generate a new set of intuitions from practising with computers, just like with everything else. And the earlier you start, the more practice you get, and the more intuitive it becomes.

Probably the only other breakthrough in the field of technology that we had to develop a new set of intuitions for was electricity: it's not intuitive that rotating this magnet with wires wrapped around it can make that bit of metal over there start to glow. We now all know that a circuit must be complete, or what it means to be "grounded", and our parents understand this as well as we do (mostly). With plumbing or carpentry, you can just see how things work, without knowing any new concepts beyond a basic intuition about the physical world. But with things like electricity, computers, and cooking, it's hard to see cause and effect. It's easy to see that water goes down this thing here, spins that twirly thing, which turns that handle and grinds the corn. But it's much harder to see that if I wind more wires around the magnet, the bit of metal glows brighter, or if I click on this button here, I'll close that "window" (whatever that means), or if I mix all this crap together and put it in the oven for an hour, it'll turn into a pie. F-ing ovens: how do they work?!


@Illram: Yeah, obviously, moving from DOS to Windows is a huge jump, because one is a CLI and the other is a GUI. But almost nothing has changed between Windows 95 and Windows 7 - that's 15 years of near-identical user experiences! I dare say, someone in 1993 using Windows 3.1 could jump straight to Windows 7 without much fuss.
 
Well, Windows 7 is slowly moving toward the silly ribbon-based interface. I got used to it but I really don't like it much. I'm used to menus.
 
Interesting thing: Learning languages is easier at a younger age. Do programming languages count?

I'm now a computer science student, and I must say that at my age I find it easier to pick up a programming language than when I was younger. I recall being about 10 when I attempted to learn C++, and I never got much further than Hello World.
Now, however, I'm able to write much more complex programs in C++, and I'm at a pretty intermediate level with C#.
Besides, I figure skilled programmers over 30 are able to pick up a new programming language quickly simply because they are already programmers.

So I think programming languages are usually an exception to the rule that everything is supposed to be easier to learn when you are younger.
 

I tried reading my uncle's notes once and noticed he writes in hexadecimal. Is this anywhere near normal?
 