
Learning Programming

One thing C++ certainly isn't, is a good language for learning programming.
 
I'm not sure that's a reason not to pick a language (and I say this as someone who hates Flash). I mean, lots of devices have been sold that don't support Flash - or indeed don't support many other languages. It would be like saying don't pick a language because it's not supported by 1% of Linux users, or even more significantly, don't write anything that's Windows-only because you "only" get 90% of the market. And you certainly wouldn't write for anything that only supports OS X or iPhones, because of all the other commercially successful platforms like Windows, Android, Symbian etc. Surely just because some device somewhere has had commercial success (which doesn't say very much) doesn't mean you have to support it or you're doomed.

As languages go, I would guess that the number of devices that can run Flash is higher than for many languages out there (simply because it runs on most desktops and smartphones, whereas many languages only run on one platform).

As for the whole "Apple killed Flash" story, I'm not convinced. Firstly, the credit should go to the emergence of open standards like HTML 5. The problem with the Apple argument is that websites have catered for those devices by shipping their sites in the form of binary executables that only work on an Apple device. This isn't exactly a step forward for open standards or accessibility, and it doesn't provide any argument for killing Flash (since Apple devices are now catered for specifically, you're still free to use Flash). Moreover, there are hundreds of millions of Internet-capable feature phones, not just Apple's, and these are catered for by mobile-specific versions of websites - and it's here you see the use of HTML, but without Flash. Plus the *big* problem with this logic is that, if it were true, we shouldn't see the aforementioned Apple-only apps. But evidently people have no problem locking out some platforms.

The "Apple will kill Flash" story has been around for years - this smells like one of those stories that the media hype as a prediction before, then it gets suddenly accepted as a fact based on all the previous stories (e.g., the "two billion will watch the UK royal wedding", then after news reports simply requoted the two billion figure). Apple benefit significantly from this, as they get tonnes of media hype even before a product is announced ("ISlate", "IPhone 5"), along with claims about how these devices will be revolutionary etc; later, the large amount of coverage is cited as "evidence" that it was therefore revolutionary, despite the hype preceding the devices.

But anyhow. I'd also note that the evil of Flash was when people were writing their whole websites in Flash (same as they now do with Apple-only apps). If you actually want to write an online game, I don't see that Flash is bad. We might prefer that they choose HTML 5 now, but I don't think that kind of quibbling is any more relevant than, say, preferring someone choose Java over C#, or OpenGL over DirectX. It's not going to stop the majority of people playing it.

Apple devices don't support Java either - is that a bad language too? I don't think so. Whilst it gets less hype these days, it's still a widely used language that's cross-platform across a large number of commercially successful platforms, and can be used for gaming too.

Apple did not single-handedly kill Flash, but even if it did, that's not my main point. My main point is that Flash is a waning platform. Whether Apple did it or the market at large did is irrelevant. What matters is that Flash is going under. Why would anyone want to learn a language that is clearly going to lose? Again, I'm not an Apple fanboy. In fact I'll even go as far as to say that I think Apple computers are pretty darn overpriced. I'm just saying that, for whatever reason, Flash is going down.

As to your example of a Windows-only program with "only" 90% of the market share... 90% is a lot.
 
I'm learning to program at the moment, and using Java. One good thing about it is that you get to do pretty complex stuff and have graphical interfaces quickly.

I gathered that the thing to do if you're into maths is to learn C. Is this correct?

EDIT: i mean, lol, be4 u can code, u gotta learn to c.
 
C99 is good for maths, and C++ has some good maths libraries too.

The possibility of pointer aliasing in C (pre-99) prevented some optimisations that FORTRAN compilers could do way back, so there's still a lot of FORTRAN code about. Now that C99 (and most C++ compilers, via nonstandard extensions) support restrict pointers, C has caught up again.
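To make the aliasing point concrete, here's a minimal sketch - the function name is made up, and I'm using the nonstandard __restrict spelling that most C++ compilers accept (plain C99 spells it restrict):

```cpp
#include <cstddef>

// With restrict-qualified pointers the compiler may assume a, b and out
// never overlap, so it's free to vectorise the loop the way a Fortran
// compiler always could. Remove the qualifiers and it has to be cautious.
void add_arrays(const double* __restrict a,
                const double* __restrict b,
                double* __restrict out,
                std::size_t n)
{
    for (std::size_t i = 0; i < n; ++i)
        out[i] = a[i] + b[i];
}
```

The promise is on the caller: if the arrays actually do overlap, the behaviour is undefined.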

There are also some nice parallelism libraries and facilities for offloading computation onto GPUs/floating-point units these days.
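OpenMP is one example of the parallelism side - a rough sketch, assuming a compiler with OpenMP enabled (e.g. -fopenmp on GCC):

```cpp
// Each iteration is independent, so the pragma below can safely split
// the loop's iterations across however many cores are available.
// Without OpenMP enabled, the pragma is ignored and the loop runs serially.
void scale_all(double* data, long n, double factor)
{
    #pragma omp parallel for
    for (long i = 0; i < n; ++i)
        data[i] *= factor;
}
```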

This is all irrelevant if you don't care about execution speed, though - use Mathematica/Matlab or something ;)

EDIT: C is pretty easy to learn if you know Pascal, anyway.
 
I don't know Pascal, but I learnt some BASIC when I was a kid. ;)

I don't need it for anything at the moment, but I've read that it would be appreciated in private-sector mathematician jobs. Mathematica seems to be, too: when I was applying for a job at an insurance company last summer, they said they primarily just program with it. Unfortunately I had only used it to calculate things.

I don't think I'm going to learn C just yet, and they probably have a course on it in the autumn at the uni, but for the future: Is it wise to learn pre-99 C? Is there anything equivalent to The C Programming Language by K&R for C99 or C11? It looked very nice (and concise).

Have to say, programming is great fun. At least at this stage. I've already made a matrix calculator and stuff. First I made it purely procedural, but then when I learnt about objects, I had a revelation that it'd be good to make at least a vector object to simplify the multiplications. :D
 
C99 is very similar to C pre-99 really.

Operator overloading is nice in C++, yes ;)
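A toy example of what that buys you - Vec2 here is a made-up two-component vector, not a standard type:

```cpp
// With overloaded operators, vector maths reads like the maths itself.
struct Vec2 {
    double x, y;
};

Vec2 operator+(Vec2 a, Vec2 b)   { return { a.x + b.x, a.y + b.y }; }
Vec2 operator*(double s, Vec2 v) { return { s * v.x, s * v.y }; }

// Usage: Vec2 c = 2.0 * a + b;  // instead of c = add(scale(2.0, a), b)
```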

You shouldn't make everything object-oriented though - there is no instance of a "Maths" object... Vectors already exist in C++ as resizeable arrays (std::vector), though what you probably want is a std::valarray.
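A minimal sketch of what std::valarray gives you over std::vector - element-wise arithmetic out of the box (C++11 syntax):

```cpp
#include <valarray>
#include <iostream>

int main()
{
    std::valarray<double> a = { 1.0, 2.0, 3.0 };
    std::valarray<double> b = { 4.0, 5.0, 6.0 };

    // Arithmetic applies element-wise: c is {6, 9, 12}.
    std::valarray<double> c = 2.0 * a + b;

    for (double x : c)
        std::cout << x << ' ';
    std::cout << '\n';
}
```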
 
You shouldn't make everything object-oriented though - there is no instance of a "Maths" object... Vectors already exist in C++ as resizeable arrays (std::vector), though what you probably want is a std::valarray.

Nah, the point of it was that I wanted to do it myself. I'm on a programming course at the moment, and the exercises are too easy, so I started to do my own stuff. And also some of the algorithm stuff from another course.

I don't know if the object thing was so necessary there, but it was more fun than making a person object with two variables, name and age, and corresponding getters.
 
An object needs to have state information.

So a vector class is good, but something like trigonometry doesn't require objects. In languages where everything has to be a class, you'd make a load of static functions inside a class. In C++ you'd just use free functions (i.e. not a member of a class, but perhaps in a namespace).
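Something like this, say - the names are just illustrative:

```cpp
#include <cmath>

// Trigonometry helpers carry no state, so free functions grouped in a
// namespace do the job; no class, no object, no static-method wrapper.
namespace trig {
    inline double deg_to_rad(double deg) { return deg * 3.14159265358979323846 / 180.0; }
    inline double sin_deg(double deg)    { return std::sin(deg_to_rad(deg)); }
}

// Usage: double s = trig::sin_deg(30.0);
```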

EDIT: A class could just define an interface instead of holding state. That's known as an abstract class in C++; I think Java and C# both have interface keywords instead (but neither allows proper multiple inheritance, only single inheritance plus inheriting as many interfaces as you want).
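Roughly like this in C++ - Shape/Circle are the usual textbook stand-ins:

```cpp
// An "interface" in C++ is just an abstract class: all pure virtual
// functions, no data members. Java/C# spell the same idea `interface`.
class Shape {
public:
    virtual ~Shape() {}
    virtual double area() const = 0;   // pure virtual: deriving classes must implement
};

class Circle : public Shape {
public:
    explicit Circle(double r) : radius(r) {}
    double area() const { return 3.14159265358979323846 * radius * radius; }
private:
    double radius;
};
```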
 
I'm learning to program at the moment, and using Java. One good thing about it is that you get to do pretty complex stuff and have graphical interfaces quickly.

That to me sounds a bit dangerous. A lot of people just dive into programming and think that all there is to it are if statements, loops, and variables.

I think you really need to get away from the keyboard and read up on fundamental principles like polymorphism, classes, constructors, client/server etc. before getting into anything too complex. It will make you a good programmer.
 
That to me sounds a bit dangerous. A lot of people just dive into programming and think that all there is to it are if statements, loops, and variables.

I think you really need to get away from the keyboard and read up on fundamental principles like polymorphism, classes, constructors, client/server etc. before getting into anything too complex. It will make you a good programmer.

Well, actually, Java (not Javascript, which is totally unrelated) is a good choice if you're trying to learn general object-oriented programming, at least by virtue of the fact that almost every other language is worse for beginners. Most introductory college-level courses in programming are based on Java -- unless there's been a big switch in recent years of which I'm unaware. The JVM insulates you from the underlying platform, you don't have to worry much about memory management, you have reasonable ways of dealing with errors, etc.

You definitely don't want to try to learn programming on a dynamically typed, interpreted language like Python, where you don't discover your typos until runtime. And you don't want to start with something like C++, which is a little like getting behind the controls of a 747 when you've never flown before. If you go to either of those extremes you'll probably pick up a lot of bad habits that you want to avoid.

However, I do agree with the point about learning about programming in general without reference to a specific language.
 
I think you really need to get away from the keyboard and read up on fundamental principles like polymorphism, classes, constructors, client/server etc. before getting into anything too complex. It will make you a good programmer.

That's probably the next step: we had a six-week course which covered the basics, and now there's going to be another six weeks for inheritance etc. I might also read something on it, since they seem to have a pretty easy attitude on those courses.
 
Well, inheritance can be abused as well.

Inheritance is the programming equivalent of "is a" - as in, a cat is an animal (and so is a dog).

But it breaks down if the "is a" relationship adds extra constraints... a square is a rectangle, but a rectangle class will have width and height, whereas a square doesn't need both, so this would be a bad use of inheritance.
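A sketch of how that plays out in code - the classic textbook version:

```cpp
// Square "is a" Rectangle mathematically, but inheriting breaks the
// base class's contract: setting one dimension must not touch the other.
class Rectangle {
public:
    Rectangle(double w, double h) : width(w), height(h) {}
    virtual ~Rectangle() {}
    virtual void set_width(double w)  { width = w; }
    virtual void set_height(double h) { height = h; }
    double area() const { return width * height; }
protected:
    double width, height;
};

class Square : public Rectangle {
public:
    explicit Square(double side) : Rectangle(side, side) {}
    // To stay square, each setter has to silently change the other
    // dimension too - surprising any code that only sees a Rectangle&.
    void set_width(double w)  { width = height = w; }
    void set_height(double h) { width = height = h; }
};
```

Code that takes a Rectangle&, sets width to 4 and height to 5, expects an area of 20; hand it a Square and it gets 25.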

EDIT: Think it is time for a "Let's discuss computer programming" thread...
 
One thing C++ certainly isn't, is a good language for learning programming.

I disagree. You don't have to start off with all the bells and whistles. You can write straight C code in a C++ disguise to learn the basics. Once you're comfortable with that, then you can move on to classes, .NET :vomit:, MFC etc.
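Something along these lines - compiles fine as C++, but uses nothing beyond C-style basics:

```cpp
#include <cstdio>

// "Straight C in a C++ disguise": variables, a loop and printf -
// enough to learn the fundamentals before any classes appear.
int main()
{
    int total = 0;
    for (int i = 1; i <= 10; ++i)
        total += i;
    std::printf("sum of 1..10 = %d\n", total);
    return 0;
}
```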

I too learned back in the days when you needed to know assembly, and it was useful to be able to read (and do simple patches in) machine code. Ah, the good old days. These modern languages hide too much from the programmer. :old:
 
MFC? Blimey. Everyone hates MFC ;)
 
Well, actually, Java (not Javascript, which is totally unrelated) is a good choice if you're trying to learn general object-oriented programming, at least by virtue of the fact that almost every other language is worse for beginners.

I think that's actually what the university I was studying at switched to right after I was done: most of the instruction up to that point (for introductory courses, anyway) was done in OO Turing and OO Pascal. I'm talking about "learn how to program" courses - we were thrown right into C++ & Java in first year, without any of that "learn how to program" stuff.
 
C/C++ - Mostly used for DOS/Command Shell applications or to code the Windows API (interface) from scratch. Hardcore old-school programmers swear by this method. (even more hardcore programmers swear by Assembler)

Visual Basic .NET - Mostly for business applications that need to be built quickly. (BTW, even though I've learned about recursion, bit shifting, tree sorting, linked lists, double pointers (and heard of triple pointers :scared:), I've yet to use them, or even see them in the business world.)

VB.NET allows triple pointers? So different from the VB6 I use! I know I've used a triple pointer in C at least once. It did its job nicely, and no one even complained about it.
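For the curious, one plausible shape for a triple pointer - this is purely illustrative, not the actual code I used:

```cpp
#include <cstdio>
#include <cstdlib>
#include <cstring>

// An out-parameter that receives a freshly allocated array of C strings:
// char** is the array, so the parameter taking its address is a char***.
void make_names(char*** out, int* count)
{
    static const char* src[] = { "alpha", "beta" };
    *count = 2;
    *out = (char**)std::malloc(*count * sizeof(char*));
    for (int i = 0; i < *count; ++i) {
        (*out)[i] = (char*)std::malloc(std::strlen(src[i]) + 1);
        std::strcpy((*out)[i], src[i]);
    }
}

int main()
{
    char** names;
    int n;
    make_names(&names, &n);
    for (int i = 0; i < n; ++i)
        std::printf("%s\n", names[i]);
    for (int i = 0; i < n; ++i)   // free each string, then the array
        std::free(names[i]);
    std::free(names);
}
```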

Is C/C++ really used that much for Windows API stuff? I guess I wouldn't know if it is, since most of my C/C++ coding has been on Linux, and most of my other language stuff has been on Windows or Unix.

You can't use .NET for plain C. You can use the nasty C++/CLI extensions to do managed C++ in .NET though.

I'm almost certain you can. Just change the file to .c instead of .cpp. Seems to work in a quick test. stdio.h and printf work like magic; try to use iostream and I get errors like crazy. I have a few old projects in .NET that use plain C, but don't have all the libraries installed to test them now.

I think the caveat is that if you use plain C in .NET, you are limited to a fairly old version of C (C89, perhaps?). Which can be mildly inconvenient if you're used to writing for a more recent version or are porting a program. But if you're willing to live with that, it's possible.

And the alternative of using C++ but ignoring the C++ features will get you most of the way, too.
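If anyone wants to try it, a sketch of the C89-subset style that the older compiler mode enforces (declarations before statements, old-style comments) - save it as a .c file and Visual C++ should build it as C rather than C++; it's the common C/C++ subset anyway:

```c
#include <stdio.h>

int main(void)
{
    int i;              /* C89: all declarations up front */
    int squares[5];

    for (i = 0; i < 5; i++)
        squares[i] = i * i;

    printf("last square: %d\n", squares[4]);
    return 0;
}
```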

Apple devices don't support Java either - is that a bad language too? I don't think so. Whilst it gets less hype these days, it's still a widely used language that's cross-platform across a large number of commercially successful platforms, and can be used for gaming too.

Not true. Apple computers do support Java. Even when they are running OS X.
 
We use C++ on Windows for games tools (we have our own lightweight framework - based on MFC but much cut down; it predated the [now no longer supported] WTL).

Some GUI-heavy tools use C# though. The stuff that does all the hard work (exporting animations, meshes, etc.) uses C++.
 