ParadigmShifter
One thing C++ certainly isn't, is a good language for learning programming.
I'm not sure that's a reason not to pick a language (and I say this as someone who hates Flash). Lots of devices have been sold that don't support Flash - or indeed, many other languages. It would be like saying don't pick a language because it's not supported by 1% of Linux users, or, even more significantly, don't write anything Windows-only because you "only" get 90% of the market. And you certainly wouldn't write for anything that only supports OS X or iPhones, given all the other commercially successful platforms like Windows, Android, Symbian etc. Surely just because some device somewhere has had commercial success (which doesn't say very much) doesn't mean you have to support it or you're doomed.
As languages go, I would guess that the number of devices that can run Flash is higher than for many languages out there (simply because it runs on most desktops and smartphones, whereas many languages only run on one platform).
As for the whole "Apple killed Flash" story, I'm not convinced. Firstly, the credit should go to the emergence of open standards like HTML 5. The problem with the Apple argument is that websites have catered for those devices by rewriting their sites as binary executables that only work on an Apple device. For one thing, that isn't exactly a step forward for open standards or accessibility; for another, it doesn't provide any argument for killing Flash (since Apple devices are now catered for specifically, you're still free to use Flash everywhere else). Moreover, there are hundreds of millions of Internet-capable feature phones, not just Apple devices, and these are catered for by mobile-specific versions of websites - and it's here you see the use of HTML, but without Flash. Plus the *big* problem with this logic is that, if it were true, we shouldn't see the aforementioned Apple-only apps. But evidently people have no problem locking out some platforms.
The "Apple will kill Flash" story has been around for years - it smells like one of those stories that the media hype as a prediction, which then suddenly gets accepted as fact on the strength of all the previous stories (e.g. "two billion will watch the UK royal wedding", after which news reports simply requoted the two billion figure). Apple benefit significantly from this: they get tonnes of media hype even before a product is announced ("iSlate", "iPhone 5"), along with claims about how revolutionary these devices will be; later, the sheer amount of coverage is cited as "evidence" that they were therefore revolutionary, despite the hype preceding the devices.
But anyhow. I'd also note that the evil of Flash was when people were writing entire websites in Flash (same as they now do with Apple-only apps). If you actually want to write an online game, I don't see that Flash is bad. We might prefer that they choose HTML 5 now, but I don't think that kind of quibbling is more relevant than, say, preferring someone choose Java over C#, or OpenGL over DirectX. It's not going to stop the majority of people playing it.
Apple devices can't run Java either; is that a bad language too? I don't think so. Whilst it gets less hype these days, it's still a widely used language that's portable across a large number of popular, commercially successful platforms, and it can be used for gaming too.
You shouldn't make everything object-oriented, though; there's no such thing as an instance of a "Maths" object... Vectors already exist in C++ as resizable arrays (std::vector); for mathematical vectors, what you probably want is a std::valarray.
I'm learning to program at the moment, and using Java. One good thing about it is that you get to do pretty complex stuff and have graphical interfaces quickly.
That to me sounds a bit dangerous. A lot of people just dive into programming and think that all there is to it are if statements, loops, and variables.
I think you really need to get away from the keyboard and read up on fundamental principles like polymorphism, classes, constructors, client/server etc. before getting into anything too complex. It will make you a good programmer.
One thing C++ certainly isn't, is a good language for learning programming.
Well, actually, Java (not JavaScript, which is totally unrelated) is a good choice if you're trying to learn general object-oriented programming, if only by virtue of the fact that almost every other language is worse for beginners.
MFC? Blimey. Everyone hates MFC!
C/C++ - Mostly used for DOS/command-shell applications or to code against the Windows API from scratch. Hardcore old-school programmers swear by this method. (Even more hardcore programmers swear by assembler.)
Visual Basic .NET - Mostly for business applications that need to be built quickly. (BTW, even though I've learned about recursion, bit shifts, tree sorting, linked lists, and double pointers (and heard of triple pointers), I've yet to use them, or even see them, in the business world.)
You can't use .NET with plain C. You can use the nasty C++/CLI extensions to write managed C++ on .NET, though.