Are C and C++ Losing Ground? - Slashdot
Pickens writes: "Dr. Dobb's has an interesting interview with Paul Jansen, the managing director of TIOBE Software, about the Programming Community Index, which measures the popularity of programming languages by monitoring their web presence. Since the TIOBE index has now been published for more than 6 years, it gives an interesting picture of trends in programming language usage. Jansen says not much has affected the top ten programming languages in the last five years, with only Python entering the top 10 (replacing COBOL), but C and C++ are definitely losing ground. 'Languages without automated garbage collection are getting out of fashion,' says Jansen. 'The chance of running into all kinds of memory problems is gradually outweighing the performance penalty you have to pay for garbage collection.'"
Related Links
The State Of Grayware On the PC
Submission: Are C and C++ Losing Ground?
C/C++ Back On Top of the Programming Heap?
Bill Prohibiting Genetic Discrimination Moves Forward
This discussion has been archived.
No new comments can be posted.
Are C and C++ Losing Ground?
The Fine Print:
The following comments are owned by whoever posted them. We are not responsible for them in any way.
C/C++ is dying!
(Score:, Funny)
by KillerCow (213458) on Thursday April 24, 2008 @04:34PM (#23188964)
But does Netcraft confirm it?
Re:C/C++ is dying! (Score:, Funny)
by Tackhead (54550) on Thursday April 24, 2008 @04:54PM (#23189286)
But does Netcraft confirm it?
No, but Stroustrup himself is reputed to have apologized for C++ as far back as 1998.
"It was only supposed to be a joke, I never thought people would take the book seriously."
- From the lost tapes of the legendary IEEE interview [nao.ac.jp] of 1998
:)
Re:C/C++ is dying! (Score:, Informative)
by SatanicPuppy (611928) <Satanicpuppy.gmail@com> on Thursday April 24, 2008 @05:01PM (#23189376)
1. Java.....20.5%
2. C........14.7%
3. VB.......11.6%
4. PHP......10.3%
5. C++.......9.9%
6. Perl......5.9%
7. Python....4.5%
8. C#........3.8%
9. Ruby......2.9%
10. Delphi...2.7%
The other 10 in the top 20 are:
JavaScript, D, PL/SQL, SAS, Pascal, Lisp/Scheme, FoxPro/xBase, COBOL, Ada, and ColdFusion
Re: (Score:, Interesting)
by dfiguero (324827)
Funny that, according to the site, Javascript is losing ground. I was thinking the opposite should be happening now that Ajax is so popular.
Re:C/C++ is dying! (Score:, Insightful)
by Anonymous Brave Guy (457657) on Thursday April 24, 2008 @05:55PM (#23190276)
Popular as in people using it, or popular as in lots of people writing about it?
Personally, I'm not convinced AJAX is that popular on the people-using-it count. It's a very useful technique for a particular niche, but it *is* only a niche.
Re:C/C++ is dying! (Score:, Interesting)
by ATMD (986401) on Thursday April 24, 2008 @06:47PM (#23191122)
I wouldn't say it's a niche. I recently got around to learning AJAX and proper DOM scripting and now I want to use it for everything. It makes the UI so much nicer not having to reload the entire page every time you have anything dynamic.
Re:C/C++ is dying! (Score:, Informative)
by leenks (906881) on Thursday April 24, 2008 @07:14PM (#23191524)
Lots of us (in enterprises at least) are realising (or rather, we are able to convince the project managers now) that webapps aren't the solution for everything, and that overall development time is often increased by the difficulties of developing in javascript/html.
Re:C/C++ is dying! (Score:, Interesting)
by MBGMorden (803437) on Thursday April 24, 2008 @09:38PM (#23192932)
I agree. My educational background was in C/C++ programming (command line on Unix systems), and during and immediately following school most of my hobby programming was in Borland C++ Builder. About 2 years ago I discovered PHP and went wild with that. It was fun, but that old saying started becoming true: "When all you have is a hammer, everything starts looking like a nail."
Web apps are nice and quick to develop, but I'm definitely starting to come back around to the idea that there is definitely a place for local apps, and most certainly for fast compiled code over interpreted.
Re:C/C++ is dying! (Score:, Interesting)
by PocketPick (798123) on Thursday April 24, 2008 @10:49PM (#23193488)
I completely agree - You know why I like C or C++? Because I only need to know one thing to do 90% of everything - C or C++. In the world of web development, I must not only be proficient in an equivalent number of libraries to those found on a desktop platform, but also any number of scripting languages (PHP/JavaScript/Ruby/etc), HTML/XHTML/XML/SGML/DTD/RelaxNG/XMLSchema, perhaps ColdFusion, or maybe Adobe Flex/Silverlight - and I'm probably only scraping the surface of the complexity of this odd little world.
Why the pain? Why not keep it simple? In spite of our advancement, it's amazing how much more practicality and common sense some software academics had 20 years ago compared to today.
When are web standards committees or intellectuals going to quit trying to one-up each other and start consolidating some of their standardizations?
Re:C/C++ is dying! (Score:, Insightful)
by joggle (594025) on Friday April 25, 2008 @02:04AM (#23194538)
It really comes down to different tools for different jobs. Having a vast number of tools at your disposal for free is not a bad thing, just get a cursory knowledge of each tool and what it's good for so that when your next project comes up you can make an informed decision on which one(s) to use.
Also, you make it seem like only knowing the C/C++ languages is sufficient to accomplish anything. That's really not true--at a minimum you need to know the STL for C++ related stuff, some GUI library for doing graphics, an XML library for doing XML manipulation, a database library for interacting with the database of your choice, a cross-platform library to write portable code, etc. Even if you're using something that does all of that (such as Qt) you still need to learn about XML, XMLSchema and DTD if you are using those technologies (just as you would for web programming).
Re:C/C++ is dying! (Score:, Insightful)
by utopianfiat (774016) on Thursday April 24, 2008 @07:19PM (#23191584)
I was convinced that in the scientific programming world, people were still into Fortran as it grants a slight increase in speed over C for certain algorithms. Of course, this wouldn't have a broad _web presence_ due to the fact that realtime and mission critical applications aren't posted on the web. I think the limit of the fortran that still exists publicly would be the open source ATLASes and MATLAB clones (ie:matplotlib), as well as, of course, Linux itself.
Re:C/C++ is dying! (Score:, Insightful)
by neokushan (932374) on Thursday April 24, 2008 @05:56PM (#23190292)
Surely it's not really a fair indication just because its web presence is dropping? I could easily argue that Java is only so "popular" because more people are posting with problems they're having using the language, and that C/C++ are only losing ground because better information on using them is already out there.
Python+Fortran or JAVA+Groovy (Score:, Insightful)
by goombah99 (560566) on Thursday April 24, 2008 @06:07PM (#23190496)
In the work I do--scientific calculations with a lot of fast numerics--python + fortran seems like nirvana, as each overcomes the shortcomings of the other. One could just as well use C, except the efficient numeric libs and memory layout give fortran an edge.
This of course is not the match made in heaven for everyone but numerical scientists should look hard.
What's so good?
Utility:
Well, there's a strong base of numeric libraries in Python that are fortran-array friendly, so there's a good base to grow off of.
Second, F2PY, which creates python modules out of fortran subs, works so well it's almost transparent and painless. Even cooler is that, because fortran compiles are ludicrously fast compared to C++, you can generate fortran code from python at run time and compile it on the fly, creating modules optimized to your problem.
Given you are wrapping in python, the availability of groovy C++ libs is not really very enticing at all given the pain you will pay for having to write the whole program in C++.
Practical:
Fortran as a stand-alone language kinda blows for versatility and modern program architectures. But if all you are doing is writing a function, then it's a sweet language, because its syntax is so tight that it's harder to make a syntax error that compiles, and the kind of hard-to-spot logic errors that hide in C seem to be rarer (e.g. using i++ instead of ++i, or doing i=4 instead of i==4, are not possible in fortran's limited syntax).
Thus you write functions and let python deal with all the memory management, human interface, file management, command-line arg parsing, and all the messy bits that end up being a lot bigger than the function where the program spends all its time.
Fortran is also very optimization friendly since things like matrix multiplies and out-of-order loops are part of the core language.
This is debatable but I find that fortran seems to have a more logical memory order in 2-d arrays. Namely if you take a sub-array you get elements that are consecutive in memory and thus for most microprocessors will all get pulled into the cache on the same page. Slices of C-arrays have consecutive elements spaced by the row width apart in memory. One can of course find cases where one is preferred over another.
I do however wish python had some way of optionally typing variables that was less cumbersome and slow than decorators or explicit run-time type checking. I virtually never write python that takes advantage of introspection or dynamic typing, so not being able to get some protection--optionally, and just to debug--via type checking is annoying.
But if I were starting from scratch and did not have a compelling need for all those wonderful fortran numeric libs, I think the optimal choice in the future is going to be Java + Groovy. Basically you learn one syntax and get the best of both interpreted and compiled languages. Develop it in Groovy, then migrate the slow bits to Java, and import all the great Java libs.
And since it's nearly the same syntax, it's easy to read.
Re:Python+Fortran or JAVA+Groovy (Score:, Insightful)
by goombah99 (560566) on Thursday April 24, 2008 @06:58PM (#23191268)
Replying to myself because I forgot to add why I think Java+groovy has a big future.
The big achilles heel of python is that it currently truly sucks for multi-core programming and it would appear that attempts to solve this are not coming quickly. It's global interpreter lock means that multi-threading gains almost no speed over a single processor. It's darn clumsy to fork in part because it takes so long for python to unwind it's stack when a job exits. And it's never written from the ground up to be thread safe.
Fortran 95 and 2003 have huge potential for multi-cores, since vector ops and out-of-order loops are part of the core language, the memory order of arrays can be favorable to vector ops, and the developers have been thinking about high performance computing as a driver.
However, neither fortran 95 nor python has notions of synchronizing and locking, so all the parallelism is implicit, not explicit. You'd rather have implicit parallelism to be sure, but sometimes you need explicit control.
JAVA was written with threading in mind from the beginning, so it can potentially embrace the coming multi-core revolution more quickly than other languages.
Re:Python+Fortran or JAVA+Groovy (Score:, Interesting)
by goombah99 (560566) on Thursday April 24, 2008 @09:50PM (#23193022)
I'm not convinced Java's "synchronized" facilities are a significant improvement over Python's global interpreter lock.
Java gets a (somewhat) linear speed-up when you add cores. Python gets virtually zero and in some cases it loses over unthreaded. Big difference!
Re:Python+Fortran or JAVA+Groovy (Score:, Informative)
by NovaX (37364) on Thursday April 24, 2008 @09:54PM (#23193060)
There are a number of differences, if I understand Python's giant lock approach correctly. They have basically adopted the threading model used in most early, single-processor operating systems (e.g. Linux, FreeBSD) where the system calls are protected by a shared lock. This works fine for single-processors, since multiple processes are implicitly serialized by task switching. However, as multiple processes run concurrently in hardware this immediately shows performance issues.
Java's *synchronized* keyword is a user-level mutex and not a single shared lock across the entire JVM. This means that data structures like HashMaps can use lock-splitting across buckets, or that threads executing independent code flows are not serialized across a single mutex boundary. With Java 5's support for CAS operations, more powerful locks and concurrency data structures are available. I have executed thousands of threads in a distributed master-worker fashion and, due to elegant lock semantics, have not suffered any performance issues due to synchronization. This means that Java is quite strong at both multi-core systems (where there are only a few CPUs) and distributed systems (where there are many).
I am personally a fan of an actor model (e.g. Erlang) for application developers and a lock model (e.g. C, Java) for infrastructure developers. I do not believe that the actor model works efficiently enough to be used in the guts of a system, such as caches, where performance is critical. Those are special areas that need a skilled hand; elsewhere, a lock model should be used sparingly in favor of an actor approach. This has been adopted for quite a long time, as message-based (queue) models are fairly standard in most large, distributed service-based applications.
Re:C/C++ is dying! (Score:, Interesting)
by jd (1658) <imipak@yaho[ ]om ['o.c' in gap]> on Thursday April 24, 2008 @06:22PM (#23190768)
ColdFusion should be shot with a silver bullet, stabbed through the heart with a stake, be stuffed with garlic, and be buried at a crossroads at midnight in a holy water-filled lead coffin with elder signs on all sides, inside and out. Other than that, I have no idea why it ranks in the top 20.
Delphi and Pascal are other puzzlers. Pascal is great as a teaching language, but there are later iterations of that family of languages - Modula-2 and Modula-3 - that arguably provide better rigor if rigor is what you are after. And I see no obvious reason to use Pascal or related languages if you're not after truly rigorous code.
C seems to be holding ground, the slight loss seems to be within the fluctuations other languages that are holding steady are seeing. It's too powerful, too close to bare-metal programming and too close to the actual machine architecture to fade for some time yet. C++ might genuinely be losing ground - C# and D provide a lot of the power and object-orientedness of C++ but make an effort to learn from the complexity of C++. Personally, I suspect D might stand a better chance as C# is still very much tied to a single vendor in people's minds. I don't see C++ vanishing, rather I see them reaching some common point and staying there.
VB is quick-n-dirty, and it's popular because it's so easy to write something in it. If it ever became unlawful to have a website that was dangerously insecure or a hazard to Internet traffic (in much the same way cars have to be inspected every so often in some places to ensure it meets certain minimum safety standards) I imagine Visual Basic would lose appeal. Well, that or the EU eventually raising the fines to the point of driving Microsoft out of international competition.
Given that so much new scientific code is still produced in Fortran, whereas not much is really written in COBOL although a lot of legacy code is maintained in it, I'm surprised COBOL is there and Fortran is not. (Fortran is popular enough that there are TWO competing front-ends for GCC for it. There are open-source COBOL compilers, but as far as I know, all work has stopped on all of them. To me, that says something about the level of interest and serious usage.)
Re:C/C++ is dying! (Score:, Insightful)
by billcopc (196330) <vrillco@yahoo.com> on Thursday April 24, 2008 @09:11PM (#23192718)
I agree with you on Coldfusion, simply because I'm forced to work with it on a daily basis. As a longtime "real" programmer, CF is an insult to my skill and experience, but sadly I need to eat.
Delphi though, slow down! Everyone keeps repeating how Pascal is a teaching language, yet it was my official language for many years. Back in the 90's I was developing commercial games with little more than Borland Pascal 7 and Turbo Assembler. I did the speedy bits in assembler, and the logic in Pascal. My development time was extremely short and my code was very reliable and reusable.
When Delphi happened, well honestly the first few versions stank, but I remember writing all sorts of apps in Delphi 4 (yes, even DirectX games). Delphi today has turned into a schizophrenic marketing clusterfuck thanks to Borland/Inprise/Codegear/TrendyNameOfTheMomentInc, but I think Delphi as language is just right for a large number of situations.
It's right in the sweet spot between useless VB and painful C, plus it compiles crazy fast and performs very respectably, given how easy it is to develop. In fact, its qualities closely resemble those of C#, only Delphi did it over 12 years ago. It's no coincidence: Microsoft hired the creator of Turbo Pascal, Anders Hejlsberg, to create C#, J++ and many key architectural features of .NET. If Borland hadn't gone mental in the mid-90s, .NET would not exist today; instead we'd have Borland's equivalent, and people would be praising Delphi just as they praise C# in today's reality. It would probably run a helluva lot faster too!
Re:C/C++ is dying! (Score:, Funny)
by jd (1658) <imipak@yaho[ ]om ['o.c' in gap]> on Thursday April 24, 2008 @08:25PM (#23192324)
Answer to Coldfusion's longevity - MySpace.
Ok, build crossroads at the bottom of a deep oceanic trench, bury ColdFusion as specified there along with MySpace, plate the bottom of the trench with Osmium before filling it with molten rock from the planet Mercury. You gotta take these menaces seriously.
Re:C/C++ is dying! (Score:, Interesting)
by CastrTroy (595695) on Thursday April 24, 2008 @10:22PM (#23193276)
Well, as a VB developer, you have to remember that when people talk about VB now, they are talking about VB.Net, which is exactly the same as C# with a different syntax. Comparing VB.Net to VBScript or even VB6 is like comparing Java with Javascript. VB gets a bad name because it used to be pretty bad, and there's a lot of non-programmers using it to do a lot of stuff they aren't qualified to do, and messing it up royally. But that doesn't mean VB.Net is a terrible language. I wouldn't fault PHP for all the insecure newbie websites created with PHP.
Re:C/C++ is dying! (Score:, Interesting)
by GeckoX (259575) on Friday April 25, 2008 @09:35AM (#23196946)
Agreed. VB.Net is not VB. Still a tad behind C# for language features, but barely. I worked in C# for the last 4 years at my last job, and dreaded having to use VB.Net at my new place of work. But now that I have been for a year, other than syntax, there's really zero difference between the two. The catch is to turn off the 'features' that let you write more VB6-ish bastardized code: make sure Option Strict and Option Explicit are on, and throw out the Microsoft.VisualBasic namespace, and you're good to go. One great benefit of it is that the 'perceived' challenge for a VB6 developer in switching to .Net is greatly removed when they can be introduced to VB.Net rather than C#. I've mentored people that would never have attempted anything in C# in moving to VB.Net with great success.
I still prefer the syntax of C#, but that's mostly just personal preference.
Re:C/C++ is dying! (Score:, Funny)
by The Master Control P (655590) <ejkeever@@@nerdshack...com> on Friday April 25, 2008 @12:30AM (#23194076)
I know that I'm not the only one who read that link as "Dalek Scientific."
Which would be the most goddamn *awesome* name for a scientific supply company ever.
Re:C/C++ is dying! (Score:, Funny)
by LabRat007 (765435) on Thursday April 24, 2008 @05:05PM (#23189438)
For those of you who can't open the page...
1. Java
2. C
4. PHP
5. C++
6. Perl
7. Python
8. C#
9. Ruby
10.Delphi
Please note, there is no language in the 3rd position this year. Seriously.
Re:C/C++ is dying! (Score:, Interesting)
by lgw (121541) on Thursday April 24, 2008 @05:30PM (#23189800)
Do they somewhere discriminate between VB and VB.Net? Claiming that VB is not even a programming language is... probably reasonable. VB.Net is just C# without curly braces, however.
Re:C/C++ is dying! (Score:, Funny)
by Bozdune (68800) on Thursday April 24, 2008 @06:57PM (#23191264)
VB not a programming language?
Yes, because:
1) Only noobs and losers use VB.
2) It's not object-oriented and I took an OO class and they said everything should be OO or it sucks, so VB sucks.
3) I have to create global variables sometimes and I was taught that I should never use global variables for anything because they're bad.
4) Only noobs and losers use VB.
5) VB lets me create classes but they don't work the way classes are supposed to so I hate them.
6) I don't like VB controls, they're ugly, I like to make little hexagonal corners and stuff and piss away weeks of development on cute little clickable thingies.
7) Only noobs and losers use VB.
8) I don't like VB because writing Windows applications should be really hard.
9) "Hello, world" only takes one line, and that can't be right, because I learned in Java class that it should take pages and pages of setup code and stuff.
10) Only noobs and losers use VB.
11) Some idiot can build a simple windows app in about 30 seconds and it's not fair, that same app took me two weeks in C++ class.
12) The dweebs in Accounting are building VB apps and they shouldn't be programming, they don't know what they're doing.
13) Only noobs and losers use VB.
14) I heard Ruby is where it's at, I only want Ruby jobs now because it's kewl.
15) I heard that VB is wicked slow, but then I found out it compiles and stuff which totally isn't fair.
16) Only noobs and losers use VB.
Re:C/C++ is dying! (Score:, Funny)
by smellotron (1039250) on Thursday April 24, 2008 @07:59PM (#23192074)
If they really are dying... I'd say only one thing:
FINALLY!
There is no *finally*! That's what destructors and the RAII idiom [hackcraft.net] are for, duh.
Always be there (Score:, Insightful)
by ohxten (1248800) on Thursday April 24, 2008 @04:35PM (#23188972)
C/C++ will always be there. Period. Just look at all of the C/C++ projects on SourceForge.
New languages will come and go, but C/C++ are just too stable to go so quickly.
Re:Always be there (Score:, Insightful)
by KlomDark (6370) on Thursday April 24, 2008 @04:38PM (#23189016)
Yep, it'll be right out there with all the Cobol projects on Sourceforge...
Re:Always be there (Score:, Funny)
by wkring (728574) on Thursday April 24, 2008 @05:13PM (#23189536)
Cobol projects go on the Share tape.
That's a broken way to think of it (Score:, Insightful)
by krog (25663) on Thursday April 24, 2008 @04:47PM (#23189158)
C and C++ are entrenched, but it was never their stability which caused it. Computer languages are theoretical; one valid language is just as 'stable' as another. The real issue of stability lies in the implementation, and that is language-independent.
Anyway, C is going to stick around because it is the most superb assembly language developed by man. C++ will of course stay around as well, but by modern standards it fails as a "high-level" language. The ceiling got a lot higher in the intervening 20 years; other languages reach much higher in a very useful way. I'd be happy to see less C++.
Re:That's a broken way to think of it (Score:, Insightful)
by SatanicPuppy (611928) <Satanicpuppy.gmail@com> on Thursday April 24, 2008 @05:07PM (#23189460)
I'm not sure C is up to the multithreading/ multiprocessor support that is going to be required as processors keep shifting from single core to multicore architectures...I know it can be done, but C is hard to program for a single core...Multicore support may take it over the edge.
Mind you, I don't think anything else is really set up for it either (Erlang?) but that's going to be the next big challenge.
Re:That's a broken way to think of it (Score:, Insightful)
by Sloppy (14984) on Thursday April 24, 2008 @05:20PM (#23189660)
Mind you, I don't think anything else is really set up for it either (Erlang?) but that's going to be the next big challenge.
Whatever it is, its compiler and low-level libraries will be written in C.
Re:That's a broken way to think of it (Score:, Insightful)
by Nursie (632944) on Thursday April 24, 2008 @06:39PM (#23191012)
Jesus christ there's another one....
C has been doing multi process for decades, and multi thread for a decade or more.
It's used in commercial apps all over the world.
How many times - threads and parallelism have been with us for years. Just because games haven't been threaded doesn't mean the rest of the world hasn't been doing it, and doing it well, for a long time.
Look up pthreads sometime.
Seriously, threaded processing in C is damn simple.
Mods on crack (Score:, Informative)
by Nursie (632944) on Thursday April 24, 2008 @07:05PM (#23191390)
Troll?
Annoyed, enflamed perhaps, but Troll?
Sorry but it's a pet hate of mine that here on slashdot, which is supposed to be a forward looking tech board, that people still regularly espouse the view that threaded programming is something either still in development, too complex for ordinary mortals, or only applicable in a few scientific arenas.
It's just thoroughly incorrect. Industry and open source have been doing threading for years. Please can we lose this myth.
And to bring the post back on topic - pthreads in C will do it all nicely. Hell, even MS VC++ 6.0 (almost 10 years old?) will compile your multithreaded Windows C app.
I'd also like to express surprise at the title of this article. C is losing popularity at the same position as last year, number 2? OK, it'll fizzle out any day now, I believe you.
I think my job's safe for now.
Re:That's a broken way to think of it (Score:, Informative)
by wmshub (25291) on Thursday April 24, 2008 @05:21PM (#23189684)
Duh.
I think the parent was implying that C often directly maps into assembly language, and he's right. As an embedded programmer, one of the benefits of C is that, other than register selection, I can often tell you exactly what assembly statements will be emitted by a chunk of C code. Often I do use C as a shorthand for assembly.
Nobody who knows the term "assembly language" will think that C is one. But it's a lot closer than you might think.
Re:That's a broken way to think of it (Score:, Insightful)
by lgw (121541) on Thursday April 24, 2008 @05:41PM (#23190022)
As someone who programmed in assembly for 5 years professionally, let me say: C is a crappy assembly language. It has a crappy macro language, and I'm often left guessing what the compiler will do with my C code, especially on an unfamiliar platform.
Is an int 32 or 64 bits? I had better compile a test program and fire up a debugger to find out. OK, since there's no C standard type for "32 bit int", what works on this compiler? Maybe INT32 is defined somewhere?
And don't get me started on implicit conversion.
Re:That's a broken way to think of it (Score:, Informative)
by Schraegstrichpunkt (931443) on Thursday April 24, 2008 @05:51PM (#23190218)
OK, since there's no C standard type for "32 bit int", what works on this compiler?
C99 fixed that: #include <stdint.h>, then use either uint32_t or int32_t.
Yes it is (Score:, Insightful)
by Weaselmancer (533834) on Thursday April 24, 2008 @05:26PM (#23189754)
It's just slightly higher level. A C compiler outputs assembly code - that's the whole point of a C compiler. Think of C as the world's greatest macro processor for assembly.
That's why most compilers have some sort of ASM pragma - so you can inject your assembly into the code if you feel the compiler is doing a poor job of it.
That's also why you'll never find a faster language. And that's why it'll never go away.
Re:Yes it is (Score:, Funny)
by Weaselmancer (533834) on Thursday April 24, 2008 @06:02PM (#23190428)
Best laugh of the day - thank you. =)
Hey, you've given me an idea though. You know what would be even faster? Now...don't stop me until you hear me out, okay?
If Java is faster than C, we should rewrite the Java VM...in Java! Interpreted code running in an interpreter...that is *also* interpreted!
Just think of the speed increase! It would be like using uranium to fuel the space shuttle! Awesome multiplied by awesome.
Re:Always be there (Score:, Insightful)
by fyngyrz (762201) on Thursday April 24, 2008 @04:47PM (#23189160)
C is perfectly capable of extremely high-quality memory management with significant ease-of-use. However, you get to create that facility, or of course you can utilize someone else's design if you can locate one that fits your API needs, budget and time frame.
For instance, years ago I faced this issue and wrote a module that ensures there are no leaks in any part of an application I write; I also get over-run and under-run detection, named segments, double-free attempt capture, memory usage reporting, and more. I have debug and end-user levels for the code so that during development, I get enormous detail, while the end user doesn't see that unless I specifically turn it on for them.
I have both pool and chunk level management; I have both pool and individual "free" levels; all of this in very few K indeed.
C is the perfect language to implement memory management in, in fact, because it has perfect fine-grained control over memory.
That goes for other things as well; C is highly capable if you need to build in certain types of OO; objects with built-in methods and variables can be crafted in seconds, with no waste at all; uniform list handling can be crafted (and is an interesting and useful programming exercise.)
C *could* go away as a result of a generation of programmers who really don't know how to deal with such things, but I think it would be a real loss if it happened. The up side is that it'll take a while. There's a whole generation of us who know C quite well, and we're nowhere near dead yet.
;-)
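For illustration, here is a minimal sketch of the kind of guarded allocator the poster describes: sentinel words around each block so that over-runs, under-runs, and double-free attempts can be caught at free time. All names, error codes, and layout choices here are invented for the example, not taken from the poster's actual module.

```cpp
// Minimal sketch of a guarded allocator: sentinel words around each block
// let the free routine detect over-runs, under-runs, and double frees.
#include <cassert>
#include <cstdlib>
#include <cstring>
#include <cstdint>

static const uint32_t GUARD = 0xDEADBEEFu;
static const uint32_t FREED = 0xFEEDFACEu;

struct Header { uint32_t magic; size_t size; };

void* guarded_alloc(size_t size) {
    // Layout: [Header][payload][trailing guard word]
    char* raw = (char*)std::malloc(sizeof(Header) + size + sizeof(uint32_t));
    Header* h = (Header*)raw;
    h->magic = GUARD;
    h->size = size;
    std::memcpy(raw + sizeof(Header) + size, &GUARD, sizeof GUARD);
    return raw + sizeof(Header);
}

// 0 = clean free, 1 = double-free attempt, 2 = header clobbered
// (under-run), 3 = trailing guard clobbered (over-run).
int guarded_free(void* p) {
    char* raw = (char*)p - sizeof(Header);
    Header* h = (Header*)raw;
    if (h->magic == FREED) return 1;
    if (h->magic != GUARD) return 2;
    uint32_t tail;
    std::memcpy(&tail, raw + sizeof(Header) + h->size, sizeof tail);
    if (tail != GUARD) return 3;
    // Quarantine instead of releasing, so a later double free is
    // detectable without touching reclaimed memory; a real module
    // would recycle quarantined blocks eventually.
    h->magic = FREED;
    return 0;
}
```

A debug build can route its allocations through wrappers like these; it is a shallow version of what tools like Valgrind or glibc's MALLOC_CHECK_ do in far more depth.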
Parent
Share
Re:Always be there
Score:
, Insightful)
by
afabbro
( 33948 )
writes:
on Thursday April 24, 2008 @05:08PM (
#23189470
Homepage
However, you get to create that facility
s/get to/must/
Seriously, most people want to sit down and write the logic for their application, not invent (or even copy-paste) memory management schemes.
Parent
Share
Re:Always be there
Score:
, Insightful)
by
fyngyrz
( 762201 )
writes:
on Thursday April 24, 2008 @05:44PM (
#23190084
Homepage
Journal
Seriously, most people want to sit down and write the logic for their application, not invent (or even copy-paste) memory management schemes.
Yes, I understand that perfectly. I'm a huge fan of Python for that very reason.
However, in C, writing memory management only needs to be done once; while writing the "logic for the[ir] application" is done many times. Consequently, the apparent load of writing memory management is much lighter than one might initially recognize. Or to put it another way, once it's done, it's done and represents no load at all.
Further, there are huge advantages to having 100% control over the memory management of your application; speed advantages, fewer wasted/tied-up resources, and all the downhill consequences of those things -- if you don't waste resources, they're available for the user, or for other aspects of your programs. Likewise, if you get things done faster, more CPU is available elsewhere.
Another thing: Depending on an external agency to manage your resources is a two-edged sword. If there are bugs in *your* code, you can fix them as fast as you are competent to do so. Considering you wrote it in the first place, the presumption that you are competent to fix it is usually on target.
If there are bugs in an external agency, you typically get to report them... and wait, bugs happily chewing on the users of your applications, until said external agency gets around to fixing whatever it was. If indeed they ever do.
Same thing goes for list management, etc. Write it once, learn all about it (which is interesting AND increases your Leet Skillz) and now you have a generally useful tool that is as fast as you can make it, totally amenable to fixes and updates, and invulnerable to the ass-draggery of outside influences. I have used my list management module in AI apps, ray tracers, image processing, file management, and even in dialogs to control layer types in various (what I think are) clever ways. I have huge confidence in it, but, should it turn out to be broken... I could fix it in minutes. At which point every app I've written gains ground, all my customers win, etc.
There's something else that has always remained in the back of my mind. As languages get more sophisticated, there is a trend for them to generate much larger and much slower resulting applications. It isn't uniform, and it depends on what you're doing, compilers as compared to interpreters, etc., but the trend itself is pretty clear. For instance, a Python app seems small, until you realize that the Python interpreter is loaded for your one-liner. C++ apps tend to be huge compared to C apps. And so on.
This trend - basically - tracks the increasing availability of memory and CPU power. Seems reasonable enough. But the funny thing is, if you take an app that was designed to run at adequate speed on hardware from, say, 1992, keep the technology behind the app the same if you update it - that is, keep writing efficient C and so on - then the increase in memory and CPU resources serve to turn the app into some kind of blistering miracle implementation instead of the run of the mill performance you get from depending on the latest and greatest HLL with garbage collection, the implicit inclusion of module after module of object-oriented processing and modeling, data hiding, etc., etc.
Directly related to this is the fact that if you attempt a modern task - such as an image manipulation system - in a modern language, you, as the programmer, can be significantly enabled by the language; that is, you can be done sooner, and you can have a lot of things done, too, many coming along for the ride, for "free." Garbage collection / memory management being one such thing. But if you approach the task using C, which is basically capable of creating as fast an application as you are capable of writing, it is so close to assembly, while we can certainly agree up front it'll take you longer, the end result could be a lot faster and a lot more capable of efficiently managing the user's resources than that which you might create using a modern HLL.
Parent
Share
Re:Always be there
Score:
, Insightful)
by
jsebrech
( 525647 )
writes:
on Thursday April 24, 2008 @06:04PM (
#23190460
But if you approach the task using C, which is basically capable of creating as fast an application as you are capable of writing, it is so close to assembly, while we can certainly agree up front it'll take you longer, the end result could be a lot faster and a lot more capable of efficiently managing the user's resources than that which you might create using a modern HLL.
Agreed 100 percent. If you write it in C, you can make it run faster with lower resources, but you will spend a lot more time creating, debugging and maintaining it.
Most software simply doesn't need to be that fast. The performance-sensitive pieces of code are in database queries (C code), or disk operations (C code), or math operations (C code). Modern garbage collectors are also proven: they're fast and they're reliable. It doesn't make sense for the majority of classes of software, from a cost vs. gain perspective, to use C for the job.
Parent
Share
Re:Always be there
Score:
, Interesting)
by
fyngyrz
( 762201 )
writes:
on Thursday April 24, 2008 @06:34PM (
#23190936
Homepage
Journal
...but you will spend a lot more time creating, debugging and maintaining it.
Hmm. Creating, probably so. You're writing smaller steps on a per-keystroke basis, so it's pretty much a given.
Debugging and maintaining, however, are issues more predicated upon design skills than the language used. From things entirely outside the code's executing domain (like comments and other documentation) to things inside (structures and algorithms), correctness (on which debugging depends), reliability (on which maintenance depends) and completeness / applicability (on which maintenance also depends), all these things are independent of the language, except in very minor and essentially irrelevant ways.
I would argue that coding in an HLL does not improve these latter things. However, coding in C brings you extremely close to both the problem(s), and the solution(s) you decide to implement without taking you that last troublesome step down into assembler, where you lose platform independence. I think that is a uniformly positive set of consequences to enjoy as a result of spending that extra time.
It doesn't make sense for the majority of classes of software, from a cost vs. gain perspective, to use C for the job.
Well, we'll have to agree to disagree here. Wasting resources can have unpredictably large effects, such as pushing a system over the edge between running in memory and beginning to swap. The more you waste, the more likely you are to cause such problems.
The fact is, running the user out of resources for no reason other than saving small amounts of my time up front is outside the bounds I am willing to go. The gains at the user's end, especially when multiplied by many users across many invocations, are likely to be substantial. Consequently, the investment on my end is almost certain to be small by comparison, even if it is actually many of my hours.
As a user, I run into this all the time. If I start a certain application, it typically takes quite some time to start. It's the "industry standard", but frankly, it runs like a pig in hip deep dung on *every* startup. And it eats memory like crazy, even the executable is 4x larger than other apps that do the same thing, but which -- notably -- aren't the "industry standard." So I make the choice, as a user, to use the other apps for all tasks that are achievable either way (and as it turns out, I *very* rarely have to start the industry standard program.) I want my memory to be used for data, not for a bloated application; and I want my time used in working on that data, not waiting to count and register every plugin or aux feature in the system every time the application starts.
The problem is that from the programmer's perspective, "time and effort" are not even slightly the same as they are from the user's perspective. For my part, I consider it an ethical "must-do" to consider the user's perspective as the primary one driving the design. Both from the viewpoint that their resources are not "mine to waste" just because they have extended me the courtesy of allowing my software to run in their machine, but also from the viewpoint that any supposedly "extra" time I spend, I spend once; any time I cost the users unnecessarily, I extract that cost from *every user*, and *every time* the software is run.
Parent
Share
Re:Always be there
Score:
, Insightful)
by
lena_10326
( 1100441 )
writes:
on Thursday April 24, 2008 @07:56PM (
#23192050
Homepage
However, in C, writing memory management only needs to be done once; while writing the "logic for the[ir] application" is done many times. Consequently, the apparent load of writing memory management is much lighter than one might initially recognize. Or to put it another way, once it's done, it's done and represents no load at all.
I don't believe that is true at all. One huge reason for building a memory management scheme is to tailor it to a specific algorithm, which is bound to a particular application. Optimization for allocating small chunks (bytes to kilobytes) can be very different compared to allocating extremely large chunks (megabytes to gigabytes), or variable sized versus fixed size, or read/writes with sequential access versus random access, or low access frequency versus high access frequency, or multi-threaded versus serial. These are all intricately bound to the overall application algorithm and can yield extremely different solutions given a particular problem. It's simply not possible to write a general allocation scheme that is fully optimized for every type of problem. I've experienced this in real world projects.
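As a concrete (and purely illustrative) example of that tailoring, here is the classic fixed-size free-list pool: constant-time alloc and free for exactly one block size, single-threaded, and a poor fit for variable-sized or concurrent workloads. The class and its names are made up for this sketch.

```cpp
// A fixed-size free-list pool: an allocator tailored to one access
// pattern (many same-sized nodes, single-threaded).
#include <cassert>
#include <cstddef>
#include <vector>

class FixedPool {
public:
    // block_size must be at least sizeof(void*) to hold the intrusive link.
    FixedPool(std::size_t block_size, std::size_t count)
        : storage_(block_size * count), free_list_(nullptr) {
        // Thread every block onto the free list at construction time.
        for (std::size_t i = 0; i < count; ++i) {
            void* p = &storage_[i * block_size];
            *static_cast<void**>(p) = free_list_;
            free_list_ = p;
        }
    }
    void* alloc() {                  // O(1): pop the list head
        if (!free_list_) return nullptr;
        void* p = free_list_;
        free_list_ = *static_cast<void**>(p);
        return p;
    }
    void release(void* p) {          // O(1): push back onto the list
        *static_cast<void**>(p) = free_list_;
        free_list_ = p;
    }
private:
    std::vector<char> storage_;      // one contiguous slab, freed with the pool
    void* free_list_;
};
```

A general-purpose malloc has to handle every pattern at once; a pool like this wins precisely because it refuses to, which is the poster's point about allocators being bound to the application's algorithm.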
Another thing: Depending on an external agency to manage your resources is a two-edged sword. If there are bugs in *your* code, you can fix them as fast as you are competent to do so. Considering you wrote it in the first place, the presumption that you are competent to fix it is usually on target.
It's rare that the original developer stays on the project for its lifetime of usage. In fact, I've never seen that happen. People quit, get fired, get promoted, or move onto new projects. When the sole hot-shot in the organization writes a complex codebase, it places a future burden on the lesser experienced team that may inherit it. Maintenance is always more expensive than original development.
If there are bugs in *your* code, you can fix them as fast as you are competent to do so. Considering you wrote it in the first place, the presumption that you are competent to fix it is usually on target... [CUT]... I have huge confidence in it, but, should it turn out to be broken... I could fix it in minutes
I don't believe that for a second. I've seen sneaky bugs in C code plague development teams for days and on a few occasions a week. You're either vastly underestimating or are totally unaware of very well hidden bugs in your code.
But the funny thing is, if you take an app that was designed to run at adequate speed on hardware from, say, 1992, keep the technology behind the app the same if you update it - that is, keep writing efficient C and so on - then the increase in memory and CPU resources serve to turn the app into some kind of blistering miracle implementation instead of the run of the mill performance you get from depending on the latest and greatest HLL with garbage collection
99+% of the time with general problems, I/O is the bottleneck. For those cases, a C application might run 1% faster on newer hardware, given equivalent I/O hardware (same model/make of drive or network). In the *vast* majority of cases, the effort is simply not worth it. It's far more expensive to pay your salary to build and maintain that codebase than it is to simply purchase a beefier machine. The former is a repeating expense, the latter is a one-time expense. Business managers love the latter, not the former.
I do agree that if your domain consists of highly CPU-bound computational algorithms that don't require frequent HD or network access, then your approach will scale well with the faster hardware; however, I don't think advocating it as a baseline approach for all or most projects makes any sense. It is far more work and causes more maintenance headaches than you're describing.
Parent
Share
Re:Always be there
Score:
, Interesting)
by
Carewolf
( 581105 )
writes:
on Friday April 25, 2008 @08:44AM (
#23196486
Homepage
C/C++ might give you 1% CPU speed-up, but by fine-tuning the memory allocations, the block allocations on the disk and the way you communicate with the I/O devices it can give you a speed-up on I/O operations that is not available in any of the modern toy languages.
Parent
Share
Re:Always be there
Score:
, Insightful)
by
SanityInAnarchy
( 655584 )
writes:
ninja@slaphack.com
on Thursday April 24, 2008 @04:54PM (
#23189278
Journal
Assembly will always be there. Period.
That doesn't mean it will be particularly popular, or that you're likely to get a job doing nothing but assembly programming.
Really, with C especially, just about every advantage it has over more modern languages is an advantage that assembly has over C itself. Assembly is still needed, but no one in their right mind would, say, write an entire OS in assembly.
The day is coming when no one in their right mind will write an entire OS in C or C++, or even an entire OS kernel -- depending on your definition of "kernel".
Parent
Share
Re:Always be there
Score:
, Insightful)
by
Chris Burke
( 6130 )
writes:
on Thursday April 24, 2008 @05:56PM (
#23190314
Homepage
That makes C++ a lot better for application writing, but not necessarily for OS writing. The kinds of resources being managed in a kernel usually aren't the kind that are easily managed through "scope".
One criticism of C++ is that by automatically handling the destruction of objects when they go out of scope, it can lead to a false sense of security in programmers who assume that because their objects are destroyed, all resources are properly freed. The possibility for leaks is very much still there, in the design of constructors and destructors and anything that uses a pointer. Still, while not always easy, making no mistakes in any of your destructors is a heck of a lot easier than never making a mistake on any individual object's deallocation, as in C.
By the same token, it's quite possible to have "leaks" in Java or C#, simply by having extraneous references to no-longer needed objects laying around in objects that are themselves still referenced.
I'd still take C++ any day over C for a big application.
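The "leak by lingering reference" described above can be reproduced in C++ with shared_ptr, which makes the mechanism easy to watch. This is a hypothetical example; the custom deleter is only there to record when the object actually dies.

```cpp
// A stray strong reference in a long-lived container keeps an object
// alive after its "owning" scope ends: the same shape as the Java/C#
// leak described in the comment above.
#include <cassert>
#include <memory>
#include <vector>

bool demonstrates_lingering_reference() {
    std::vector<std::shared_ptr<int>> cache;   // long-lived container
    bool destroyed = false;
    {
        // The deleter flips a flag so we can see when destruction happens.
        std::shared_ptr<int> s(new int(42), [&destroyed](int* p) {
            destroyed = true;
            delete p;
        });
        cache.push_back(s);                    // extraneous reference left behind
    }                                          // s leaves scope here...
    bool survived_scope = !destroyed;          // ...but the object is still alive
    cache.clear();                             // drop the stray reference
    return survived_scope && destroyed;        // only now is it freed
}
```

The garbage collector (or here, the reference count) is doing its job perfectly; the "leak" is the forgotten entry in the cache, which no collector can distinguish from a reference you still want.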
Parent
Share
The lower levels will always be there
Score:
, Insightful)
by
CustomDesigned
( 250089 )
writes:
I do most of my work in Python and Java now. However, I often need to write in C/C++ to create JNI modules for Java or extension modules for Python. Wrapping low level (use 3rd party library) and performance intensive stuff for control via a higher level language is very productive. (C++ is handy for JNI, C is better for Python.) Furthermore, I even occasionally write small functions in assembler for C - usually to utilize a specialized instruction.
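The wrapping pattern in a nutshell: export a plain C ABI entry point that Python's ctypes, a JNI stub, or extension-module glue can bind to. The function below is made up for illustration.

```cpp
// A plain-ABI entry point of the kind higher-level languages load from
// a shared library. extern "C" disables C++ name mangling so the symbol
// is easy to resolve from ctypes, JNI, etc.
extern "C" int checksum(const unsigned char* data, int len) {
    int sum = 0;
    for (int i = 0; i < len; ++i)
        sum = (sum + data[i]) % 255;   // toy rolling checksum
    return sum;
}
```

Built with something like `g++ -shared -fPIC -o libchecksum.so checksum.cpp`, this can be called from Python via ctypes.CDLL with no wrapper generator at all, which is roughly the productivity split the poster describes: the hot or low-level code in C/C++, the orchestration in the high-level language.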
Re:Always be there
Score:
, Funny)
by
rishistar
( 662278 )
writes:
on Thursday April 24, 2008 @05:08PM (
#23189468
Homepage
C/C++ will always be there. Semi-Colon.
There fixed that for you.
Parent
Share
Re:not so..
Score:
, Insightful)
by
ArcherB
( 796902 )
writes:
on Thursday April 24, 2008 @04:51PM (
#23189246
Journal
when we have internet that is as fast as cpu response times c and c++ will go the way of the dinosaur and the internet will be your main application platform and gaming platform, meaning game over for c and c++.
As long as computers need an OS, C/C++ will be in wide use. All major OS's are written in C/C++ and will be for the foreseeable future.
Parent
Share
Re:not so..
Score:
, Insightful)
by
gangien
( 151940 )
writes:
on Thursday April 24, 2008 @05:25PM (
#23189730
Homepage
I wouldn't be *so* [microsoft.com] *sure* [jnode.org].
Parent
Share
Re:
Score:
, Interesting)
by
peragrin
( 659227 )
writes:
I hate to say so, but MSFT's Singularity and now others (including Open Source versions) are doing a core OS in C# and .NET. It is actually something innovative from MSFT.
It will take time, but it's well within the foreseeable future.
Re:not so..
Score:
, Funny)
by
RiotingPacifist
( 1228016 )
writes:
on Thursday April 24, 2008 @05:45PM (
#23190110
The day the Linux kernel is coded in anything other than C is the day after I install Duke Nukem Forever on Hurd.
Parent
Share
Re:not so..
Score:
, Informative)
by
andersbergh
( 884714 )
writes:
on Thursday April 24, 2008 @06:32PM (
#23190904
Well, the kernel definitely isn't written in Objective-C, here are the languages they use:
C for the kernel
Embedded C++ for the drivers (IO Kit)
But many of the applications that make up OS X however are written in Objective-C.
Parent
Share
Re:Objective C
Score:
, Informative)
by
jeremyp
( 130771 )
writes:
on Thursday April 24, 2008 @07:09PM (
#23191454
Homepage
Journal
The Mac OS X kernel is entirely written in C except for the bits that have to be written in assembler.
The preferred run time for graphical applications is Objective C but I'm willing to bet that the low level graphics are done in C.
And Objective C is the bastard son of C and Smalltalk (but it's still my favourite programming language). It's probably equally closely related to Java and C++.
Parent
Share
Speed of light limits Internet speeds
Score:
, Insightful)
by
tepples
( 727027 )
writes:
{tepples} {at} {gmail.com}
on Thursday April 24, 2008 @05:40PM (
#23190000
Homepage
Journal
when we have internet that is as fast as cpu response times
Unlikely. Even hard drives are faster than routing a ping from London to Tokyo and back.
Parent
Share
Re:not so..
Score:
, Funny)
by
Anonymous Coward
writes:
on Thursday April 24, 2008 @04:53PM (
#23189272
Coming from someone who can't handle the concept of a contraction, it doesn't carry the weight you think it does.
Parent
Share
Visual Basic at #3?
Score:
, Funny)
by
eldavojohn
( 898314 )
writes:
eldavojohn.gmail@com
on Thursday April 24, 2008 @04:35PM (
#23188976
Journal
I can handle C and C++ losing ground.
But did anyone else find Visual Basic rising two spots to #3 past PHP & C++ to be a sure sign of the apocalypse?
(Visual) Basic 11.699% +3.42% A
Could someone reassure me that's a mistake before I go home to sit down with a bottle of Jack Daniels and a revolver with a single bullet in it?
Share
Re:
Score:
, Informative)
by
kitgerrits
( 1034262 )
writes:
This might have something to do with this PowerShell thing: controlling the O/S through the use of VB scripts.
It's not exactly the Bourne Shell, but it does show promise.
As Windows admins look at scripting the boring stuff, they will need to learn VB...
Re:
Score:
, Insightful)
by
kitgerrits
( 1034262 )
writes:
Shows what I know about PowerShell...
Being a UN*X admin and a reasonably-competent scripter, I tried looking into it and my brain had trouble grasping how this is supposed to be a shell.
From what I can see, Bourne and other UN*X shells are stream-oriented and PS seems object-oriented.
I see LDAP as a flat text-based database with Organizational Units, not as a magical forest with trees, domains and groups.
This is, most likely, because UN*X admins are used to modifying and/or generating configuration files,
Re:Visual Basic at #3?
Score:
, Funny)
by
Hoplite3
( 671379 )
writes:
on Thursday April 24, 2008 @04:45PM (
#23189122
Not a mistake. But if I could make a suggestion, it would be to upgrade your bourbon to Booker's. You won't need that money later.
Parent
Share
Re:Visual Basic at #3?
Score:
, Insightful)
by
thermian
( 1267986 )
writes:
on Thursday April 24, 2008 @04:50PM (
#23189228
I've been C coding for years, and I have to say, even though I like it, the number of things that I can do more easily with, say, Python, is getting larger.
I suspect that soon all I will use C for is writing shared libraries that I can call from some other language.
I wish people would stop banging on about C's memory problems. C has *no* memory management problem. It has no memory management at all, um, I mean, you just have to be careful when writing your code.
C is fast, seriously fast even. For that reason alone it will always have a place. I shouldn't think there will be many coders who only use C left soon though, because the job market for pure C programmers is pretty small these days.
Parent
Share
Re:
Score:
, Funny)
by
Chris Burke
( 6130 )
writes:
Single, not silver.
Gah, damnit!
I guess now you know why I'm such a terrible werewolf hunter that I have to try to do it by tricking them into committing suicide over the internet.
Re:Visual Basic at #3?
Score:
, Informative)
by
SatanicPuppy
( 611928 )
writes:
Satanicpuppy.gmail@com
on Thursday April 24, 2008 @05:18PM (
#23189606
Journal
The methodology page is here [tiobe.com].
I don't know. A lot of it depends on what applications businesses are using; a few big companies pushing large Delphi projects could make a big difference.
I think Javascript is also hampered by the fact that there aren't all that many different apps, and that a lot of people *do* view it as a semi-essential skill, so it gets less play. You don't see HTML up there anywhere.
Parent
Share
Managed code is the way to go
Score:
, Interesting)
by
KlomDark
( 6370 )
writes:
on Thursday April 24, 2008 @04:36PM (
#23188984
Homepage
Journal
I haven't written a line of code in C or C++ since I started with C# - C/C++ syntax with no tracking of memory (I detest tracking memory!!) except in the more obscure situations. Both .NET and Mono allow for C#, so you're not stuck on one platform.
Share
Re:
Score:
, Interesting)
by
QuantumG
( 50515 )
writes:
Lately I've found the biggest advantage of using C# over C++ is compile time. If I change a header file in C++, that's it, I'm off to make coffee, but with C# you can change just about anything and the code is recompiled in seconds.
Now if only the native code generation for C# wasn't so pitiful and unsupported.
Re:
Score:
, Insightful)
by
Yokaze
( 70883 )
writes:
That is hardly a conceptual problem of the language C++, but more one of the toolchain and/or ABI, and can be improved on by rewriting the old GNU linker in C++ :) [sourceware.org]. And maybe someday the GNU binutils will gain incremental linking.
More critical is that the grammar of C++ is undecidable [yosefk.com].
Re:
Score:
, Funny)
by
Chris Burke
( 6130 )
writes:
That was a rhetorical question, by the way.
Of course, because we both know the answer is emacs.
Re:Managed code is the way to go
Score:
, Insightful)
by
pclminion
( 145572 )
writes:
on Thursday April 24, 2008 @04:53PM (
#23189274
I'm not sure why you feel you need to "track memory" in C++. I did an analysis of all the code I've written a year or so ago, and I found that there is approximately one usage of a pointer in every 5700 lines of code (the way I write it, at least).
We have this great stuff called containers and RAII. And for when you absolutely must, must use a pointer, you have boost::scoped_ptr and boost::shared_ptr. I have not coded a memory leak or buffer overrun in C++ in over six years.
The best way to not leak memory is to never allocate it in the first place. The best way to avoid overflowing raw buffers is to not use raw buffers. Use the containers. When you think you can't, think harder.
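A small sketch of what that style looks like in practice. std::unique_ptr stands in here for boost::scoped_ptr as its standardized descendant; the code itself is illustrative, not from the poster.

```cpp
// Containers instead of raw buffers, RAII ownership instead of manual
// delete: no explicit memory tracking anywhere in the function.
#include <cassert>
#include <memory>
#include <string>
#include <vector>

struct Parser { std::size_t bytes_seen = 0; };

std::size_t parse_all(const std::vector<std::string>& lines) {
    std::vector<char> buffer;                  // container, not a malloc'd array
    auto parser = std::make_unique<Parser>();  // sole owner, no naked new/delete
    for (const std::string& line : lines) {
        buffer.assign(line.begin(), line.end());   // grows as needed, no overrun
        parser->bytes_seen += buffer.size();
    }
    return parser->bytes_seen;                 // parser and buffer freed here,
}                                              // on every exit path, exceptions included
```

Nothing in this function can leak or overrun a buffer even if an allocation throws mid-loop, which is the substance of the "when you think you can't, think harder" advice.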
Parent
Share
Re:
Score:
, Informative)
by
sconeu
( 64226 )
writes:
Not to mention that scoped_ptr and shared_ptr are in the next iteration of the Standard (well, shared_ptr for sure, can't remember about scoped_ptr).
Re:Managed code is the way to go
Score:
, Insightful)
by
Kevin Stevens
( 227724 )
writes:
kevstev@gmailDEBIAN.com minus distro
on Thursday April 24, 2008 @06:13PM (
#23190604
If you use the facilities provided by the STL and BOOST (most notably shared_ptr), C++ is not a whole lot different than Java these days. Java went a little too far in my opinion on being nice to the programmers while giving up performance. Modern C++ hits the sweet spot in my opinion.
If only the standards committee could get off its arse and progress as quickly as BOOST does....
Parent
Share
Re:Managed code is the way to go
Score:
, Insightful)
by
PhrostyMcByte
( 589271 )
writes:
phrosty@gmail.com
on Thursday April 24, 2008 @05:11PM (
#23189508
Homepage
Garbage collection is surely a factor in them losing ground, but I think the main reason is simple: library support.
Java and .NET have huge well-designed frameworks behind them. You can get things done really fast. What does C have? A bunch of separate libraries all with different conventions. C++ is a little better with a more useful standard library and Boost, but it still doesn't have anywhere near the infrastructure Java and .NET have.
Parent
Share
Hammers and screwdrivers
Score:
, Insightful)
by
Weaselmancer
( 533834 )
writes:
on Thursday April 24, 2008 @05:35PM (
#23189898
I haven't written a line of code in C or C++ since I started with C#
That says nothing about those languages. All that says anything about is your job.
I write drivers, so I could make the opposite statement. Doesn't say anything about the relative merits of one language versus another though. All it says is that I'm in an environment where C makes more sense.
In summary: A hammer is best when your problem is a nail, and a screwdriver is best when your problem is a screw.
Parent
Share
Re:Hammers and screwdrivers
Score:
, Funny)
by
Chris Burke
( 6130 )
writes:
on Thursday April 24, 2008 @06:01PM (
#23190408
Homepage
In summary: A hammer is best when your problem is a nail, and a screwdriver is best when your problem is a screw.
I also find screwdrivers to be a very good solution when my problem is sobriety, and maybe Vitamin C deficiency.
Parent
Share
Dying...not hardly
Score:
, Insightful)
by
PalmKiller
( 174161 )
writes:
on Thursday April 24, 2008 @04:39PM (
#23189026
Homepage
I know I am gonna get flamed for this, but they said web programming, like it's the only game out there. Sure it's not Web 2.0 friendly, and sure most web script kiddies don't use it... mainly because it doesn't hold their hand, but it's far from dead when you need to squeeze every last ounce of power out of your hardware, or even that other 25-30% of it.
Share
What about desktop presence?
Score:
, Insightful)
by
Noodles
( 39504 )
writes:
on Thursday April 24, 2008 @04:41PM (
#23189048
I develop desktop application software. Right now I wouldn't think about using anything else but C++.
Share
Wow
Score:
, Funny)
by
ucblockhead
( 63650 )
writes:
on Thursday April 24, 2008 @04:43PM (
#23189080
Homepage
Journal
Down 0.77% in a year? Alert the presses!
Almost as bad as Jeff Atwood and Joel Spolsky calling them "dead languages" on their new podcast.
Share
For performance-critical code there is no choice
Score:
, Insightful)
by
SilentTristero
( 99253 )
writes:
on Thursday April 24, 2008 @04:46PM (
#23189136
For image processing (film/video), real-time audio or any serious signal processing, the overhead of anything but C/C++ is a killer. It'll be news when Adobe After Effects or Autodesk Flame is rewritten in Python.
Besides, measuring the popularity of a language by the size of its web presence is the worst kind of fallacious reasoning.
Share
Re:For performance-critical code there is no choic
Score:
, Insightful)
by
jameson
( 54982 )
writes:
on Thursday April 24, 2008 @05:18PM (
#23189610
Homepage
Hi,
Yes, some things need to be done in assembly or C in order to `stay competitive' or even just to remain within the realm of the possible. How much that is depends on your application and your platform.
So, systems programmers, you need not worry, your skills are always going to be needed for something.
But let's be honest here, 80% of the applications you can code entirely in Haskell or Prolog or Python or whatever fancy high-level language you may personally have come to love. And of the remaining 20%, you can usually still code 80% of the application in your favourite language and optimise the core 20% in C. (After profiling. Let me repeat that, AFTER profiling.)
Will it run faster and in less memory if you do it all in C? If you do it properly, sure. But that's not the question to ask. If you work commercially, ask for `what will be most profitable in the long run, while remaining ethical'. If you work free software projects, ask for `what will benefit people the most'.
Don't confuse the above questions with `what will satisfy my C(++) hacker ego the most'. And remember that it's not just about getting the code working and making it fast, it's about making the code robust; and in many cases it's also about making the code readable for whoever will maintain it after you.
Apologies for this rant; feel free to mod it down if you so desire, but you, dear fellow programmers, have had it coming for quite a while, as did I.
Parent
Share
C and C++ might die at different rates.
Score:
, Insightful)
by
jythie
( 914043 )
writes:
on Thursday April 24, 2008 @04:48PM (
#23189170
I could actually see C++ slowly going away over the next decade as it is replaced by other languages that fill the same needs but better.
C on the other hand is probably going to be around for a long, long time.
Share
Statistics
Score:
, Insightful)
by
Anonymous Coward
writes:
on Thursday April 24, 2008 @04:48PM (
#23189172
Measuring by internet web pages mentioning it? Can you say, "worthless statistic," kids? I write code that controls hardware. You bet it's C++. I write code that's IN the hardware. An interpreted language? Are you out of your damn mind? Do I *blog* about it? Don't be absurd. Am I generating "web presence" for it? Only on slashdot. Go away, useless statistic.
Share
Different markets - different requirements
Score:
, Interesting)
by
ThePhilips
( 752041 )
writes:
on Thursday April 24, 2008 @04:55PM (
#23189294
Homepage
Journal
What I love about such studies is that they can confirm any theory you want.
Truth remains that every particular market has requirements which dictate selection of languages.
I doubt that the telecom industry (as it is right now) would ever get over C or C++. The same goes for kernels and system libraries: hard to imagine them in anything but C.
If you look at the rise of the Web - and the pleiades of languages supporting it - then both C/C++ are out of the question, of course. Though again, I can hardly imagine Apache or MySQL or PHP being written in anything else but C or C++.
The market for system and telecom programming is definitely shrinking - and consequently so are their languages. Other markets are now blooming - and their languages are becoming more popular.
My point is that the languages are complementary - they are not competing. After all, you have to write the hardware, firmware and OS first. Only then does your beloved automated garbage collection have a chance to kick in.
Share
Re:Different markets - different requirements
Score:
, Informative)
by
krog
( 25663 )
writes:
on Thursday April 24, 2008 @05:04PM (
#23189434
Homepage
Just a note -- Ericsson developed the
Erlang
[erlang.org] language with telecom-style reliability in mind, and using it they have brought to market products like ATM switches with 99.9999999% uptime (that's nine nines, under 40ms of downtime per year). Telecom isn't just C's domain anymore.
Parent
Share
There are two kinds of coders...
Score:
, Insightful)
by
Froze
( 398171 )
writes:
on Thursday April 24, 2008 @04:55PM (
#23189298
those who can code in binary and those who can't.
OK, kidding aside.
There are those who write code so that a person can do something on a computer. In that case the users are comparatively slow, and high-level languages give you a distinct advantage in development.
Then there are those who write code to make the computer do something, in which case low-level languages let you optimize more effectively how the computer interacts with itself. This is where languages like C and C++ really come into their own.
In the early days of computing it was all about the latter; now it's much more about the former, but the latter will never go away. So the decrease is reasonable and IMHO does not represent a failing of the languages, just a shift in the way computers are being used. I will be very surprised if high-level languages ever gain widespread acceptance in areas that require computational efficiency, à la computational physics, protein folding, etc.
Fortran!
Score:
, Informative)
by
frogzilla
( 1229188 )
writes:
on Thursday April 24, 2008 @04:57PM (
#23189330
Fortran has been "dead" for ages, but we still use it every day on a variety of architectures. I know we're not the only ones; many scientists still use it.
Re:
Score:
, Funny)
by
Digi-John
( 692918 )
writes:
FORTRAN is fast as hell and lots of scientists already know it, so yeah, it's still got a lot of use over here in scientific computing. Software packages like LINPACK have been tweaked for decades to get really high performance. The thing is, people in scientific computing are less likely to sit around blogging and posting on
/. (I'm an exception, it seems), so their languages (FORTRAN and C, maybe some C++) don't get as much coverage as stuff like Ruby on Rails, where you get 5 million postings on Digg every
Absolutely
Score:
, Funny)
by
DoofusOfDeath
( 636671 )
writes:
on Thursday April 24, 2008 @05:10PM (
#23189492
Are C and C++ Losing Ground?
Yes, but on the bright side, they lose ground about 1.5x faster than Java in most applications.
We've replaced C/C++ with Python wherever possible
Score:
, Interesting)
by
Sarusa
( 104047 )
writes:
on Thursday April 24, 2008 @05:14PM (
#23189556
We have certainly replaced C/C++ with Python wherever we can, which covers about 90% of our software. Except where C is absolutely needed (mostly in our kernel/device-driver work), the 10x faster Python development and far easier code maintenance just outweigh everything else. That Python is much less prone to crashing for anything beyond tiny one-off programs is another big positive (yes, yes, if you write perfect C/C++ and don't use glib you'll never crash either, but in practice that never happens).
In practice the speed difference hasn't mattered for almost any application we've run into. We have a high-speed network load tester in Python, which sounds ridiculous, but it works, and it makes it insanely easier to add new tests or behaviors. If we ever hit a bottleneck, we just write a small C extension module and call it from the Python.
I'm saying Python here, but insert your higher level language of choice.
bandwagonism
Score:
, Insightful)
by
epine
( 68316 )
writes:
on Thursday April 24, 2008 @05:16PM (
#23189584
I wouldn't say C or C++ is losing ground. They both continue to serve well in the niches they established.
Meanwhile, other segments of the pie are expanding, and few of these new applications are coded in C or C++. Does that mean C and C++ are losing ground?
There is no language out there that serves as a better C than C, or a better C++ than C++. The people who carp about C++ reject the C++ agenda, which is not to achieve supreme enlightenment, but to cope with any ugly reality you throw at it, across any scale of implementation.
For those who wish to gaze toward enlightenment, there is always Java. Enlightenment is on the other side of a thick, protective window, but my isn't the view pretty? I've yet to encounter an "enlightened" language that offers a doorway rather than a window seat. I would be first in line if the hatch ever opened.
The problem with C/C++ has long been that the number of programming projects far exceeds the number of people who have the particular aptitudes that C/C++ demand: those of us who don't need (or wish) to be protected from ourselves (or the guy programming next to us).
It's not economically practical to force programmers who don't have that temperament to begin with to fight a losing battle with self-imposed coding invariants. I'm glad these people have other language choices where they can thrive within the scope of their particular gifts. I don't feel my role is diminished by their successes.
For those of us who have gone to the trouble to cultivate hardcore program correctness skills, none of the supposed problems in the design of C or C++ are progress limiting factors, not within the zone of applications that demand a hardcore attitude toward program correctness.
It's the natural order of things that hardcore niches are soon vacated by those unsuited to thrive there, leaving behind a critical core of people who specialize in deep-down nastiness.
For example, it's not just anyone who maintains a root DNS server. I can say with some assurance that the person who does so did not earn his (or her) grey hairs by worrying about whether the implementation language supported automatic GC.
Let's take a metaphor from the security sector. Ten years ago, a perimeter firewall was considered a good security measure. This measure alone eliminated 99% of TCP/IP based remote exploits.
These days, most exploits are tunneled through http, or maybe I'm behind the times, and the true threat is now regarded to be some protocol tunneled within http.
Then some genius comes along and says "in the security sector, TCP/IP defenses are losing ground". Quoi? Actually, no one is out there dismantling their TCP/IP based perimeter firewall. It's continuing to do the same essential job as ever.
It's only the bandwagon that has picked up and moved camp. Yes, garbage collection and deep packet inspection are now all the rage. So it goes.
Why not go around saying that sexual reproduction is all the rage these days? Would that imply we could eliminate all the organisms that reproduce asexually, and the earth's ecology would continue to function? Hardly.
These new languages are soaking up much of the new code burden because they are freed from having to cope with the nastiness at the extremes (small and large) that C/C++ have already taken care of.
I would almost say that defines a success criterion for a programming language: it removes enough nastiness from the application space that the next language to come along is free to function on a higher plane of convenience. C/C++ have both earned their stripes. Which of these new languages will achieve as much?
So what?
Score:
, Insightful)
by
menace3society
( 768451 )
writes:
on Thursday April 24, 2008 @05:21PM (
#23189680
FORTRAN, Lisp, and Cobol have all lost ground. BASIC and Pascal used to be the big dogs instead of also-rans, and if Ada ever had any ground in the first place, it lost that.
Even Perl isn't as popular as it used to be, now that other languages have started to fill its niche.
Times change, and it should be unsurprising that the dominant programming languages change along with them. Some day Java, PHP, Visual Basic, Python, and Ruby will all be obsolescent as well. Thirty years ago, computers were vastly different from what they are now. In another thirty years there will have been another quantum leap (pun intended) in computing. Why should the languages we program them with remain the same?
C++ - as garbage collected as you wanna be
Score:
, Interesting)
by
SpinyNorman
( 33776 )
writes:
on Thursday April 24, 2008 @05:32PM (
#23189830
There's nothing stopping you from using reference-counted smart pointers and garbage collection exclusively in C++, for some or all of a project, if that's really your thing.
For me, C++ destructors (each object responsible for its own storage) remove most of the hassle of freeing storage, and I've never hankered after garbage collection.
Anecdotal experience
Score:
, Interesting)
by
raw-sewage
( 679226 )
writes:
on Thursday April 24, 2008 @05:38PM (
#23189966
I've been in the "real world" for about six years now, after graduating with a computer science degree. I'm currently in Chicago, Illinois, USA. I've spent the past several months looking for a good software engineering job, both in the Chicago and Milwaukee (Wisconsin) areas. Just from this experience, my take is that Java and C#/.NET technologies are hottest right now.
My first job was using C and C++. This was partly due to historical reasons (the application was about 12 years old), but also because the API for the platform was only in C. Shortly before I came in, and during my tenure there, we were trying to move more towards C++ and build a more object-oriented framework. My current position is at a high-frequency trading firm. All our software is custom and mostly C++ (some C here and there, and a handful of Perl to glue things together).
So based on this experience, when I was looking for a job, I was focusing on C/C++ positions. What I found is that there aren't a lot of people looking for C/C++ developers. In Milwaukee, virtually all of the demand for C/C++ programmers was for embedded systems. In Chicago, there was little demand for experience in those languages outside of embedded systems and the finance industry (which I was/am trying to get out of!).
This is just my casual observation of a relatively small portion of the software engineering landscape as a whole.
On top of the diminishing demand for C/C++ programmers, I found that quite a few companies looking for Java/C# programmers wouldn't even consider C/C++ people. The languages aren't all that different, and the concepts should definitely be portable. I think knowing concepts, understanding programming ideas/patterns, problem solving, etc., matters more than knowing the specifics of a particular language. Shrug.
Natively-compiled languages
Score:
, Interesting)
by
radarsat1
( 786772 )
writes:
on Thursday April 24, 2008 @06:18PM (
#23190686
Homepage
I'd _like_ to stop using C++, frankly, but I don't seem to have a choice. A lot of my work depends on real-time capability, the kind of speed that is still only really possible on natively compiled languages that don't do dynamic typing.
I don't even mean hardcore real-time mechanical nanosecond control of knife-wielding deathbots, just simple This Must Run As Fast or Faster Than The Rate At Which It Will Be Converted To Analog. Python and Java still don't replace C in this area (mainly audio, video, and high-speed mechanical control). And when things get complex and you need object-oriented models to simplify the programming, there is unfortunately no real alternative to C++. Combine this with the fact that there are a bunch of great libraries out there written in C++ that would be very difficult to replace, and you're stuck with it.
(I sort of oscillate between liking C++ and hating it, but I'm preferring straight C more and more these days. But like I said, I don't always have the luxury of choice, depending on what libraries I need to use.)
All these other languages mentioned (Java, Python, Ruby, PHP, Perl, etc) do not compile to native code, and all do dynamic memory management. Hell, that's exactly what makes them *good*. But unfortunately they're not so good for real-time tasks.
For real-time, you need deterministic memory management, and native speed. I've been looking at some other languages that compile to native code these days, like
D
[digitalmars.com], or
Vala
[gnome.org], but I haven't really decided yet whether I can start using them on serious projects.
I'd really like to learn more about functional programming in this area, too, but there seem to be very few functional languages that are designed for real-time.
FAUST
[sourceforge.net] is one, but it's only for audio.
Anyone know any other good natively-compiled languages that actually have well-implemented modern features?
I wish it were possible to have a compiled version of Python, for example, but there are many dynamic features it depends on. (Some stuff could be done in Pyrex, which is a pretty cool little project, but so far I've only used it to make bindings to C libraries.)
The C/C++ language space needs to evolve
Score:
, Interesting)
by
CoughDropAddict
( 40792 )
writes:
on Thursday April 24, 2008 @06:31PM (
#23190892
Homepage
I am a die-hard C and C++ advocate. I consider it a high priority to make sure that the JVM and
.NET aren't the de facto future of all computing, which seems like more and more of a risk when you see things like
Singularity OS
[wikipedia.org], which is an OS where all application code
must
be managed code. These managed code people go nuts and think that everything should be managed.
The current generation of managed-code VMs clearly has some benefits, but they fall far short on some of the key properties that make C and C++ so powerful. Even if I grant you that the JVM and
.NET have caught up to C and C++ in speed (which I still don't believe has been demonstrated), it's undeniable that
VMs have comically bloated memory footprints: between 2x and 30x comparable C programs according to benchmarks:
JVM
[debian.org],
Mono
[debian.org]. Even if you consider memory cheap, smaller is
always
better because it means fewer bits flying over the bus and better cache utilization.
VMs stop the world to do garbage collection. Point me to all the articles you want that explain how "it's getting better" and "they've figured out how to make it real-time," but that doesn't change the fact that you're stopping all threads whenever you garbage collect, which is making your latency suffer.
C and C++ are the only game in town for getting the best performance
and
a small memory footprint
and
the ability to have the lowest possible latency.
That said, I think that C and C++ are becoming harder to justify when you consider the havoc that memory errors can wreak. It's highly embarrassing to vendors and damaging to their customers when a buffer overflow exploit is discovered. malloc and free, even when used correctly, can still have some forgotten downsides like the
memory fragmentation that was discovered in Firefox 2
[pavlov.net], and took some very smart people a lot of work to address.
What I would like to see is a language that gives the benefits of C and C++ (extremely fast, extremely small memory footprint, and no GC pauses) but that is also immune to C and C++'s weaknesses (memory corruption, memory leaks, memory fragmentation). Yep, I pretty much want to have my cake and eat it too. Why do I think this is possible? I think that the future is to have a fully concurrent, compacting GC. Everyone's telling us we're going to have more cores than we know what to do with soon, right? Well why not use all those extra cores to do GC in the background? Even if it's more expensive on the whole, we barely know what to do with all those extra cores as it is. With this strategy, you could get the performance guarantees and low overhead of C and C++ (on the real, non-GC thread, that is) without having to give up GC or suffer from memory fragmentation.
I'm also not willing to give up the option of dropping to C or C++ (or even assembly language) when it's justified. Mention JNI in a room of Java people and observe them reel in horror -- it's culturally shunned to deviate from "100% pure Java." Maybe this is a good value when you're on a big team of people writing a web app, but for systems and multimedia programming this is silly -- inner loops are inner loops, and some of them can benefit from machine-specific optimization.
Theoretically you could experiment with the fully concurrent GC using an existing language/runtime like Java, but I've sort of given up on the JVM and
.NET communities, because they have empirically demonstrated that they culturally have no regard for small memory footprint, low overhead, short startup time, etc. They just don't consider huge memory footprint or ridiculous startup times a problem. This is not to ment
Read the rest of this comment...
Re:so what?
Score:
, Insightful)
by
dreamchaser
( 49529 )
writes:
on Thursday April 24, 2008 @04:51PM (
#23189234
Homepage
Journal
I consider a proper coder to be anyone who can write a proper flowchart and the pseudo-code/logic for their target application. It has nothing to do with the language they finally use to implement.
That being said, I agree with you otherwise. The first thing I thought of when I read the summary was 'lazy coders' when garbage collection was cited as a driving factor. That's the sad fact; many of the kids being cranked out of schools today can't code their way out of a paper bag without a compiler/interpreter that does most of the dirty work for them.
Yeah I know. Get off my lawn.
Re:
Score:
, Insightful)
by
moderatorrater
( 1095745 )
writes:
The first thing I thought of when I read the summary was 'lazy coders' when garbage collection was cited as a driving factor
While I somewhat agree with you, there are two things that I think you're overlooking. First, there are going to be bad programmers no matter what you do. Someone can sound good in an interview and turn out to be awful. Until everyone realizes that and comes to the decision that the programmer in question should be fired, they're introducing code to the system. Or, even worse, they're not bad enough to fire, but bad enough that it could be a problem. These people will always be there, so you have to try an
That's what's missing from my angry-old-man rants!
Score:
, Interesting)
by
roystgnr
( 4015 )
writes:
roy AT stogners DOT org
on Thursday April 24, 2008 @05:24PM (
#23189718
Homepage
Who's going to bother listening to my "back in my day, we programmed uphill in the snow both ways" stories when I don't even bother to use a monospaced font!
And before I started up my 80x25 terminal window, I tied an onion to my belt, which was the style at the time.
Yeah. Much better.
Re:C++ is as good as C# _if_ used correctly.
Score:
, Insightful)
by
pclminion
( 145572 )
writes:
on Thursday April 24, 2008 @04:57PM (
#23189324
GC is available for C++, but IMHO it's inappropriate. One of the great advantages of C++ is that the construction/destruction mechanism, along with automatic variables, gives you absolute control over the lifetime of every single resource, whereas a garbage-collected language like Java gives you absolutely no control over when (if ever) an object is destroyed. I think it's a little wacky to give up this total control of object lifetimes in return for such a puny benefit, one which could easily be achieved through C++ resource-management techniques like RAII.
And anyway, garbage collection is irrelevant if you never "new" anything in the first place.
Re:C++ is as good as C# _if_ used correctly.
Score:
, Informative)
by
pclminion
( 145572 )
writes:
on Thursday April 24, 2008 @06:16PM (
#23190642
Programs which use STL containers instead of manual memory management are "trivial?" This is news to me.
Avoiding the use of "new" is not the same as avoiding dynamic allocation. You simply let the containers handle it for you. Yes, there are pointers flying around, but they are out of sight, and managed by code that actually does things properly for you.