January 25, 2009
What's the difference between a programming language and a scripting language? Is there even a difference at all? Larry Wall's epic "Programming is Hard, Let's Go Scripting" attempts to survey the scripting landscape and identify commonalities.
When you go out to so-called primitive tribes and analyze their languages, you find that structurally they're just about as complex as any other human language. Basically, you can say pretty much anything in any human language, if you work at it long enough. Human languages are Turing complete, as it were.
Human languages therefore differ not so much in what you can say but in what you must say. In English, you are forced to differentiate singular from plural. In Japanese, you don't have to distinguish singular from plural, but you do have to pick a specific level of politeness, taking into account not only your degree of respect for the person you're talking to, but also your degree of respect for the person or thing you're talking about.
So languages differ in what you're forced to say. Obviously, if your language forces you to say something, you can't be concise in that particular dimension using your language. Which brings us back to scripting.
How many ways are there for different scripting languages to be concise?
How many recipes for borscht are there in Russia?
Larry highlights the following axes of language design in his survey:
- Binding: Early or Late? (see the sketch just after this list)
- Dispatch: Single or Multiple?
- Evaluation: Eager or Lazy?
- Typology: Eager or Lazy?
- Structures: Limited or Rich?
- Symbolic or Wordy?
- Compile Time or Run Time?
- Declarational or Operational?
- Classes: Immutable or Mutable?
- Class-based or Prototype-based?
- Passive data, global consistency or Active data, local consistency?
- Encapsulation: by class? by time? by OS constructs? by GUI elements?
- Scoping: Syntactic, Semantic, or Pragmatic?
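To make the first of those axes concrete, here's a minimal sketch of late binding in Python (my example, not Larry's): the mapping from name to method is looked up at call time, not fixed at compile time.

    # Late binding: d.speak is resolved by name on every call.
    class Dog:
        def speak(self):
            return "woof"

    d = Dog()
    print(d.speak())  # woof

    # Rebind the method after the object already exists...
    Dog.speak = lambda self: "WOOF!"
    print(d.speak())  # WOOF! -- the new binding wins immediately

An early-bound language would have fixed the call target up front; a late-bound one happily lets you change its mind at runtime.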
It's difficult to talk about Larry Wall without pointing out that Perl 6 has been missing in action for a very long time. In this 2002 Slashdot interview with Larry, he talks about Perl 6 casually, like it's just around the corner. Sadly, it has yet to be released. That's not quite Duke Nukem Forever vaporware territory, but it's darn close.
While interesting, I have to admit that I have a problem with all this pontificating about the nature of scripting languages, and the endlessly delayed release of Perl 6. Aren't Mr. Wall's actions, on some level, contrary to the spirit of the very thing he's discussing? The essence of a scripting language is immediate gratification. They're Show, Don't Tell in action.
In fact, my first programming experiences didn't begin with a compile and link cycle. They began something like this:
As soon as you booted the computer, the first thing you were greeted with was that pesky blinking cursor. It's right there, inviting you.
C'mon. Type something. See what happens.
That's the ineffable, undeniable beauty of a scripting language. You don't need to read a giant Larry Wall article, or wait 8 years for Perl 6 to figure that out. It's right there in front of you. Literally. Try entering this in your browser's address bar:
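Any little JavaScript one-liner will do; for example:

    javascript:alert('Hello, world!')

No compiler, no linker, no waiting. Instant gratification.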
But it's not real programming, right?
My first experience with real programming was in high school. Armed with a purchased copy of the classic K&R book and a pirated C compiler for my Amiga 1000, I knew it was finally time to put my childish AmigaBASIC programs aside.
I remember that evening only vaguely (in my defense: I am old). My mom was throwing some kind of party downstairs, and one of the guests tried to draw me out of my room and be social. She was a very nice lady, with the best of intentions. I brandished my K&R book as a shield, holding it up and explaining to her: "No. You don't understand. This is important. I need to learn what's in this book." Tonight, I become a real programmer. And so I began.
What happened next was the eight unhappiest hours of my computing life. Between the painfully slow compile cycles and the torturous, unforgiving dance of pointers and memory allocation, I was almost ready to give up programming altogether. C wasn't for me, certainly. But I couldn't shake the nagging feeling that there was something altogether wrong with this type of programming. How could C suck all the carefree joy out of my stupid little AmigaBASIC adventures? This language took what I had known as programming and contorted it beyond recognition, into something stark and cruel.
I didn't know it then, but I sure do now. I hadn't been programming at all. I had been scripting.
I don't think my revulsion for C is something I need to apologize for. In fact, I think it's the other way around. I've just been waiting for the rest of the world to catch up to what I always knew.
Understanding why dynamic languages like Perl, Python, and PHP are so important is key to understanding the paradigm shift. Unlike applications from the previous paradigm, web applications are not released in one to three year cycles. They are updated every day, sometimes every hour. Rather than being finished paintings, they are sketches, continually being redrawn in response to new data.
In my talk, I compared web applications to Von Kempelen's famous hoax, the Mechanical Turk, a 1770 chess-playing machine with a man hidden inside. Web applications aren't a hoax, but like the Mechanical Turk, they do have a programmer inside. And that programmer is sketching away madly.
Now, I do appreciate and admire the seminal influence of C. In the right hands, it's an incredibly powerful tool. Every language has its place, and every programmer should choose the language that best fits their skillset and the task at hand.
I know, I know, I'll never be a real programmer. But I've come to terms with my limitations, because I'm a scripter at heart.
Posted by Jeff Atwood
@chromatic if it's not officially out, it's not out.
IMHO, Scripting languages are a subset of programming languages so the debate is moot.
It used to be that scripting languages drove other executable programs to get work done rather than doing it themselves - think about MS-DOS batch files, AmigaDOS script files, Unix shell scripts - those all call external programs to get work really done (but conversely, due to I/O redirection make such interfacing natural and almost part of the language). Of course, this definition isn't so useful or clear these days - but it's a starting point.
(I do, however, require Turing completeness, so I don't count HTML! :)
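That old glue-the-programs-together sense is easy to show even in a modern language. A minimal Python sketch (Python just as a stand-in, assuming a Unix-like system with ls and wc on the PATH):

    import subprocess

    # Scripting in the old sense: the real work is done by external
    # programs; the script just wires their outputs and inputs together.
    listing = subprocess.run(["ls"], capture_output=True, text=True)
    count = subprocess.run(["wc", "-l"], input=listing.stdout,
                           capture_output=True, text=True)
    print(count.stdout.strip(), "entries in the current directory")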
I would argue that the difference between scripts and programs is in their scope. A script is any program (in any language) that does a single simple task. A script may also be any code that is started and terminates every time it is used.
An application might be something that does more than one thing, stays in memory over many uses, handles many user inputs, and needs to be more stable. Thus a script has no input other than the command line.
A script can be written in any language, even in C/C++. The choice of language should not be because it compiles or not, it should be based on the merits and abilities of the language.
Here is just a tiny sample of the applications on C/C++'s resume:
- All major Internet search engines (Google, Yahoo)
- Amazon.com's back-end
- All major Microsoft/IBM/etc. applications.
- NASA's Jet Propulsion Laboratory
- All major operating systems (Windows, Unix, Linux, Mac)
- All major web browsers (IE, Firefox, Safari)
- All major web servers (IIS, Apache)
- All major databases (Oracle, SQL Server, MySQL, etc)
- All major word processors (MS Office, Open Office, WordPerfect)
- Most major development tools (Visual Studio, Rational Rose)
- Most commercial shrinkwrap games (Grand Theft Auto, World of Warcraft)
- All major 3D software (Maya)
- All major image editing software (Adobe Photoshop, Gimp)
- All major sound production software (SONAR, Pro Tools)
- Most voice/image-recognition software (Dragon)
If these types of applications aren't your thing, that's fine, but let's give credit where credit's due. C++ has succeeded for a reason, and that reason is not that programmers are masochistic elitist assholes.
C programming is neither harder nor better than (for example) .NET programming. If anything it's *easier*, because it has fewer parts.
If you can deal with all the complexities of a feature-rich language like C#, you can learn C in your sleep.
And that, Jeff, is why I've never quite understood your slightly-but-not-quite hostile stance towards C/C++. This is not an argument you need to make, it's not an argument you can win, and it's not an argument that would benefit you if you did win it.
So what gives?
These kinds of arguments about irrelevant distinctions just invite holy war. Why don't we argue about which programming language is superior?
Forth is the one true language that all others aspire to, because it encompasses everything from assembly code up through compiled and tokenized representations, to high-level interpreter.
If you really want to be a programmer, write a Forth system. In Forth. Then port it.
I'm a programmer at heart - while I have some ability to work with scripted languages, I'm used to the compiler picking out my errors as I do everything at once.
You're a real programmer if a comment says why this is done (i.e., you can explain why that piece of code is there). It doesn't matter which language it is, or whether it's interpreted or compiled.
So this is how it went:
1. Add a few statements to a script file in an attempt to fix a report bug.
2. Launch the app in debug using a custom-made debugger.
3. Wait 10 minutes to get at your first new statement.
4. Oops! THERE'S A TYPO IN YOUR CODE!
5. Fix typo.
6. Repeat steps 1 to 5 three more times.
7. Start actually running your code now that it's typo free.
8. See that your first naive attempt didn't quite do it.
9. Rework new code.
10. Repeat steps 1 to 9 three times.
About 10 to 20% of the development time was wasted fixing stupid shit a compiler would've caught. Yes, that's at least one entire afternoon per week of pure time wastage.
I realized then that the only people who would use JS to write a PC app would have to be RETARDS!
(Yes, I'm coming off harsh. But so would you if you had been there.)
I'm glad you wrote this piece. Because I'm a scripter too! :D
I followed a long way from Fortran, Pascal, Delphi and Java to finally discover Python. And you know this strip from XKCD, with import antigravity? That was my moment of relief 3 years ago.
Quote: "What happened next was the eight unhappiest hours of my computing life."
Hahahah, eight hours? Talk about it, I have a similar story, although I learned C later, on a 486. I learned assembler on the Amiga when I figured I wouldn't be able to make games and demos in Amiga BASIC.
So I spent a WEEK running a tiny program (10 lines or so!) whose sole purpose was to access one standard library and output a simple hello message in the CLI.
For one week I kept looking at these 10 lines trying to figure out what wasn't working. Eventually, I realized the name of the library in the string didn't use the same casing as the one in the book. Then the program ran... and I cried! I've never cried over programming since (just pulled hairs), but hey, I was 17 or so?
I must admit that I’m the complete opposite, I’ve never ‘scripted’ before (with the exception of 30 minutes of frustrated Python, screaming “why can’t I overload that function?”!).
If asked to write a short program I’d always reach for C#/Java, not Python etc. I suppose it’s what you’re used to and what you feel comfortable with!
I don't think the user really cares if it's scripted, they just want it to work. I think an important skill of a developer is knowing when to stop making things more complicated for yourself and just get the job done.
I've found 90% of the time, the differences between the technologies/approaches you use don't make a huge difference to the user. So if it doesn't matter that much, pick whatever you're comfortable with; the fact that you are more at home in a technology may provide a better solution than if you force yourself to use the 'correct' technology (within reason, obviously).
I'd argue that drawing a line in the sand and just saying:
on this side is scripting and on this side are the *real* programmers
is neither useful nor true. Like pretty much everything, it's a continuum.
It's not what you use, but how you use it.
Hi. I like your blog. I'm from Chile so I'm not very good at English.
Your article reminded me of the times when I had my Atari 800XL. I tried to write some Basic programs and it was very pleasing. Later, when I tried to make C programs on my first PC, I didn't understand anything. Only when I went to the university did I learn how to program with compiled languages. I was impressed by how much time you have to throw away to make a program that in Basic took me only five minutes.
I like to say that it doesn't matter what type of language one uses. It's more important that the program solves the problem than that you use mega-ultra design patterns.
I have to go. Bye. Keep writing!!
I agree with the majority of posts so far. If you are creating a system for others' use, then use whatever makes you the most productive, and ignore all the zealous 'real men program in brand X' crap.
On the other hand, if you are just goofing off, then pick your own poison. What doesn't kill you, makes you stronger.
I traversed from BASIC to C in much the same way as Jeff, but had a very different experience. The reason is that I had learned BBC Basic, which had a built-in assembler, and slightly more powerful indirection operators than PEEK and POKE. In short, by the time I approached C, I already had an intuitive understanding of what pointers were... and was frustrated by the pain of managing memory for myself (malloc was lovely by comparison).
Nevertheless, I also have to give a lot of credit to the book Illustrating C by Donald Alcock, which is the best introduction to pointers (and using pointers and pointees in algorithms) I have come across, and taught me far more confidence in the C landscape.
While Jeff went on to be a scripter, I spend my days writing device drivers and network stacks... in C. But if I was asked to write something higher level - well, there the more controlled, managed languages (be they scripting languages like python, or compile/run loop languages like Java) win every time.
I'm noticing here a lot of people complaining that some things don't work like their favourite tool. Which is a fine thing to do, if you are happy to use the hammer you know and love to remove the couplings from pipes because you don't get on with wrenches. Or if you're happy to limit yourself to hitting nails into things. Jeff doesn't use C, because Jeff doesn't write low level code (and it seems to me, he is perfectly happy about that).
There may come the day Jeff isn't able to solve some low level problem because he doesn't understand everything that is going on with the system below his code - but I get the same thing when bits of hardware don't work as expected... my Verilog is rusty and, for the most part, I'm happy about that.
Any scripting language IS a programming language.
Unless by programming language you mean having access to OS API.
Programming is the act of synthesizing operations, or is it not?
Wait a minute, wait a minute. Tell us more about the nice lady who tiptoed upstairs to woo you from the keyboard!
Why are developers such wieners? :P
You're pretty down on C all the way till the end, where you give it the worn 'use the best tool' pat on the back. It should be noted that in order to do anything with your scripting languages, you need some libraries that are written without them. You are a user-facing programmer, and you program stuff to look pretty and be nice to the user, but don't forget that your perspective on the grand world of programming is pretty darn well limited.
WebKit, Trident, Gecko: the things that provide everything you need to make things of beauty.
What about .NET or Java?... Your operating system?
What you've found is that it's pretty stupid to program user-facing programs in low-level languages. That's a Daily WTF 'no duh', except in some cases where portability is a big deal. Enjoy your scripting, and don't forget that for every hundred of you, there's a real programmer on whom your precious script depends.
The comments on the thread remind me what I hate most about working with computers: all the macho dimwits that think getting a computer to do something is hard. Wake up! Smell the coffee!
Try getting people to do things, try getting your girlfriend to make you happy ;0)
Most of you will spend the rest of your life in misery bashing out code or even worse, managing other coders because you don't realise how important it is to be able to communicate with people.
Jeff is trying to help, give him respect!
Running a quick unit test for a new method can give that instant gratification. Another reason to unit test :)
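For instance, here's what that gratification loop can look like with Python's built-in unittest (slugify is just a made-up method under test):

    import unittest

    def slugify(title):
        # The hypothetical new method: lowercase, spaces to dashes.
        return title.strip().lower().replace(" ", "-")

    class TestSlugify(unittest.TestCase):
        def test_basic(self):
            self.assertEqual(slugify("Hello World"), "hello-world")

    if __name__ == "__main__":
        unittest.main()  # a green dot in well under a second

Write the method, run the test, see it pass: no full application launch required.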
Please excuse me for going off-topic in order to shed some light on a few inaccuracies:
Perl 6 has been released for quite a while now. Still incomplete, but usable and true to the release early, release often paradigm. Also, as Larry Wall points out, Perl 6 is not his piece of work: "Perl 5 was my rewrite of Perl. I want Perl 6 to be the community's rewrite of Perl and of the community."
It has been in development for quite a while now, true, but most people really involved with it expected a big timeframe like this (i.e., ~10 years). After all, their goal is big: Perl 6 includes operators and language constructs not found in any other imperative/object oriented language (except Python, perhaps :) ), some of them comparatively new in language theory.
Moreover, the most convenient Perl 6 constructs leak into Perl 5, so if you're using 5.10, you're already partly in Perl 6 land.
It's always hard to draw the line between scripting and programming. Personally, I think the very nature of the two concepts makes the line impossible to draw. If we assume that to do programming you must do memory allocation, then almost nothing is programming anymore and almost everything is scripting.
For me, scripting has been more about telling programs what to do on a high level, whereas programming is about creating a set of instructions to solve a problem yourself. Scripts are data files loaded by some executable, whereas programs are the executable itself, which means that stuff like VB6 is especially blurry by my measures, as it can be interpreted or compiled...
IMO the true realisation is that all languages are programming languages, but you can use programming to produce powerful scripts. Hence scripting languages, in order to be powerful, should provide programming constructs and tools...
From my perspective you've confused high-level with scripting and low-level with real programming... it's all programming, just with different levels of abstraction.
'Unlike applications from the previous paradigm, web applications are not released in one to three year cycles. They are updated every day, sometimes every hour.'
Which is why scripting languages are simply too slow and unproductive for use in the modern world.
Sure, back in the 1980s, it might have taken 4 hours for some primitive 8 bit computer to check your program for correctness. But that was then - get over it.
Outside games and system programming, you don't need to manually track memory allocation because you don't trust the computer to do it fast enough. So languages without some form of automatic memory allocation are a niche market.
Similarly, scripting languages that report errors at run-time, instead of at typing-time, are pretty much a niche technology. The last edge-cases where you had to manually check your code because you didn't trust the computer to cross-check it fast enough went away a few years ago.
I suppose some dinosaurs haven't noticed yet...
Jeff bashes Perl. Film at 11.
No seriously, implementation work on Perl 6 did not begin until about 2005. There is now an implementation of Perl 6 on the Parrot VM, called Rakudo, which is unfinished but complete enough that it actually makes sense to start writing simple scripts in it.
C'mon Jeff. You wanted to be shown. Here's your chance, if you are actually serious:
Like so many things in life, the difference between scripting languages and proper languages is exaggerated. Just pick one you feel comfortable with.
It should allow you to be productive and not limit your freedom too much (a problem with quite many strongly-typed languages).
Interesting, these days, using Python feels a lot more like programming than Java... In Java all I seem to do is set up a bunch of config stuff for spring, os workflow, hibernate, kodo, etc...
4 years and a BS in Computer Science and all I ever do is use Perl. (shrug)
Jeff Atwood said: "I hadn't been programming at all. I had been scripting."
That makes absolutely no sense.
The actual grammar and rules that a computer language uses have no relation to whether it is easy to learn or hard.
And there is no distinction between scripting and non-scripting languages. These are all programming languages, and using them consists of programming.
The only difference is that some languages do certain extra things for you at the price of being slower. That's all.
One needs to use the right tool for the job.
This blog to me sounds like someone who has forgotten the ends for which he or she started to learn and master the means. Why do computers exist? To solve problems. Why do programming languages exist? To give us a way to tell the machine what to do. There are some problems for which a compiled language, like C/C++, is better. These would be problems where size and speed matter. There are problems where scripting languages are appropriate. These would be small problems for which building a C application is overkill, or problems for which the solution is web based, in which case portability is required.
Stop focusing on whether a language is compiled or interpreted. Start focusing on categorizing problems and determining which language is appropriate.
@Alex: Don't feel bad, I'm in the same boat. Very occasionally I see something else as useful, but rarely.
You hated C? How odd... I would have pegged you for a compiled-language kind of guy.
Me, I loved K&R C (even pre-ANSI).
Wrote all sorts of stuff with it.
Animation tools, Ray Tracers, and even for school work.
But I still spend MOST of my time doing Perl5.
Forced to put my finger on it, I'd say it's due to a mix of not having to compile, and having all the OS commands easily available.
Meanwhile, Perl 6 sounds a bit non-backwards-compatible from the small snippets I've read. That will greatly inhibit its adoption, though the long delay and the odd perception that it's a 'mature' language at this point don't help either. Still, the death of Perl is overstated.
Congratulations Jeff. It is always nice to finally discover one's self. Even if it means choosing a paradigm that goes against the popular grain.
"web applications ... are sketches, continually being redrawn in response to new data."
Some are. I hope Paypal is not.
The difference between programming and scripting is not the language but the mindset. If you approach your program as a one-off, throwaway kind of solution, you're scripting. If you don't handle errors in your program because you expect to be watching it every time it's run, you're scripting. If you approach your program as a thing that might be around for years and better work right even when you're not there, you're programming. If you're writing software that somebody else should be able to use without intervention from you, you're programming.
Neither scripting nor programming is wrong. They are only wrong in context. The most common, most dangerous mistake is when a script gets promoted to the status of a program. How many times has an Excel spreadsheet, created as a one-off, been turned into a core business artifact? How many times has that spreadsheet had a serious, silent bug in it?
It took me a long, long time to learn C. Five years, really, to be comfortable with the basics. And I still don't know it as well as I should.
As a child, I didn't like it for the same reasons you didn't: confusing compiler errors, having to think like a machine, putting time into fixing the *mechanism* rather than fixing the problem. No instant gratification. BASIC was much more fun.
But C is incredibly rewarding. Like anything that requires a lot of effort, there is a pay-off, and the pay-off is: when you write a line of C you have a pretty good idea of exactly what the machine is going to do. Not a completely clear idea, thanks to modern optimising compilers. But a pretty good idea.
And you'd be right to imply that knowing exactly what the machine is doing is completely irrelevant for most modern programmers. But there are niches*: systems programming (my niche), graphics (though this will eventually change), and -- of course -- someone's got to write the interpreters.
Ultimately it doesn't matter what the purpose is, because, for me, being a programmer is about being curious. Learn a few programming languages in the same way that you might learn Spanish, and for the same reasons: it may not help you out immediately, but it opens up your world a little more.
* I don't know if I agree with Joel's argument, that it helps you understand the performance implications of higher-level code, simply because the abstraction is too great. Sure, it's nice to know that repeated string concatenation in Python is slower than building a list and ''.join()-ing it, but the manual tells you that, and when you really need the knowledge (will re.find() be too slow over this data set?) knowing that the re module was written in C won't help.
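(For anyone who hasn't seen the idiom that footnote mentions, the contrast is roughly this:

    words = ["spam"] * 10_000

    # Repeated concatenation: each += may copy the whole string so far,
    # so the loop can degrade to quadratic time.
    s = ""
    for w in words:
        s += w

    # The idiomatic version: collect the pieces, join exactly once.
    t = "".join(words)
    assert s == t

Same result, very different growth curves on large inputs.)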
Quote: "What happened next was the eight unhappiest hours of my computing life."
How about passing from the GW-BASIC user manual as your first programming book straight into Advanced Assembly Language on the IBM PC as your second?
Beat that as a new meaning of unhappy hours ;)
Love your blog.
Chiming in re: Perl 6: No one seems to get what it is all about.
So to fully understand the scope of what's happening, here's an analogy: it's like the community is writing the Java language for the first time, only the language is much more complex and intricate than Java. Java also has its own virtual machine, so the community is inventing a virtual machine too. Not only does this virtual machine run the language it was developed for, but other languages as well. And it's not just for running other languages: other languages will be able to share modules (like, a Python script can use a Perl module) because of the virtual machine implementation. This is a big, huge deal. It's not portrayed by the community as such, because they realize it's going to take a long time, and don't want people to become overzealous over something that will be released 'next Christmas', as they say. They've received funding from the Mozilla Foundation and other great offers, but to continue the Java analogy, it's not like they have Sun sitting behind them. This is an amazing open source project, mind-boggling really, so if people can start clearing up the P.R. on this project (Perl people aren't generally good at P.R.: they just care about writing beautiful, jedi-master code), then that's a step in the right direction.
I have a very similar background. I wasn't a real programmer until I learned C and somehow passed algorithms.
"What happened next was the eight unhappiest hours of my computing life."
Ever wonder what might have happened if you had taken her up on it and gone downstairs? Probably the same thing only a day later I guess, but you never know.
"This language took what I had known as programming and contorted it beyond recognition, into something stark and cruel."
What you needed for a first compiled language was Turbo Pascal.
As a college student I usually think that learning C is important, but not fundamental. Actually I think you should learn a low level language, any language really. Pascal is just as good for that as C.
You don't need to be a kernel hacker. You need to know what is below the abstraction. You don't need to build a complete database in C, you need to know how to manipulate a B-tree in C/Pascal/whatever, you need to know how to deal with graphs and at least know the algorithms (but not have them on your head all the time).
In my experience, I had 3 semesters of hardware classes. I built my own 16-bit ALU using only logic gates. I built my own 8-bit processor (around 8 instructions) with logic gates, state machines and VHDL, and I found that almost as important as my data structures class taught in C.
In summary, you need to know what the abstraction you are working with actually does. You don't need to know how to build that abstraction all over again.
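In that spirit, a minimal sketch of the kind of below-the-abstraction exercise meant here -- a binary search tree (a plain BST rather than a full B-tree, for brevity), in Python rather than C/Pascal:

    class Node:
        def __init__(self, key):
            self.key = key
            self.left = None
            self.right = None

    def insert(root, key):
        # Walk down to the correct leaf position and hang the node there.
        if root is None:
            return Node(key)
        if key < root.key:
            root.left = insert(root.left, key)
        elif key > root.key:
            root.right = insert(root.right, key)
        return root

    def contains(root, key):
        while root is not None:
            if key == root.key:
                return True
            root = root.left if key < root.key else root.right
        return False

    root = None
    for k in [5, 2, 8, 1, 9]:
        root = insert(root, k)
    print(contains(root, 8), contains(root, 3))  # True False

The point isn't the language; it's knowing roughly what the dictionary you use every day is doing underneath.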
It is practically impossible to teach good programming to students that have had a prior exposure to BASIC: as potential programmers they are mentally mutilated beyond hope of regeneration.
- Edsger Dijkstra
I love C and C++, it's a pleasure to program with them.
You just had a very bad start with C, give it another chance! ; )
I don't get it. Are you trying to say that Mr. Wall can't speak about anything publicly until Perl 6 comes out? If not, what difference does it make how much he pontificates about the differences in scripting and traditional languages?
Another shitty fatwood blog entry. You used to be good.
C is hard. Let's go blogging!
What? I must be missing something here... the point at which compiling and linking occurs during the development cycle has a profound impact on what programming vs scripting is?
This is a non-argument, you could have just tweeted I like REPLs. I feel like you feel there's something more profound here... but I'm missing it somehow.
My programming path was BASIC-assembler-Pascal-FORTRAN-C. If you learn assembler first, learning C becomes a joy because you already know the concepts and can implement them more concisely. If you learn Pascal or FORTRAN first, learning C becomes a joy because C is far more productive.
Actually, my most favorite programming tool was a wire-wrap gun.
I'm an engineer at heart. Scripting languages give me hives - they tend to be so flexible that I can never figure out how a given problem is supposed to be done - either the syntax changes in strange ways (Perl), the language itself changes (AppleScript), or there's a strange mish-mash of standard and optional features (pretty much everything else).
I *like* static typing, which keeps me in tune with planning code instead of just writing it willy-nilly. I *love* designing data structures and object relationships - once you grok it, it's nearly impossible to go back to procedural cruft, or the sorry state of objects in most scripting languages. I *like* that variables, calling conventions, etc are always consistent and follow a clear grammar. I'm sorry, but I don't get any of that out of scripting languages.
Now memory allocation and pointer manipulation - that I can't say I miss terribly. You could achieve some very elegant and high performance code, but the ease of making catastrophic mistakes was just too high. Overall, I'd say I love Java most for its syntax similarity to C, its huge and flexible API, and lack of pointers and memory management. If only I could easily and simply write native code with it, rather than be shackled to a JVM.
It's all programming. Scripting doesn't require compilation. Having to go through compile/link cycles might make you a bit more careful, but the experience of just run and encounter the bug after you've trashed your data has a similar effect.
The barrier to entry might be lower with scripting languages, but at some point the complexity of the job forces you to either start becoming a real programmer or find another profession.
Learning C helps. A lot. But I also see young developers that have managed to become perfectly decent programmers without ever having touched anything else but scripting languages. Of course, those are exactly the guys that *want* to learn C, sooner or later...
"they are updated every day, sometimes every hour"
There are a lot of applications that need software that is validated before shipping. Would you want to be lying in a CT scanner knowing that they updated the software every hour? You tend to forget that there is more to programming than making a web application. Like others said, there are ways to structure a large application so that you don't need to rebuild it all to compile, link and test your code. For what I am doing, it often takes longer to reboot the application than it does to compile and link my DLL.
If you can't grok C, then you can't program. Period. You might be able to cite from memory which package you should be importing, or why one should be using System.IO.StringReader, but THAT'S NOT PROGRAMMING. I shouldn't have to wade through hundreds of links and spend 20 hours just to output to the damned console.
C is simple, and it forces you to think about indirection. It doesn't mask things from you or hide things from you, and most of all, it hurts less than a root canal - which isn't something I can say for some modern scripting languages.
[Giant 'I'm not trying to start a holy war here' sign]
You had an Amiga 1000. You tried learning C from the K&R, you failed, you turned out a Microsoft guy. Reading CodingHorror won't be the same from now on ;-)
(I think I've busted my floppy drive swapping disks trying to run Lattice-C on an unexpanded Amiga500).
So what computers are these screenshots of? I'd guess that the first is an Apple ][, the second an Atari 800(XL)?, and the third a C64. Right?
(Atari 600XL was my first computer. 16 KB RAM. Yay! And I also learned C on the Amiga and have now switched over to Python more or less exclusively - it's just a hobby for me, and I haven't had that much fun programming since I was a teenager...)
Ah, I see - the filenames of the images give it away...So at least I guessed right...
I disagree. For writing great programs in scripting languages you will need to have “real programming” knowledge.
You’ll write shit-script-code if you don’t know about data encapsulation: you won’t make use of powerful things like the object-oriented features of languages like Perl or Python. Not only in C, but in scripting languages too, you can introduce deadlocks and race conditions when you are running your code in multiple threads or processes. You will face the same problems maintaining hundreds of thousands of lines of script-language code as with the same amount of “real programming” language code.
'Scripting' *is* programming. There's no difference. There are no 'real' and 'fake' programming languages, if they let you solve problems and express algorithms, they're all real.
And to be frank, between a programmer that doesn't know what a pointer is but knows what higher-order functions are, and another that knows pointers but doesn't know higher-order functions, I'd pick the former. Pointers are just an implementation detail, in no way required to express algorithms. They're clumsy, unsafe, and frankly I don't care how much of a leet hacker they make you feel, they're still responsible for a myriad of unnecessary bugs in thousands of applications.
Is that what we want from our programmers? To solve the *same* problem again and again, instead of dealing with new, novel problems? Ok, so you know pointers. Great. Encapsulate them into a nice library implementing the most important data structures useful for coding (vectors, lists, stacks, dictionaries, tuples), and start programming using those structures, not messing every minute with every dirty pointer trick in the book, which leads into undefined behaviour and memory leaks 90% of the time. That's what 'scripting' languages do. They use the best bits of low-level languages, and then expose that functionality in a high-level environment. They're intended for human programmers, not human compilers.
And yeah, I know C.
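For what it's worth, the encapsulated version really is this short. A Python sketch of the structures listed above, plus a higher-order function, with no pointer in sight:

    # Growable array: no malloc, no realloc, no leak.
    vector = [1, 2, 3]
    vector.append(4)

    # Stack: push and pop on a plain list.
    stack = []
    stack.append("push me")
    top = stack.pop()

    # Hash table: allocation and resizing fully managed.
    dictionary = {"key": "value"}

    # Higher-order function: map takes a function as an argument.
    squares = list(map(lambda x: x * x, vector))
    print(squares)  # [1, 4, 9, 16]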
When I was a kid I read somewhere that C is a great and professional language, so I was really determined to try and learn it. That time I had a small 8-bit machine with a tape recorder as a storage device. As strange as it may sound, there was a C compiler for this technological wonder. It was called Hisoft C and trying to do anything with it was quite a torture. How to compile your program? Simple - rewind the tape to the beginning of stdio.h, press play. If you're lucky, the compiler will load stdio. Then it will start compiling the program. If there are no errors (and usually there were either syntax or tape loading errors), rewind the tape to the beginning of the standard lib, etc.
On the other hand, Hisoft Pascal was a very good compiler (considering the limitations) - the compiler + editor + program could fit nicely in 32KB of memory and I could run my programs without the constant tape rewinding.
I guess this is why I used to think C was made for rich people who can afford expensive computers with good disk drives, while for everyone else there is Pascal and Basic. :)
What do you think Perl, Python, and PHP are implemented in? C of course. And that, I think, is the niche where C really is the best choice: when you really need to write your own object system, or your own garbage collector. Those low-level bits that are needed for a language runtime.
The differentiation is more how it's used than the language itself.
There are C interpreters--all of them seem to say that their purpose is to add a scripting ability to your app--in that case, C is a scripting language; how could you possibly say it wasn't?
It used to be that compiled languages took longer to see a response after an edit; now almost any app in Eclipse runs the second you hit the button, so that's meaningless.
You can SORT of make a connection between compiled/not compiled and runtime speed, but that's really flaky (compiled languages tend to be about 100x faster from what I've seen), and that opens up the whole area of VMs, which can be nearly as fast as C (Java) or as slow as interpreted (old versions of BASIC; the tokenizing has to count as a VM as far as I can see). If a language is on the faster side of that range, it's almost certainly not going to run without a compilation step of some sort.
I really think the biggest reason to differentiate is so that programmers can inflate their egos by partitioning Them from Us to prove how much better we are than them.
I remember Mr. Atwood's out-of-character post at The Daily WTF, as it was inspiring, as were many of the other posts here that encourage one to improve one's skills. I keep hoping this blog will turn into something like Raymond Chen's Old New Thing, you know, this is why Windows is crufty, because we had 16 bit conventions and we liked keeping customers.
I'm not disappointed that this blog is different, but still it seems like you are swinging a baseball bat through a china shop with your metaphors, with commenters spinning off on trivialities. Ah, there he goes breaking another one. I find it appealing in a different way, like watching a car crash with a scripted explosion. ;-) uuuh.
WTF am I typing. Oh, Mr. Atwood should be jumping onto things like Clojure, because the promise is that low-level concurrency primitives are abstracted away (one has to discard assumptions about mutable programming though). Higher and higher we go, let go of memory control, now let go of concurrency control.
Mr. Atwood's main point is that he'd rather stay in the higher abstraction strata, with non-deterministic resource cleanup, getting things done, no earth-shattering kaboom. Which is perfectly fine; creating stackoverflow.com has garnered more better kudos (TM) for Mr. Atwood than this blog. (I am pressing the C++ tag multiple times a week. The answer is RAII).
I take umbrage at your revelation that your programming was mere scripting, though. Is this as deep as this blog will go? I feel like you're channeling a reminiscent Scotty, sitting on the mock-up bridge of the Enterprise on the ST:TNG Holodeck. No more tech manuals for you.
What happened to all those inspiring posts of old? :-/ Why not point the spotlight on what is up and coming in programming? Things like Clojure are a step towards in your awaited paradigm shift (OMG douse me with hype-suppression). Don't get stuck on your limits. Explore. And make sure to report back here.
Quit making lame excuses for not learning C!
The people who insist that programmers should know/learn C usually give some very specific reasons for it, which I don't think can be trivialized by redefining the problem away (as "I'm a scripter, not a programmer" does).
If the people insisting on C are actually using C to solve their day-to-day programming questions, then obviously you'd need to learn C to work with them. If they are insisting on C in a more general "if you know C you understand how the machine works" way, then figure out a way to explain that you do understand the machine without knowing C. For my part, I'd say: "I used to do assembly language on 6502/6510 chips. C is a higher level language than assembly."
BTW, have you heard of Windows PowerShell? It basically allows you to call .NET APIs just by typing commands in a command prompt. So you can have all the power of .NET right at your fingertips. And yes it allows scripting as well. Add P/Invoke support and it would be perfect.
Whether a script or a compiled file or a stack of punch cards, it's not the medium that makes one a programmer. You differentiate programmer from scripter where you mean to differentiate programmer from engineer (or architect). Just as you can write readable/unreadable code in any language, you can program or architect in any language, whether hand-assembled native instruction code or the highest-level scripting language. It's not the 'with', it's the 'what'.
It's funny, I've noticed that compiled-language programmers typically have a bit more disdain for interpreted-language programmers than vice versa. I'd argue against the idea that 'scripting' is somehow not 'real programming', as well as the notion that one type of language is 'more powerful' than the other - like most things, it's really a question of project scope and medium.
I work in a compiled language, but thanks to the features of my IDE, I get the quick turnaround of interpreted languages (most of the time).
Print Hello world!
Now that is a powerful computer language. It just does it, nothing fancy needed.
Both sides have advantages and disadvantages.
Interpreted stuff is an easier environment that holds your hand nicely and sort of guides you along, as long as slower execution times and having no idea what is going on under the hood isn't a problem for you.
Compiled stuff is a pain to get started in, but gives you the fastest code you can get without switching over to Assembler, and lets you shoot yourself in the foot in any number of exotic ways.
I myself switched from Basic to C three times before it stuck. I was using one of the first versions of a really old compiler (Computer Innovations? Desmet? I forget, it was the precursor to the one Microsoft eventually bought out to release as their own compiler) and my first program resulted in a Dorlop error. I had written such a lousy program that it bugged out and spit out the weird message Dorlop error, obviously some weird bug, but kind of funny none the less.
I kept trying to finish projects in C but kept switching back to my old Basica standby till I got a firm grasp on it, and was ultimately rewarded with a program that was 1/3 of the size and 10X faster than its Basic equivalent. I never looked back.
But it's really all just tools. A carpenter isn't going to build an entire house using only a hammer, and a programmer isn't going to build EVERYTHING in their career using only a single language - well, hopefully.
Need to create 1000 users on your Server? You're not going to write a C program to do that, you're going to script it. Likewise if you are building a raytracer or video game engine you won't script it.
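And the create-1000-users job really is only a few lines in a scripting language. A sketch in Python (assuming a Unix box with useradd, root privileges, and made-up user names):

    import subprocess

    # Drive the system's useradd tool once per account.
    # Sketch only: a real script would handle failures per user.
    for i in range(1000):
        subprocess.run(["useradd", f"user{i}"], check=True)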
The one thing I learned from those 8-hour hair-pulling lessons is that you will NEVER forget the lessons learned. Formal education teaches you how to do something (like setting up a compiled environment) but practical experience usually ends up teaching you how not to do something (and many ways at that). Each is equally valid.
I spent about 30 hours trying to draw a dot on the screen in Assembly language by reading listings from an old IBM PC technical reference BIOS listing. I finally learned about default number bases, and that 10H didn't mean 10D (10H is hexadecimal: 16 in decimal), and that different assemblers had different default number bases. And that one simple example ended up teaching me about 25 things instead of the original goal of drawing a dot on the screen.
I never forgot that, even 27 years later. Painful lessons tend to stay in memory much longer.
I had kind of the opposite experience. As a hobbyist I wanted to be immediately productive, so I started with QBASIC and VB6. But after mastering GOTO/IF-ELSE, I just couldn't come to terms with the QBASIC/VB syntax. Loops just seemed totally incomprehensible to me in BASIC/VB. VB was worse; nobody told me a GUI designer doesn't make learning programming any easier. I tried VB.NET too when it came out, and the result was the same. My programming ambitions didn't go much further after that.
In the first year of my CS Diploma, I just had to learn C. Till then I was put off by the idea of learning C/C++, as I always heard C/C++ is too hard and not very productive. Like you I started with the real C book, but what nobody tells you is that it isn't a book for someone who doesn't know programming before coming to C (even though the authors say in the preface that it's not an introductory programming manual). C syntax probably suited me a bit better, I guess, because this time I got switch, which I hadn't understood in BASIC syntax. But I was again having a hard time understanding loops. Fortunately I stumbled upon C Primer Plus, 5th Edition (www.amazon.com/Primer-Plus-5th-Stephen-Prata/dp/0672326965), which is an excellent introductory programming book that uses C as the teaching language (although at 900 pages and with C99 coverage it's a good reference book as well). I had little difficulty understanding loops with C Primer Plus, and programming seemed possible again. I will forever owe my programming career to that book. I didn't learn C very deeply, and nowadays I mostly do web programming. But the lessons I learned from C are still invaluable. So what matters is not the language you learn programming with, but whether the resources you are using to learn programming are the right ones for you.
I do hope to go through the K&R book once again, hoping this time I'll be able to appreciate it more like many others before me have. And early trauma does matter. I'm highly unlikely to ever touch any BASIC dialect again.
To make it really interesting, you have to also discuss whether HTML/CSS are actually scripting. This one gets some of my coworkers really fired up.
I agree with the masses... scripting and programming are pretty much the same thing. Some languages may be more or less powerful than others in certain respects, don't get me wrong, but you usually pay for that in some way or another. We shouldn't discriminate between programming and non-programming based on whether or not we use a compiler.
It's not like we're talking about the divide between markup language and scripting/programming language. Every time I hear someone say they're coding html, I giggle a little. :)
"Unlike applications from the previous paradigm, web applications are not released in one to three year cycles. They are updated every day, sometimes every hour."
When I read a thought like this, I'm reminded of the movie Metropolis and the workers slaved tightly to vast panels of knobs that they must turn to keep the machine going. Perhaps it is this aspect of web application development that is so frustrating to me; a project that is updating daily is a project that is never finished, and a project that is never finished keeps you from new projects.
Your experience from Basic to C was backwards. C was a big improvement over *assembly language*, a breath of fresh air for assembly programmers. It gave them much more power and expressiveness. But coming from Basic, it's definitely much harder and seems crazy. At the time, efficiency was critical so it was worth the move. Now, we have excess machine cycles enough that programmer efficiency is more important and so dynamic languages (a term I prefer to scripting) have come into their own.
I love this blog, but I have to say that I don't like this type of post, because they tend to imply that there is a difference between people who use scripting languages and people who use programming languages. They're the same. YOU ARE PROGRAMMING!!
And from personal experience: I am a C++ programmer, and I found that learning C/C++ first really helped when trying to learn other things, such as Lua or PHP - especially PHP, since the syntax is so similar. But that's just me.
I believe that a programmer should understand programming rather than understanding a language.
Sometimes I read your blog and just hear...
blah blah blah blah
I have seen far too many programmers fall in love with a particular language, and then try to use it where another language would be easier/faster/more appropriate. This leads to all sorts of cases that fall under the heading "If all you have is a hammer, then every problem looks like a nail." Yes, learning to program in C can be hard, but sometimes it is the best language for a given problem set.
In fact, any language is hard to learn when you don't know it yet. Don't develop Macho Programmer Mentality just because you are The Greatest Java Programmer Ever. The universe is wider than that. Scripting vs. compiled are just two sides of the same coin, and each fills a different niche. The true mark of a professional developer is the ability to learn a new tool when necessary, and the wisdom to recognize when to learn it, and when to apply it.
Where's the 'hey now Jeff' guy?
It really is too bad that those were the eight unhappiest hours in your computing life. Granted, learning C can be a trying and difficult process. Learning assembly, even more so. But for me, at least, it was also incredibly rewarding to actually come to an understanding of how the machine works, which is almost a prerequisite for C.
Sure, C is almost always the wrong language for getting things done (unless you're writing a kernel). But it's a hell of a way to get to know what's happening under the hood in your favorite script interpreter.
Big fan of Python _and_ C. I wouldn't write a driver in Python; I wouldn't write a database application in C (if I could help it). My experience was a lot like yours -- I still remember the day I was learning C and figured out that you had to allocate all your own memory.
The ego game that's played here between programmers and so-called real programmers is stupid. If it's your profession to create applications using whatever language suits you then you're a real programmer. Done.
So, what it looks like to me, Jeff, is that you were reading Larry's article and came out with two separate core ideas, each of which are actually only loosely related.
1. Scripting and programming are closely related, but different enough to warrant distinction.
2. Perl 6 appears to be an experiment in making scripting as powerful as possible, and it's a monumental effort that just keeps dragging on.
Point 1 is clearly the focus of your article, and in a way I agree with you. Scripting *is* programming, but it's got a different flavor and thus can be separated - and probably *should* be separated!
Larry's statement about human languages being Turing complete allows for an interesting way to describe this difference. Personally, I would think of programming as the type of writing used in a dictionary or encyclopedia - everything is spelled out and explained, in as common terms as possible. In contrast, scripting would be when shorthand has been developed and used to communicate things more succinctly, and more simply, perhaps for a specialized set of tasks, like in a hospital.
Programming:
- Dictionary entries
- Encyclopedia articles
- Non-technical essays
- Newspaper articles
- Training manuals
Scripting:
- Notation used in medical work
- Scientific terms and acronyms
- Legal terms and acronyms
- Computer terms and acronyms
- Slang
There's clearly some blurring here, but the point I'm trying to make is that certain fields develop shorthand that gets used to express common concepts more easily and succinctly.
Note that I included slang in my list of human language scripting. This is because slang terms are often highly loaded with meaning, compressed to a few syllables, and can have different meaning in different contexts.
So, in summary, I see programming as the act of getting your ideas across using a set of very general terms, building up meaning to communicate your point. Scripting is spitting out whatever terms and phrases communicate your point most quickly, and relying on the proper understanding of your terms on the other side.
Scripting isn't *necessarily* easier to learn, but it almost always is, because you can typically communicate your point more directly, at least, if you're using a scripting language whose terms help support your problem domain. (libraries help, too!)
Anyhow, that's enough brain-dump for now... I have a meeting to run to! :)
My second language was assembler. And I loved it. I then went into machine code, for the fun of it. I think I tried COBOL next, and couldn't stand reading a single chapter of the book I bought (no Internet then, kids). Next was Forth, and my greatest joy was modifying the compiler. I'm not sure exactly what came next... there was MUMPS, APL, Logo, Prolog, Lisp, Pascal and C, not necessarily in that order.
As you may imagine, I had no trouble at all with C. :-)
But I'll tell you one thing. After a prolonged period in which the only coding you do is scripting, it is a BITCH getting back to programming. :-) :-)
Thankfully, I'm doing so with Scala, which has rather enticing scripting qualities. :-)
Personally, I had no trouble at all with pointers and memory allocation. I guess that comes from learning C on a memory mapped system, where if you want to be able to even see anything, you have to mess with pointers. I mean, it's just another number to me, nothing inherently difficult about it. But I suppose it would be difficult shifting from BASIC to C. My friend did it, and now he's dropped BASIC altogether.
Also, many large C/C++ projects have daily builds. The sheer fact that you thought the compile time was slow may have also been due to slower computers - nowadays, the compile time is negligible, if you put your project together correctly.
C isn't real programming, it's dog slow and results in bloated code.
Now, get yourself a copy of the superfast AsmOne version 1.02 assembler and do some real coding!
It's too bad that there are basically only 3 computing cultures left, and that they're so similar: UNIX/C, Windows/C++, and Mac/NeXT/ObjC. For various historical reasons, these are all static compiled languages, and ended up migrating from the low-end to the high-end. The ITS/Lisp and PARC/Smalltalk cultures, for example, were every bit as real as programming environments, but more or less died out when PCs moved up into academia and research, and interactive HLLs died out and had to be reinvented as scripting languages on UNIX.
You're definitely not alone in feeling alienated by the languages they teach in school: http://www.trollope.org/scheme.html
Jeff, we've released a stable version of Rakudo (Perl 6 on Parrot) every month for the last fifteen months in a row. We'll continue to do so. You could have had immediate gratification with Perl 6 at any point from *November 2007* until today.
(I wrote my first working Perl 6 code in summer 2005, and it's been publicly available ever since.)
It's okay for you not to know about that, but it's not okay for you to fail to do basic research and pontificate as if you had.
I couldn't agree more!!!! (referring to Jeff's post, not any of the above comments)
Programming is hard, why waste time?
I remember that, for instance, in a particular tool I had to make a backup of a file before overwriting it. I ended up sys-calling xcopy to copy the file instead of the proper C way (back then, early 90s I guess, MSVC v7): reading it, one piece at a time, then writing it back to disk, one piece at a time, repeat until complete. Ouch!
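In a scripting language the proper way and the lazy way collapse into the same one-liner. A Python sketch (file names made up; shutil does the chunked read/write loop for you):

    import shutil

    # Back up the file before overwriting it: the whole piece-at-a-time
    # dance lives inside the standard library.
    shutil.copy2("report.dat", "report.dat.bak")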
Ummm, a program is some stuff that makes a computer do some stuff.
What absolute fucking horseshit.