Hard Work and Practice
December 23, 2008 2:57 PM
Tech publisher O'Reilly editors discuss the role of hard work and practice in programming and learning in general. "One aspect of learning programming that often eludes both students and teachers alike is the importance of practice, of actually working through all of these formal structures we teach. Most of our books, in a way, offer a promise of learning that avoids the slow repetition of practice."
the importance of practice, of actually working through all of these formal structures we teach.
I would agree if your plan is to become a software engineer. If your heart is already spoken for by some other enterprise, and you want to add programming to your toolbox, start off with a compilable chunk of code that does something you already understand and change a few parameters to watch what happens. You'll be learning advanced techniques when you find you need them.
posted by StickyCarpet at 3:05 PM on December 23, 2008 [1 favorite]
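A minimal sketch of the kind of starting point StickyCarpet describes, in Python; the script and its parameters are invented for illustration. Change a constant, re-run, and watch the output move:

```python
# A tiny, self-contained program that does something familiar:
# compound interest. Tweak PRINCIPAL, RATE, or YEARS and re-run.
PRINCIPAL = 1000.0   # starting balance
RATE = 0.05          # annual interest rate
YEARS = 10           # how long to let it grow

def balance_after(years, principal, rate):
    """Balance after compounding annually for the given number of years."""
    balance = principal
    for _ in range(years):
        balance *= 1 + rate
    return balance

for year in range(1, YEARS + 1):
    print(f"year {year:2d}: {balance_after(year, PRINCIPAL, RATE):10.2f}")
```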
I'm not sure I agree. Knowing what's possible is half the battle. Knowing how to do it, and when to do what is the other half. But once you know something can be done, and when it should be done, you now it.
posted by orthogonality at 3:18 PM on December 23, 2008
Er, know it.
posted by orthogonality at 3:18 PM on December 23, 2008
orthogonality, I'd argue that until you've actually used a technique for a few things, you probably don't know when it should be done, and there's a pretty good chance you don't actually know how to do it, either.
posted by hattifattener at 3:26 PM on December 23, 2008 [2 favorites]
she's gaming the cheat codes in Sims, without really realizing that she's actually using the programmer's interface into the program and beginning to learn LUA.
Write Lua Right, dammit.
Also: so what about those Head First books, eh?
posted by Monday, stony Monday at 3:36 PM on December 23, 2008
I'm not sure I agree. Knowing what's possible is half the battle. Knowing how to do it, and when to do what is the other half. But once you know something can be done, and when it should be done, you now it.
Er, right. But in order to get to that point, you have to practice. For a lot of people though, programming doesn't feel like "work", depending on what you're doing of course, but those people can end up spending an immense amount of time programming without realizing how much practice they're getting.
posted by delmoi at 3:36 PM on December 23, 2008
In theory, there is no difference between theory and practice, but in practice, there is. Discuss.
posted by jonmc at 3:45 PM on December 23, 2008 [5 favorites]
You mean I can't just learn C++ in 24 hours?
Practice is important, but as my music instructors said, perfect practice makes perfect. There's little point studying the same routines over and over again, the way we teach mathematics. Rather, I think challenging yourself on something you aren't fully familiar with is the means of improvement.
posted by pwnguin at 3:53 PM on December 23, 2008
Pff. I'll learn any new tech the same way I've learnt anything: being thrown in at the deep end with a bunch of broken code and being expected to make something of it.
Then I'll read up on what I've done later and realise what a miserable hack I am...
posted by Artw at 3:55 PM on December 23, 2008 [4 favorites]
I'm not sure I agree. Knowing what's possible is half the battle. Knowing how to do it, and when to do what is the other half. But once you know something can be done, and when it should be done, you [k]now it.
And then when the project finally comes where you know it is the time to do that thing that you know can be done, and you do it for the first time ever, with the book and 7 tutorials open while you code, and you are on a deadline, and the project gets shipped, and then a few weeks later you need to add a feature or maintain your code and you realize that you did not really understand all the implications and side effects of the code you wrote, and figure out that the examples you read looked so fucking great because they were meant to stand alone, but really suck in your real world case, that is when you will start wishing you had practiced when you first read about that thing that you know can be done, and will forever regret not doing so.
I would say that knowing what's possible is half the battle, knowing when to do what is the other half, and practicing before you have to use it in a real project is the two other halves.
posted by dirty lies at 3:59 PM on December 23, 2008 [4 favorites]
The difference between theory and practice is that, in theory, it doesn't matter.
posted by porpoise at 4:03 PM on December 23, 2008 [1 favorite]
I thought the whole point of computers was to avoid constant, repetitive things like practicing. And the purpose of APIs and abstraction is so you can avoid learning something that someone else, probably smarter than you, has already learned, and to keep you from screwing it up.
I don't agree with the above at all; there's a huge difference between someone who's learned theory and one who hasn't.
posted by meowzilla at 4:14 PM on December 23, 2008
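A hedged sketch of the point about abstraction, using Python's standard library: the hand-rolled version is the thing you'd otherwise have to practice (and debug); the built-in is someone else's practice, packaged.

```python
# Hand-rolled insertion sort: the kind of routine practice is made of.
def insertion_sort(items):
    items = list(items)
    for i in range(1, len(items)):
        j = i
        while j > 0 and items[j - 1] > items[j]:
            items[j - 1], items[j] = items[j], items[j - 1]
            j -= 1
    return items

# The abstraction: someone smarter already got the edge cases right.
data = [5, 3, 8, 1]
print(insertion_sort(data))  # [1, 3, 5, 8]
print(sorted(data))          # [1, 3, 5, 8]
```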
We're sitting here talking about practice, not a game, not a game, not a game, but we're talking about practice. Not the game that I go out there and die for and play every game like it's my last, but we're talking about practice, man. How silly is that?
posted by lukemeister at 4:15 PM on December 23, 2008
Artw: "Pff. I'll learn any new tech the same way I've learned anything: being thrown in at the deep end with a bunch of broken code and being expected to make something of it.
Then I'll read up on what I've done later and realize what a miserable hack I am..."
Artw, are you sure that you're not me? That's pretty much the story of my career.
posted by octothorpe at 4:26 PM on December 23, 2008
octothorpe - 'tis a sad story of broken dreams and harsh life as a mercenary in the software contractor armies of the world that's brought me to the point where I'm at now.
These days I'm devolting a certain portion of my week maintaining a site that was built in ASP. Not ASP.NET, straight up ASP. I can't help feeling that this was not really the way things were meant to go...
posted by Artw at 4:35 PM on December 23, 2008
But...practice could imply you were actually interested in deeply learning something so as to become a...*horrors*...Professional at something. And we know we can't have that kind of silliness.
posted by Thorzdad at 5:12 PM on December 23, 2008
> These days I'm devolting a certain portion of my week maintaining a site that was built in ASP.
devolt, |di'volt|, verb [trans], 1. give all or a large part of one's time to a revolting person, activity, or cause.
posted by sdodd at 5:22 PM on December 23, 2008 [4 favorites]
Knowing what's possible is half the battle.
Worrying about what's possible is the beginning of the end.
posted by StickyCarpet at 5:23 PM on December 23, 2008
In theory, there is no difference between theory and practice, but in practice, there is. Discuss.
When I was 9-12, this was my favorite "joke". So I ran it past my 9 year old the other night. He paused and said "...then that means that in theory there IS a difference between theory and practice."
As for the essay: Practice is actually the easy part of learning something new. The hard part is failing a bunch of times before you even get that far. The Go proverb is "Lose your first 50 games as fast as possible." For writers it's "Write one to throw away."
And that's for something where you even have a teacher to show you what you should be first failing and then practicing. If you are inventing a technique, first you have to find something that might work. Then fail at it. Then practice it.
posted by DU at 7:02 PM on December 23, 2008
Also, being thrown in and then reading up after is a great way to learn--in some ways much better than first reading and then attempting. (The best is to read, try and then read again.) By reading after, you see the explanations of the techniques or tools in light of the problems you faced and you're like OOOOOooooh, I see why they do it that way now.
posted by DU at 7:09 PM on December 23, 2008
I was going to make a Malcolm Gladwell vs. Sideshow Bob joke, but it's already been done.
posted by cjorgensen at 7:09 PM on December 23, 2008
I dunno. It sounds a lot like "get off my lawn!"
Coming from a guy who made his fortune writing Unix manuals, it seems there's a bit of bitterness that the command line is being marginalized and left behind, especially as the enterprise moves away from general-purpose servers and routers and towards single-purpose appliances, usually configured and administered with a GUI.
People don't want to practice formulating regedit strings or de-reffing pointers, because there are a gazillion more interesting things that could be done and essential things that need to be done with the time. People shell out cash for a nice Guitar Hero controller, because pretending to be a rock star with a few friends for the investment of a couple hours is more fun than actually spending years learning how to play well enough to bore people to tears with Third Eye Blind covers at parties, which you'll stop being invited to, anyway.
Computers are at their finest when they do all the scut-work so you don't have to. This includes programming, and it's shameful how deliberately difficult and opaque modern programming environments are.
Most programming languages, syntactically, are stuck in the 70s. Any time you need to memorize what the whole number-row of punctuation does and what bracket goes where to get anything done, you've failed. (LISP doesn't get a pass, because its formatting is insane.)
Why can't we have truly visual programming, with drag-n-drop subroutines and functions and objects, editing the schema like editing a pic in photoshop? Why can't we have an IDE that only requires an X-Box controller to code science and finance apps? Why can't we compile from a flow chart?
Limiting the programmer to the keyboard and an unbelievably primitive written-language simulation seems wasteful and needlessly difficult.
Practice is good... but practicing useful things without having to worry about practicing the stupid and inefficient stuff is sound practice.
posted by Slap*Happy at 7:27 PM on December 23, 2008 [2 favorites]
Not sure why, but this anecdote seems appropriate...
In the early years of this century, Steinmetz was brought to General Electric's facilities in Schenectady, New York. GE had encountered a performance problem with one of their huge electrical generators and had been absolutely unable to correct it. Steinmetz, a genius in his understanding of electromagnetic phenomena, was brought in as a consultant -- not a very common occurrence in those days, as it would be now.
Steinmetz also found the problem difficult to diagnose, but for some days he closeted himself with the generator, its engineering drawings, paper and pencil. At the end of this period, he emerged, confident that he knew how to correct the problem. Steinmetz was asked what his fee would be. Having no idea in the world what was appropriate, he replied with the absolutely unheard of answer that his fee was $1000.
Stunned, the GE bureaucracy then required him to submit a formally itemized invoice.
They soon received it. It included two items:
1. Marking chalk "X" on side of generator: $1.
2. Knowing where to mark chalk "X": $999.
posted by ZenMasterThis at 7:36 PM on December 23, 2008 [2 favorites]
my $5 was to fix some awful PHP code.
posted by zengargoyle at 7:37 PM on December 23, 2008
Who needs practice? I know Ruby and it's a self-documenting language. Watch this...!
/grind
posted by ryoshu at 7:52 PM on December 23, 2008
Why can't we have truly visual programming, with drag-n-drop subroutines and functions and objects, editing the schema like editing a pic in photoshop?
It's interesting to note that electronic design has moved in exactly the opposite direction. Nobody does digital design using schematics anymore, it's all done in a text editor.
posted by ryanrs at 8:17 PM on December 23, 2008 [5 favorites]
Most programming languages, syntactically, are stuck in the 70s. Any time you need to memorize what the whole number-row of punctuation does and what bracket goes where to get anything done, you've failed.
Or you can go back to the 50s and use something with a lot less punctuation like COBOL.
Why can't we have truly visual programming, with drag-n-drop subroutines and functions and objects, editing the schema like editing a pic in photoshop? Why can't we have an IDE that only requires an X-Box controller to code science and finance apps? Why can't we compile from a flow chart?
People keep trying stuff like this and it doesn't usually catch on, with some niche exceptions like LabVIEW. It sounds like a nice idea but a lot of programming is in the details, and a visual programming environment can actually make that more difficult.
posted by grouse at 8:21 PM on December 23, 2008 [1 favorite]
Slap*Happy-
For software to behave, it must be specified. For software to be specified, decisions must be made. Those decisions may be made implicitly by the language environment, or explicitly by the programmers.
High-level languages are useful because they make certain decisions implicitly that most programmers would like someone else to take care of anyway. But what of those decisions that the programmer must make? Should those decisions be specified textually or graphically?
In my opinion, the best answer to that depends on the number of decisions to be made. If there are few decisions to be made, then graphical methods should be fine. But if there are many decisions to be made, then textual methods may be better. This is because textual methods can reliably represent many more decisions within a particular amount of screen space.
Of course, one could use variations on the level of individual pixels to specify decisions. But you don't actually want to program that way; it would no longer be intelligible. Instead, decisions should be specified in discretely identifiable chunks. Claude Shannon had a name for those... ah, yes, "symbols". Thankfully, most of us have a board with about 100 different symbols on it right in front of us; we can invoke these symbols with the motion of a finger or two at great speed. And each of those symbols has a pretty compact representation on-screen, too.
Could you explain why representing programs as flowcharts would be better?
posted by a snickering nuthatch at 8:36 PM on December 23, 2008 [4 favorites]
If anyone is inspired by this article to learn (or perfect) a programming language and their problem solving skills, I highly recommend Project Euler (previously), which consists of a series of problems, in ascending order of difficulty, that require clever programming to solve. I'm using it to learn Python.
posted by limon at 8:44 PM on December 23, 2008 [7 favorites]
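For a taste of the format, the first Project Euler problem asks for the sum of all multiples of 3 or 5 below 1000. One straightforward Python take (of many possible):

```python
# Project Euler, problem 1: sum the natural numbers below 1000
# that are multiples of 3 or 5.
def sum_multiples(limit=1000):
    return sum(n for n in range(limit) if n % 3 == 0 or n % 5 == 0)

print(sum_multiples())  # 233168
```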
Slap*Happy, those kinds of visual programming languages with drag and drop have been done -- I know, I did one for my PhD thesis many years ago. (Modesty forbids a link. Besides, I'm not too proud of it.)
The thing is, real-world problems are simply too large for visual languages. As the old joke goes, text-based programming languages are the worst kind except for everything else that's been tried. I'm sure one day we'll come up with a better model of computation that will require a new kind of programming, anything's possible. Don't forget that the development of language was the key step in human evolution, and written communication was the key advantage that allowed the Romans to become so powerful. My own opinion is that text-based programming languages are the ideal fit with the human mind -- I say that as a fan of LISP, Haskell, Prolog, and Icon. (And C of course.)
posted by phliar at 9:16 PM on December 23, 2008 [1 favorite]
Dijkstra had some interesting ideas about learning computer science. It's quite a timeless essay.
posted by breath at 9:25 PM on December 23, 2008 [4 favorites]
In theory, there is no difference between theory and practice, but in practice, there is.
True, but the difference between theory and practice is a lot bigger in theory than it is in practice.
On the other hand, the difference between the difference between theory and practice in theory and the difference between theory and practice in practice is a lot bigger in practice than it is in theory.
Then again, the difference between the difference between the difference between theory and practice in theory and the difference between theory and practice in practice in theory and the difference between the difference between theory and practice in theory and the difference between theory and practice in practice in practice is much bigger in theory than it is in practice.
However, the dif<HED ASPLODE/>
posted by erniepan at 9:39 PM on December 23, 2008
Slap*Happy, take a look at Authorware as it is a fairly polished example of what you seek.
It was designed as a way to create courseware but is powerful enough to create decently complex desktop software.
Programs are designed visually, with a nested flowchart as the development area. Interactions, if/then decision trees, frameworks, etc. are all created visually by dragging icons into place.
Originally, in Authorware's design, the icons were the only way to create programs, but later versions added "calculation" icons which allowed you to embed code for those frequent circumstances when the prebuilt interactions weren't enough.
Authorware was my first exposure to anything involving programming and I ended up working with it for around 10 years. I originally started out using just the icons for simple course development, then started adding snippets of code, and then by the end was designing projects that were almost pure code and used relatively few icons for organizational purposes.
So you can see I ended up evolving / devolving to the 70s style you mention because the visual environment alone was too limiting for anything more complex than a simple quiz application.
However, something like Authorware is still terribly useful for learning concepts and ideas about how to build an application, especially for more visually oriented people such as myself. And it's incredibly easy to get started. I've seen some tech-phobic educators get a decent enough grasp of the language with just a couple of days of training to start building their own courseware. I can't think of any other environment which would have allowed this.
You should download the trial to see how this visual development works in practice.
posted by pandaharma at 9:50 PM on December 23, 2008 [1 favorite]
Revision control is much easier with text. This is hugely important in real-world projects.
posted by ryanrs at 9:55 PM on December 23, 2008
I can attest to having relearned the concept of closures going on a dozen times now, and it still eludes me as concrete, recoverable knowledge and understanding. I get it, but I don't *get it*, know what I mean?
[it sucks to be a curious dullard]
posted by Jeremy at 10:02 PM on December 23, 2008
I suspect some people think text-based programming is an example of humans modifying their behavior to suit the machine. This is not the case. Language is a defining characteristic, perhaps the defining characteristic of human intelligence. It is the essential, universal means of communicating our thoughts.
I shudder to think where we would be without words.
posted by ryanrs at 10:10 PM on December 23, 2008 [1 favorite]
Slap*Happy, everything you say is just made of wrong.
posted by mr. strange at 12:52 AM on December 24, 2008
jeremy: the trick for me was implementing something like a closure in a language that lacked them with the idea of what a closure is fresh in my mind. As an aside/hint: most of the "object oriented" code out there is just closures and prototyping.
posted by idiopath at 4:43 AM on December 24, 2008 [1 favorite]
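For anyone in Jeremy's position, a small Python sketch of the equivalence idiopath is gesturing at: a closure and a tiny class carrying the same state. The names are illustrative only.

```python
# A closure: make_counter returns a function that keeps `count` alive
# between calls, even after make_counter itself has returned.
def make_counter(start=0):
    count = start
    def increment():
        nonlocal count   # let the inner function rebind the captured variable
        count += 1
        return count
    return increment

# The same behavior as an object: the instance attribute plays the
# role the captured variable played above.
class Counter:
    def __init__(self, start=0):
        self.count = start
    def increment(self):
        self.count += 1
        return self.count

c = make_counter()
obj = Counter()
print(c(), c())                          # 1 2
print(obj.increment(), obj.increment())  # 1 2
```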
I'm using it to learn Python.
I used it to learn Python and then I used it again to learn Haskell. I wouldn't say I perfected either one, though. Or even mastered. Maybe familiarized.
posted by DU at 5:09 AM on December 24, 2008
This is going to be a Scott McCloud "I didn't say anything" moment for some of you:
The computer, by way of the compiler or interpreter, does not understand words as words. It's a lump of doped silicon - it doesn't "understand" period, full stop.
What you do with programming - it's not language. Just like hitting the space bar to fire and the alt button to reload in a Flash game is not operating a gun.
A programming language is a metaphor humans use for themselves when controlling a machine, and one that seems long past its prime.
Shouldn't a language, by definition, you know, communicate? These language simulations are so poor and primitive, one programmer cannot figure out what another is doing without either extensive documentation or extensive study - worse, they can't figure out what they're doing themselves without a good debugger, lots of extra code that's otherwise useless except for debugging, and lots of trial-and-error testing. Hardly the leap forward in human evolution some would claim.
And don't get me started on the cruftified, obfuscatory mess mathematics symbology and process has become, or we'll be here all week.
posted by Slap*Happy at 5:38 AM on December 24, 2008
Shouldn't a language, by definition, you know, communicate?
A friend of mine says that "well-written code documents itself". I wonder if what it communicates depends also on the skill level of the programmer reading it.
posted by Blazecock Pileon at 7:08 AM on December 24, 2008
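A hedged before/after of the "documents itself" claim, with invented names and data: both functions do the same thing; the second tells you what it's for, though a reader still needs to know why 30 days is the cutoff.

```python
from datetime import date, timedelta

# Version 1: correct, but the reader has to reverse-engineer the intent.
def f(a, d):
    return [x for x in a if (d - x["last_login"]).days > 30]

# Version 2: the same logic, with names doing the documenting.
STALE_AFTER = timedelta(days=30)

def find_stale_accounts(accounts, today):
    """Accounts whose last login is older than the staleness cutoff."""
    return [acct for acct in accounts
            if today - acct["last_login"] > STALE_AFTER]

accounts = [
    {"name": "ada",   "last_login": date(2008, 10, 1)},
    {"name": "brian", "last_login": date(2008, 12, 20)},
]
print(f(accounts, date(2008, 12, 24)))                    # ada only
print(find_stale_accounts(accounts, date(2008, 12, 24)))  # same result
```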
There's a good article, No Silver Bullet: Essence and Accidents of Software Engineering, that gives arguments for why the type of thing Slap*Happy is looking for will likely never come about.
posted by Green With You at 7:52 AM on December 24, 2008
Slap*Happy, since you haven't really engaged with the responses to your previous missive, I'm a bit reluctant to keep responding to you but I will say this.
It seems you have some frustrations with programming, and want to blame it on the programming languages. But you don't seem to understand that there are some things about programming that are just difficult, and they'll be difficult no matter what language you do it in.
Mathematical notation is a good thing to bring up here. While I agree that it can sometimes be obfuscatory and used in inappropriate ways, it doesn't mean that lots of people would be capable of doing very advanced mathematics if only the notation were different.
posted by grouse at 8:02 AM on December 24, 2008 [1 favorite]
In writing code, I think, efficiency is generally the enemy of readability - and vice versa. Readable code, the kind that is "self-documenting", if you will, is code that makes what it does explicit, step-at-a-time, whereas every technique I can think of to optimize code involves doing more than one thing at once, or, worse yet, not doing something (because it's technically unnecessary), or not doing something right now (because it's more efficient to defer it, when it might not get done at all, like with lazy evaluation, or because it's more efficient to do it earlier, like a pre-computed table of lookups, or whatever), or using a more subtle algorithm, or a more convoluted data structure.
All of these techniques require the reader to understand much more about the context in which something is happening - the why, as well as the what, if you like - and that makes it harder to grok what's going on at a glance. Code that runs faster is seen as elegant, and in the sense of being more intellectually satisfying, it is - but, in my experience, it's very rare to find that more efficient, more elegant or more compact code is more readable or easier to understand. So how elegant is code that you can't maintain? There can be a lot of egotism involved in coding - the viewpoint that "hey, it was hard to write this code, it should be hard to read :)" - and that manifests itself as excluding the n00b, that the code is only understandable to an Ascended Master, and that if you can't read and understand it, well, f*ck you. The problem is that code is written once, but has to be read and understood by other programmers a thousand times.
posted by kcds at 8:14 AM on December 24, 2008 [1 favorite]
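A small Python illustration of the trade-off kcds describes, with made-up code: both versions compute the same thing, but the precomputed table asks the reader to notice when it is built and what it covers.

```python
# Readable version: does what it says, every time it is asked.
def is_vowel(ch):
    return ch.lower() in "aeiou"

# "Optimized" version: a lookup table precomputed at import time.
# Faster in a tight loop, but the reader now has to understand why the
# table exists and that it only covers single characters 0-255.
_VOWEL_TABLE = [chr(i).lower() in "aeiou" for i in range(256)]

def is_vowel_fast(ch):
    return _VOWEL_TABLE[ord(ch)]

text = "Practice makes perfect"
print(sum(is_vowel(c) for c in text))       # 7
print(sum(is_vowel_fast(c) for c in text))  # 7
```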
But you don't seem to understand that there are some things about programming that are just difficult, and they'll be difficult no matter what language you do it in.
That's the point - stop trying to do it with a language (that isn't a language at all, but a virtual-world simulation of a language. Done poorly.) and do it with something else.
To further irritate you with digression, let's assume that you can't create a useful, modern program with a tool that uses a Wii stick and 5.1 surround sound. (I'm not convinced.) If you're so hell-bent on keeping the pseudo-languages and the punctuation-soup approach to programming, why not smarten up the metaphor? Have the IDE interact with you, switching language on the fly - from a functional language when you need to do lots of parallel processing, to a simple text parser when you want to validate user input, to a macro language for getting the pieces to work together.
There is a ton of solid research and prior art on human interface, little of it applied to the act of creating a program. Progressive disclosure? Hello? The modern GUI text-editor/IDE is a nice step forward, with color coding, syntax checking and the like, but it's not being taken to the next level. Why not develop a new programming language system to work with modern IDE design?
And I profoundly disagree with you on the math part: computer languages are bad enough without having to translate an equation into perl and back again. All math should be expressed as pseudocode or actual code - but that's neither here nor there.
posted by Slap*Happy at 9:01 AM on December 24, 2008
That's the point - stop trying to do it with a language... and do it with something else.
With what? You keep railing against currently popular techniques without understanding that the alternatives you propose have been tried for decades. They haven't caught on in a big way, and there is a reason for that. Try Authorware or AppleScript or LabVIEW or Yahoo Pipes. You might like them (and they definitely have their uses) or you might begin to realize why they aren't a panacea. You definitely will notice the problems if you try to do anything complex.
Or you can ignore these existing attempts and spout off non sequiturs about how a programming language "isn't a language" and how you want to program with 5.1 surround sound, whatever that means.
And I profoundly disagree with you on the math part: computer languages are bad enough without having to translate an equation into perl and back again.
Math is hard enough without having to do it in Perl. (And if your hate of memorizing "what the whole number-row of punctuation does" etc. springs from use of Perl, then you should stop using Perl.)
posted by grouse at 9:29 AM on December 24, 2008 [1 favorite]
Why not develop a new programming language system to work with modern IDE design?
Will I have to sacrifice all my speed, efficiency and precision with a keyboard and rely on mousing around, speaking in tongues and interpretative dance or whatever in this new system?
Once you know what you are doing, because you have taken the time to practice (which was kind of the point of the linked article), a keyboard usually blows everything else out of the water.
posted by ghost of a past number at 9:41 AM on December 24, 2008
there are some things about programming that are just difficult, and they'll be difficult no matter what language you do it in.
I have had similar discussions with my friends about the study of math. For decades, people have been trying to come up with "better" ways for students to learn math which, while interesting, don't seem to have paid dividends in terms of getting more students to learn math more quickly and more easily. At some point, the language of mathematics is like a foreign spoken language -- it's difficult and you have to spend a certain amount of time doing it over and over again before you catch on, and there's no shortcut around this.
Slap*Happy is making a relatively common mistake of the reasonably intelligent -- coming up with an interesting solution to a perceived problem without realizing that things are the way they are for a good reason. We've spent a couple of decades on CASE tools, too, which, as Slap*Happy proposed, really did compile code out of flowcharts. They didn't catch on because they took more time to use than writing code did, with little advantage in terms of reusability and integrity. We write computer programs in text using made-up, highly specified languages because it's easier and faster and better to do it that way compared to the alternatives.
posted by deanc at 10:11 AM on December 24, 2008 [5 favorites]
Maybe we should all learn Piet and be done with it.
posted by Monday, stony Monday at 10:17 AM on December 24, 2008
Every so often I get forced into using some kind of visual tool to do stuff - recently for me it was doing some Cocoa development, where you need to use the UI editor to do hookups. And, as ever, I found it a complete pain in the ass that got in the way of doing things. I'm pretty sure that anyone who does Cocoa or iPhone dev work seriously finds ways to do things in code and never has to do hookups that way.
(Objective C was also spawned by Satan, but that's a different thing entirely... )
posted by Artw at 10:24 AM on December 24, 2008
I'm pretty sure that anyone who does Cocoa or iPhone dev work seriously finds ways to do things in code and never have to do hookups that way.
No, doing it that way is pretty much the point of Cocoa.
posted by bonaldi at 11:11 AM on December 24, 2008
Actually, a lot of the early iPhone example code, particularly the Apple stuff, seemed to do without it and hook things up programmatically. I never did quite get the knack of how they were doing that, though.
posted by Artw at 12:20 PM on December 24, 2008
Slap*Happy, what you're saying is a bit like telling a writer, "It's inexcusable that English uses so damn many WORDS and so much GRAMMAR, and this artificial, restricted set of LETTERS." At some point, when it comes time to turn your awesome idea into something others can appreciate, your idea must be encoded into a series of symbols that have meaning. That's called a language.
A programming language is a series of symbols used to describe a computer program, and how it operates. If you want to describe your program using thumb-twitches with a Wiimote, that's cool, and that's fine. But you haven't eliminated programming languages, any more than someone who communicates in morse code has "eliminated language". You've just restricted the set of possible symbols to the number of different kinds of gestures you can make with a wiimote.
I'll agree that it's inexcusable that performing so many tasks with a computer requires learning a special purpose language. The solution is to identify the most common tasks and do the work of building the software that simplifies those tasks, rather than making people learn to program. Turns out, that's what the software industry does.
What you're saying, in essence, is that you want other people to implement your ideas for you. Again, nothing wrong with that -- but you're not talking about simplifying programming. You're just asking programmers to build more stuff for you.
posted by verb at 1:07 PM on December 24, 2008 [2 favorites]
> A friend of mine says that "well-written code documents itself". I wonder if what it communicates depends also on the skill level of the programmer reading it.
The best analogy is that well-written code is like a well-written story.
Just as a well-written story has a story arc, characters, and a general structure to how it operates that, as a reader, you can pick up going through it, well-written code has variables, threads, and formatting that make it easy to understand what is going on and how things are working.
A well-written program can be complex, and it will take someone with a higher skill level to be able to understand it, but there is a difference between someone with little experience going "I can see where this is going, I just don't fully grasp it" and someone with experience going "this entire chapter is against character and out of place for the tone of the story and makes it hard to know why it was even there, or what purpose it serves."
posted by mrzarquon at 1:07 PM on December 24, 2008
I wrote a program in Guitar Hero. It goes red red green blue yelloooooooooooow!
posted by electroboy at 1:09 PM on December 24, 2008 [1 favorite]
I wrote a program in Guitar Hero. It goes red red green blue yelloooooooooooow!
Careful, on expert mode you have to figure out horizontal scaling.
posted by verb at 1:17 PM on December 24, 2008
Actually, the more I think about this the better it sounds. Maybe there SHOULD be a game that makes programming easy, just like Guitar Hero makes playing the guitar easy.
We could call it "Code Hero," and it would come with a wireless keyboard with six big colorful buttons. The game would take place in a hip 1999 era .com office, with beanbag chairs and iMacs and a hip marketing guy who gets you "gigs." When a client wants "software," you would have to hit the buttons on your keyboard in time with a changing requirements document, and if you miss the buttons your software would get buggy.
If you matched the pattern correctly, though, you'd start to gain 'hack power,' which you could trigger to get stock options. If you get to the end of the requirements document without missing too many keys, you ship, and there's a big release party and then you see fictional clippings from industry magazines talking about your new product.
It would be fun, and energizing, and not at all complicated and full of arcane syntax like programming is today.
posted by verb at 1:54 PM on December 24, 2008 [3 favorites]
Actually a lot of the early iPhone example code, particularly the Apple stuff, seemed to do without it and hook things up programmatically. I never did quite get the knack of how they were doing that, though.
Yeh, though that's because Interface Builder hadn't been updated for the iPhone yet. Now that it is, virtually everything that has a real UI is coming via IB. I know people who've done extensive testing, and there's a speed penalty, but it's fractional compared to what you'd expect.
That's not to say you can't get by without IB, but these people (not saying you're one of them!) who mistake it for a code generator are way off base. It's integral to Cocoa.
posted by bonaldi at 2:21 PM on December 24, 2008
One of the problems with "conversational" or visual programming is that it is hard to be formal in such systems, and formality is a _requirement_ of programming.
This is also true for math, and it is why both use symbols instead of a conversational style. The content must be precise and unambiguous, and the reality is that humans have found that precise symbol-based systems work best for this sort of thing.
Of course, you can have a simple, precise graphical language (think LOGO). But in order to even approximate the complexity of a "real" program, you would quickly have a ridiculously complex graphical system which would be much harder to understand.
Once you learn the symbols (in math, or in programming, which is really an aspect of math), these systems really aren't that arcane, and they provide a clarity of specification which English cannot reach and which graphical systems are not compact enough to match.
In addition, the higher-level languages do things in a generic way to make this sort of thing work, which necessarily means they can't be optimized for a very specific flow of code. So if you're writing a very lightweight application, this is fine. If you're writing a high-performance or massively scalable application, this does not work as well. But even a high-level language like, say, Python, is much more adapted to this than a graphical system could be.
posted by wildcrdj at 2:39 PM on December 24, 2008 [2 favorites]
Ellen Ullman writing in 1998 on the dumbing down of programming.
posted by needled at 7:25 PM on December 24, 2008 [1 favorite]
I've thought a lot about visual programming. One idea I've seen that may be useful is Turing incompleteness. Essentially, if you restrict the flowchart to a DAG, reasoning about local properties is sufficient to prove a program's properties. Not all visual programming languages support this idea, and traditional programmers may have difficulty instantly recognizing and coping with the barrier. However, this is exactly the sort of thing that spreadsheets do, and people you wouldn't ordinarily let anywhere near a compiler LOVE spreadsheets. Too much, in fact; spreadsheets are terrible in a couple of senses: their organization hides what they calculate, and people force them into row-based systems.
One example of this kind of approach done right (well, maybe just not super appallingly bad) is Yahoo Pipes. What I like about tools like Pipes isn't the departure from textual approaches, but that they visually model the kinds of programs I actually need: for example, one that monitors TokyoTosho's RSS feed and pulls out fansubs for a specific show by a specific group. I don't need to read Plagger's API and familiarize myself with Perl to get the job done.
Sure, I don't need Yahoo's GUI to write this program. But it quickly highlights the things that are missing in traditional UNIX pipes. Splitting and union come to mind as perfect counterparts that don't exist because they'd be a parsing nightmare. Tee sounds useful at first, but it's nothing more than a debugging tool.
Technically, you don't lose any computational power under such a model: each processing node is free to be Turing complete; it's just that so long as they all terminate, so does the network of them.
posted by pwnguin at 10:43 PM on December 24, 2008 [1 favorite]
Shader programs can be expressed as DAGs. Still, people like loops.
posted by ryanrs at 11:06 PM on December 24, 2008
Was playing with Microsoft's Speech Server using the Workflow Foundation. Has elements of "visual" programming, in that you draw a flow-chart, but is not entirely so, in that you can double-click on each of the elements and add some additional code, like you would in a Windows Form, an ASP.NET page or a WPF/Silverlight page.
We're quite intimately in bed with Microsoft at my firm [1], and I can definitely say there's a movement _towards_ a more visual style of development; WPF/Silverlight's VisualStateManager is a case in point, as are ControlTemplates and Workflow Foundation. This, however, is most decidedly _not_ the future; in addition to the valid points others have raised on complexity, XAML is not Turing-complete as yet. I don't think that's the intent either; at least in the Microsoft world, we're all heavily entrenched in that model of "drawing" everything and then adding gnyaan (knowledge, in Sanskrit) to those boxes. (Again, this is with the assumption that everything in XAML is drawable in IDEs; not a justified assumption in many cases, it must be said.)
For your non-Turing-complete mashup needs, though, do consider Microsoft Popfly or Yahoo Pipes.
--
[1] - Not me in particular; I consider myself platform-agnostic, but with a definite preference for MS technologies these days. Been a Schemer and was active in the local LUG in a different era.
posted by the cydonian at 6:53 AM on December 25, 2008
Or, errr, what pwnguin said. As usual, I didn't read _all_ the comments before posting. Still, I think I should get points for mentioning the Microsoft-world perspective and PopFly. ;-)
posted by the cydonian at 6:57 AM on December 25, 2008
it seems there's a bit of bitterness that the command-line is being marginalized and left behind
I learned how to use cut instead of awk last week! So much simpler!
posted by GuyZero at 10:10 AM on December 25, 2008 [1 favorite]
the cydonian - Yeah, TBH with most of the Microsoft stuff you're going to want to find ways around doing things the visual way as well.
posted by Artw at 12:14 PM on December 25, 2008