
Whedonesque - a community weblog about Joss Whedon
"Plus, fire? Pretty."
11943 members | you are not logged in | 18 April 2014

August 03 2006

How to Totally Fake Being a Geek. Geekily thorough overview of all aspects of true geekhood, from math to Medievalism. Whedonesque creations are mentioned.

Pretend you like these: TV shows: Babylon V, Star Trek, Carnivale. When you meet a B5 fan, claim ST. Claim B5 to ST fans. Meet a fan of both, ask if they've seen Carnivale. No, Carnivale isn't true geek fare, but it's high-brow enough that you'll pass as one who has "burnt out" on the science fiction genre temporarily. Gain extra points by dropping "Buffy the Vampire Slayer" or "X-Files" into the conversation. Avoid "Sliders" and "Quantum Leap"; they died on cancellation. Don't even claim affection for "Firefly", because Firefly is so supercool, even its fans disown it for fear of being flamed by the other fans. It's like the name of a diety: never say it out loud.


Clearly, nerds are still in.

That's some great stuff :) I was going to make specific points, but I figured I come off geeky enough moderating a Joss-site. What does everyone think about the AMD-ATi merger? What will Intel and nVidia's reactions be? Inquiring minds want to know...
The AMD-ATi merger is a fascinating development, creating a massive producer of PC components. It makes sense, as AMD has an excellent reputation among gamers, and ATI has been neck and neck with nVidia for most of the past five years. Even if Intel and nVidia go the same way, Intel's appeal is more generic and would lead to two distinct entities: One focusing on gamers, one on the more general consumer.

So, do I qualify?

A fun read, and spot on in a lot of places. Though I never knew that Firefly was a diet. Something to do with mystery meals perhaps?
LMAO... that's hilarious (and true). Except I'll always proclaim my love of Firefly - I've been there from the beginning and followed the online communities almost daily since then, so I need not fear being flamed ;-) But of course, I'm already a geek; if someone tried to fake it and claimed Firefly, they'd better know what a Browncoat is.
"Don't even claim affection for "Firefly", because Firefly is so supercool, even its fans disown it for fear of being flamed by the other fans. It's like the name of a diety: never say it out loud."
I've never seen that, but then again, I was a fan pre-Serenity.
Yeah, I've never seen that either. Browncoats are one of the most welcoming groups in fandom - we are actively trying to recruit more fans! I'm not sure where that "disown for fear of flaming" comes from.

And I've been a fan pre-DVDs. ;-) Back when it was actually on TV... wow...

More Geek Cred faking: Mention the number 42 as the answer to any question you are asked. Memorize a handful of Monty Python quotes (bonus for entire sketches) and repeat them randomly. And when it comes to anime, mention watching it subtitled and not dubbed - and never NEVER mention Dragonball.

[ edited by AnotherFireflyfan on 2006-08-03 18:15 ]
disown for fear of flaming

I think what they mean by that is fear of being flamed by fans of other shows. Depends what sites you frequent. There are places out there that made "Browncoat" a dirty word.
Am I the only one who saw the bit about esoteric hobbies and immediately thought, "I'm pretty sure cephalopods wouldn't leave footprints..."
delirium: Good point. I just avoid those places, they're full of trolls and overly ignorant jerks. But true fans... *true* fans and geeks don't flame.

But really... dirty word? Is it our enthusiasm that bothers people?
Err, guys ? I think he/she was kidding about the Firefly flame thing ;).

I think the AMD-ATi merger could be great. Not a chip-head but the sort of design stuff GPUs use for concurrent pipelining etc. sounds like it'd be pretty handy for a CPU manufacturer plus it'll give AMD more inroads on chipsets so that they have more control of the complete platform. And AMD have a reasonable history of openness which might mean ATi finally opening their drivers so that the Linux ATi drivers may actually not suck.

Basically it makes for stronger competition (especially at a time when Intel's also actually making some cool products) and stronger competition makes for all that great consumer friendly stuff like price-wars and added value packages (OK, great in the short term - let's just stick our heads in the sand about * fingers in ears La, la, la * unsustainably low chip prices and * la, la, la I can't hear you * massive corporate debt ;).

Funny article but i've got to take issue with two things in particular. 1) there's just no way, in any world i'm familiar with, that Java is cooler than C. Nuh-uh. It's like C with stabilisers. Maybe i'm getting old but an elegant piece of C code still takes just about anything else out the back for a good shoeing and 2) 'Wargames' is a genuine geek classic and David Lightman is still one of the best representations on film of a geek so to even mention it in the same breath as 'Hackers' or 'Short Circuit' is just ... not even wrong.

Yet more bluffer's guide suggestions:

Refer whenever possible to celebrities (especially those with some geek cred) by their name's TLA (Three Letter Abbreviation) e.g. JMS, SMG, DNA, RMS

As an alternative to '42' use 'a suffusion of yellow' as an answer to questions (especially those totalling more than 4). Still Douglas Adams (or DNA ;) just 'Dirk Gently's' instead of H2G2.

Use /. jokes in real life where absolutely no-one else will get them. e.g. to a Fox exec: In Soviet Russia, Firefly cancels YOU.

Be seen reading anything by Donald Knuth. e.g. his laundry list. Don't worry, no need to understand it, just hold the book without falling asleep. Geek cred will actually start to ooze out of your orifices you'll have so much ;).

[ edited by Saje on 2006-08-03 18:35 ]
Saje - many great points :) Not the least of which being ATi Linux drivers finally not sucking (not holding my breath, but then I'm running SLI, so what do I care ;)). ATi really needs to work on their chipsets, their xFire stuff has, to put it politely and succinctly, been un-good (up til the newest one, anyway - p.s. - a dongle and separate master/slave cards? paging ATi, the early 90's/TDFX are calling and they want their technology back ;)).

You are absolutely correct re: C vs. Java - and the first person to mention C# gets slapped with any available lunch meats. Don't get me started on Wargames vs Hackers or Short Circuit (cringe).

But, hey they managed to mention Ubuntu on Veronica Mars, so...

re: TLAs, use them wisely, use them often, whether with hardware (Did you hear nVidia is launching a new IGP as the successor to the 6200 series?), celebrities/people of note (Do you like RMS or ESR better?), or just MOU (make one up).

re: 42/suffusion of yellow, etc. 'A fine red mist' is always a great reference as well, though not Douglas Adams. You could always mention 'Shada' from when Douglas was script editor for Doctor Who, however.

re: /. jokes -- Natalie Portman and hot grits? /waits for a domo-kun to attack (we're heading to FARK territory).

Read: William Gibson, Neal Stephenson, Neil Gaiman, etc. etc. etc.
AHHH! Let's not rehash the great faux-geek movie-slap fight of the '80s and mid-'90s. This was a wonderful read, and I kept nodding and going "yeah, that's true."
ETA: WOW I can't believe I wrote this post, feel free to ignore it. I can take software development a little too far, and sometimes, people get hurt.

Java may not be "cooler" than C, but I wouldn't want to write a web app in C, nor would I want to write a graphical application in C. With features like namespaces, runtime reflection, and built-in security it's way more usable than C for many types of applications. Now, if you're Alan Cox (he's the one in the middle) then yeah, you use C.

As far as C# goes, it has some interesting language features, and some cool foundation classes (I think that's what MS calls their class library), but I wouldn't use it unless I could run my app in Mono as well as the MS runtime.

[ edited by Caleb on 2006-08-03 19:24 ]
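The runtime reflection Caleb mentions can be sketched in a few lines of Python, which supports the same idea: inspecting an object and invoking a member by name at run time. (The class here is hypothetical, purely for illustration.)

```python
# A minimal sketch of runtime reflection, Python-style.
class Slayer:
    def __init__(self, name):
        self.name = name

    def patrol(self):
        return f"{self.name} is patrolling"

buffy = Slayer("Buffy")

# Inspect the object at run time: list its public attributes...
methods = [m for m in dir(buffy) if not m.startswith("_")]

# ...and invoke one by name, chosen dynamically rather than written in the code.
result = getattr(buffy, "patrol")()

print(methods)   # ['name', 'patrol']
print(result)    # Buffy is patrolling
```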
Re: zeitgeist: I see you are skilled in the ways of slashdot. Those were the good old days... actually, now that I think on it, they kinda sucked. My favorite one from that era is "Is this freshmeat? This isn't news!"
OK, I remember where Knuth said that computer-controlled traffic lights would never happen, and I used to be an Assembler guru (C is for wimps). I seem to have the old IBM jargon dictionary burned into my brane. I go back beyond Monty Python, all the way to Firesign Theatre ("You've got the wrong number. I spell my name DANGER!") I was a fan of "The Man From U.N.C.L.E."

But..... I prefer Windows to Linux! My geek cred is gone forever.....
the first person to mention C# gets slapped with any available lunch meats

C# !!! Booga-booga!
Seriously, I just got my certification (MCSD) in Microsoft development. C# is one very cool language. Gotta stick up for it!
Interestingly, the schools have switched from C++ to Java one year after I took the AP Computer Science class. I personally prefer C++ due to being more comfortable with it, but you gotta love the portability of Java byte-code.

I'm not entirely sure why Java is seen as uncool - the CS students and teachers I've encountered all lavish praise upon it.

Other things to do: use acronyms developed for internet chat in verbal conversations. E.g. say LOL instead of laughing, or OMG, WTF, etc.

As for ways to lose all geek-cred: Refer to the Internet as a series of tubes

[ edited by AnotherFireflyfan on 2006-08-03 20:04 ]
OK, I remember where Knuth said that computer-controlled traffic lights would never happen...


Wow, I never knew that. Do you have a reference for that? I did a google for it but couldn't find anything.
I quote slashdot sigs all the time. My current favorite is "heck is for people who don't believe in gosh".
Java portability: write once, debug everywhere! BTW, C# uses the same concept (called IL) on which .NET does a just-in-time compile to machine code at run-time.

As for Knuth and the traffic light thing, it's somewhere in one of the volumes of "The Art of Computer Programming." Knuth used it as a programming example (maybe in discussing state machines? it's been a while since I read it). His comment was some sort of throw-away like "ignore the fact that it would never be practical." The books have been revised at least once, so it may even have been deleted in a revised edition.
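For what it's worth, the kind of textbook traffic-light state machine being half-remembered here might look like this in Python. (A toy sketch only, not Knuth's actual example.)

```python
# A toy traffic-light controller as a state machine: each state maps to
# exactly one successor, so the light cycles through a fixed sequence.
TRANSITIONS = {"green": "yellow", "yellow": "red", "red": "green"}

def step(state):
    """Advance the light one tick through its fixed cycle."""
    return TRANSITIONS[state]

state = "red"
cycle = []
for _ in range(3):
    state = step(state)
    cycle.append(state)

print(cycle)  # ['green', 'yellow', 'red']
```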
Oh, MKM I was all ready with a big heap of "we're not worthy"s and then you blew it

I prefer Windows to Linux!

But, but ... why ? ;)

the schools have switched from C++ to Java

AnotherFireflyFan, makes sense since Java's easier to learn and a bit cleaner, IMO. To be honest, it's not that it's not cool (only COBOL is genuinely uncool ;), it's just not as cool. Early implementations were way too slow (though JIT/dynamic compilation has helped a lot there) and VMs were too buggy and to some extent the language is still trying to live that down. I've seen complaints that it teaches bad habits when moving (back ?) to languages without garbage collection but all languages have gotchas so i'm not sure that's really valid (same with primitive objects or lack thereof - if you really need a fully OO language then don't use Java but it's not a language killing issue, IMO).

(and the portability idea is great on paper, maybe not so much in the real world where it doesn't - or didn't - always work as well as it should, as MissKittysMom says)

Not really looked at C# though a mate thinks it equals sliced white bread on the gifts-to-humankind scale. Still, given their past record it's a slight worry that it may be yet another MS ploy to enable totally portable development across every single ... version of Windows (i.e. open standard language and they opened the CLI but AFAIK the base class library still seems to be on shakier legal ground). Fair play though, they're not lawyer-slapping any of the open source implementations of .NET so that bodes well.

That said, I mostly use toy scripting languages now anyway (been using Perl, started learning that other language after going back to a hastily slapped together script 6 months later and thinking 'WTF ?' one too many times) so YMMV and WTF do I know ;).

Fair point Caleb, pick the right language for the application. C for speed (of execution, certainly not development ;), Perl for quick and dirty text munging, Python for quick and dirty text munging that'll make sense in 6 months, Java for learning programming, Ada for missile guidance, COBOL for, err, umm, ASCII art on two-tone printer paper ? ;)

zeitgeist, You must be new here (are the hot grits naked and petrified ? It's the only way to eat them apparently ;).
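As a rough illustration of the "quick and dirty text munging that'll make sense in 6 months" point above, a Python word-count one-off might look like this (sample text invented for the sketch):

```python
# Quick-and-dirty text munging: count word frequencies in a string.
from collections import Counter

text = "can't stop the signal can't stop the signal mal"
words = text.lower().split()
counts = Counter(words)

print(counts.most_common(2))  # [("can't", 2), ('stop', 2)]
```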
"Open the pod bay doors, Hal"
"I'm sorry, Dave. I can't do that."
"What's the problem, Hal."

Sorry, couldn't resist ;)
I think the Firefly thing must have been a joke. From my experience, the Browncoat community has been extremely passionate and inclusive right from the start of Firefly, and eagerly anticipated and supported Serenity's release. Everyone is pretty friendly and doesn't seem to care about whether being a Browncoat is "fashionable" or not; they just love Firefly and Serenity and want to share that with other fans.

Plus, I don't think the Browncoat movement is as recognised as anything like Star Wars or Star Trek, and is probably not as well known by the general public as Buffy or Angel.

Although I have done my bit to try and spread the word and convert fans, I suspect if you were to randomly sample a set number of people then very few would have even heard of Firefly or Serenity (at least that they can remember). It seems to be mainly Browncoats, or anyone who is very interested in TV shows and movies, who actually know about it.

And although I wish it had reached a wider audience in order to ensure further adventures in the 'verse, I kind of like the fact that it's quite underground. Despite the large Internet presence, if you were to walk around your town wearing a Blue Sun t-shirt or something, I think very few people might know what it was. But when you do get someone who knows about it, then it's a really cool feeling, like you instantly have a connection through Firefly.
Wow...I don't think I've ever been so turned on by words I don't understand. All this programmer speak is so hot. I love smart people.
Ada for missile guidance

Probably the most frightening mis-use of a language I've ever heard of was using Smalltalk for embedded missile guidance. For those who are not Smalltalk-literate, it's a completely OO language that is also completely typeless with run-time binding. Whether or not a method will execute is completely unknowable until you try it.

C# looks a lot like Java but is fully OO. Single inheritance with interfaces; garbage collection; exception handling; namespaces; reflection; etc. It supports "generics" which are somewhat like C++ templates. C# is open to allow interaction with "non-managed" code (pointers, COM, all the old-world stuff). The security model is radically different from Java, and is based on the .NET security framework which incorporates authentication, authorization, and fairly fine-grained privilege constraints that can be based on either authorization or code modules.

[ edited by MissKittysMom on 2006-08-03 21:51 ]
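MissKittysMom's point about run-time binding — that whether a method will execute is unknowable until you try it — can be mimicked in Python, which binds the same way. (Hypothetical class, purely for illustration.)

```python
# Run-time binding: the call only fails when the message is actually sent.
class Missile:
    def launch(self):
        return "launched"
    # note: no abort() method implemented yet...

m = Missile()
print(m.launch())  # launched

try:
    m.abort()      # unknowable until you try it
except AttributeError as e:
    print("method not implemented:", e)
```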
Let me be the first to say: huh?
Let me be the first to say: huh?

Ummmm, did we wander off-topic?
Let's see. Geeks, computers, nerds, computers, missile guidance..... I think the boss lady is right once again.

Thanks, Caroline, for keeping us honest.
By the way, the reason Ada is used for missile guidance is because it guarantees execution in the proper time slices. It's the only language the DoD certifies where everything runs in the proper order. I'm not explaining that well, but the point is that if you tell Java to do a few things in a specific order, you never know when the CPU will be busy elsewhere or when the garbage collector will kick in. This is fine for word processing, bad for nukes.
Yep, as well as type-safety, some program correctness checking stuff and I think you can throw an exception if unallocated memory is accessed (and so catch buffer-overflows at run-time). Been a long time since I used it though so details are very sketchy (did it at uni then learned C the next term which may be one reason I liked C so much - it was like being released from prison after Ada ;).

It might be a B&D style language but that's exactly what you want when moving missiles around the sky (unlike Smalltalk's potential 'Avoid school: Method not implemented yet, sorry' approach as scarily outlined by MissKittysMom).

That said, maybe you want unreliable missiles if they've been fired at Serenity. Wait a minute, is that a topic I see before me ? (and a smoother segue never was seen. Fact ;).
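The run-time bounds checking Saje describes for Ada behaves much like this Python sketch, where indexing past the end of a list raises an exception you can trap, rather than silently reading garbage as raw C would:

```python
# Out-of-bounds access is caught at run time (values invented for the sketch).
guidance = [1.0, 2.0, 3.0]
caught = False

try:
    reading = guidance[10]   # index past the end of the array
except IndexError:
    caught = True            # trapped at run time, not silently misread

print(caught)  # True
```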
Saje, we understand, but please don't bring it in this room. Not in this way.
Not quite sure I understand you Madhatter. If it's a moderation issue then I don't think Caroline was actually marking us Off-topic more like signifying amused bafflement (along with some Buffy quotey goodness ;) but either way i'll leave it to her to let me/us know and then respect her decision as and when.

(unless you're just really averse to buffer-overflows in which case, sorry dude, because they're everywhere ;)
Wow, there's a lot of programmer types at whedonesque! We should have a mailing list or something :)

BTW Python rules! Just had to say that.
Here are some fine geeks and nerds : here, here, here, and, of course, here.

There are even cruises for geeks, and they must eat specially-prepared food.

Those interested (nerds) may take scientifically-designed examinations to establish their nerd-creds, and determine geek levels.

Just the other day, I was talking with my partner about nerds vs. geeks vs. dorks vs. melvins, etc., and said that, you know, I was clearly a geek, but not a true nerd, and not a dork, I thought, but a little bit of a melvin, and he said, "You realize, don't you, that just by bringing this up, you are all of them?"

God, he's such a dweeb.

"I'm a science fiction geek from birth -- that's just who I am." -- Joss,
60 Second Interview by Andrew Williams - March 2, 2006
I wasn't moderating. I was just going 'huh, whatchall talkin' bout?'. I couldn't program my way out of a hat. I can fake my way through xhtml and css, though.
Oh come on, someone obviously has to step up for Java. It's fun to write, looks understandable, and I can't be the only one to prefer nice iterators and collections over writing your own handy dynamically allocated linked lists ;-) Ok, I think I just dated myself re: C. C++ is ok-ish, too, but the syntax is strange and off-putting.

...presses DEL to deliver.
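The trade-off being joked about above — hand-rolling a linked list versus just using the built-in collections and iterator protocol — looks roughly like this in Python (illustrative sketch only):

```python
# The DIY version: a singly linked list with a manual iterator...
class Node:
    def __init__(self, value, nxt=None):
        self.value = value
        self.next = nxt

class LinkedList:
    def __init__(self, values):
        self.head = None
        for v in reversed(values):          # prepend so order is preserved
            self.head = Node(v, self.head)

    def __iter__(self):
        node = self.head
        while node:
            yield node.value
            node = node.next

ll = LinkedList([1, 2, 3])
print(list(ll))          # [1, 2, 3]

# ...versus the built-in collection, which iterates for free.
print(list([1, 2, 3]))   # [1, 2, 3]
```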
...don't even get me started on how inaccurately computers and programming were represented on Buffy. Especially that scene, while amusing. (Pressing del would have simply deleted a character after the cursor. And I don't think Cordy and Harmony would take a programming class, let alone be able to complete a program just by taking a long time on it and trying really hard.) The computer as represented by the show was more magic than any of the actual magic they did. Willow typey-typey and hacks right into whatever information they need. (Then again, it could also be argued that they browse through books and always manage to find the exact demon they're looking for, too. They're really just devices to further the plot along).

Anyone notice how I pushed this on topic? I should get a cookie or something. ;-)

[ edited by AnotherFireflyfan on 2006-08-04 13:00 ]
Anyone notice how I pushed this on topic? I should get a cookie or something. ;-)

Do you have third-party cookies enabled?
Personally I'm upset I get no cred for knowing Basic. Would it help if I said I learned to programme in Basic in the 70's?
But it is ok. For some reason, I've got mediaevalism going for me. And of course, Buffy.
Hey, Lioness, Visual Basic .NET is fully equivalent to C# and Java. It's just slightly more verbose about doing it.
Right there with ya, AnotherFireflyFan! However, some suspension of disbelief is possible in Buffy, since I can easily buy Willow as super-geek computer guru (and what a shame that she "forgot" most of that nerdiness in later years).

But what about Cordelia in Angel --- she was able to find anything - and I mean anything - from the net. It was just leetle bit too convenient. Oh --- she must have learned all that hacking from Willow's computer classes! It all makes sense now.
Or maybe Googling for shoe bargains ? ;)

I haven't seen many shows/films that have realistic computer scenes, from super-fast 'any key will do' typing to massive fonts that only let you fit about 10 words on screen at once to 'hackers' that can access any system by typing stuff like 'Access system: 001', almost nobody gets it right (or even tries to because, in fairness, programming ? Not that cinematic). For sanity's sake it's best to let it go ;).

(kudos to aforementioned 'Wargames', 'Sneakers' and 'Matrix Reloaded' for at least trying - OK, the Matrix exploit wasn't real but the vulnerability was, albeit pretty old and I use nmap all the time so I too could be Trinity if I simply started wearing PVC cat-suits. And became a woman ;)
I've always thought the funniest cinematic computer inaccuracy is that they always seem to have a GUI to represent how far a 'virus' has penetrated a computer system. (Granted, it adds tension and allows viewers to visualize... but has anyone really thought about how ridiculous this is? "Oh no! Another Box turned red! Type FASTER!!")

LOL... yeah, how'd Cordy get so good at the Googling? I still have trouble finding things sometimes!

Personally, I really love TI-Basic. (That's programming on TI-83 graphing calculators). Sure, it's a poorly conceived language with major limitations (all variables are global, for instance. Oh, and you only get about 25K on the calculator to work with.) - but in many ways the limitations add to the challenge. Really, I just like being able to program when bored in classes. ;-)

Basic is a good starting language. I learned programming in QBasic (and later VB). But once you learn C++ you never want to go back (unless you want to implement a GUI in Windows, then VB is simplest).

There really needs to be a movie for programming geeks - cause I actually think it IS exciting stuff (despite being 'not cinematic'). I guess you need to be able to follow it better than the average layman.

Edit: HA! I claim comment number 42 in the name of all things geeky. ;-)

[ edited by AnotherFireflyfan on 2006-08-04 15:51 ]
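The "all variables are global" limitation mentioned above has a classic failure mode: two routines silently clobbering the same name. Here it is reproduced deliberately in Python (a contrived sketch; Python normally scopes loop variables to the function):

```python
# TI-Basic-style all-global variables: the inner loop stomps on the
# outer loop's counter because they share one name.
I = 0  # the shared "global" counter

def inner_loop():
    global I
    for I in range(10):   # reuses the same global I
        pass

def outer_loop():
    global I
    for I in range(3):
        inner_loop()      # oops: overwrites I behind our back

outer_loop()
print(I)  # 9, not 2 -- the inner loop's last value won
```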
I think I remember Willow tracing the location of where the gnome cam was sending its signal to; the computer skills shown were extremely unusual. It looked more like a computer game, where if she pressed a button at the right time as a big line moved across the screen then she was able to do it.

Personally I didn't really have a problem with Willow's computer skills as described, because it is conceivable that someone could manage to do most of the things she did (although I doubt all of the information she accessed would be hanging about on the web- like schematics of the sewer system). But I just think that whenever they had to show it visually then it usually didn't work as well. I could tell that some of it was rubbish even though I'm not the most computer literate person out there. However it did work fine as a plot device.

And regarding the research the Scooby Gang did, I again found it fairly believable. Giles obviously would have had a good idea of what sort of stuff was in each book, so if for example they had a case of people being burnt to death, then Giles could narrow it down to certain types of demon or ghost or whatever, so then they would just have to go through the relevant books. I suppose he could have organised them by category as well. And it did seem like research often took up a lot of time for the characters, even with a team of them working. I think they would all have probably gotten more familiar with the books and how to use them. And of course there were a number of times where they got the wrong explanation or just couldn't find what they were looking for- like in "Smashed". Although it does occur to me now when I think of random episodes how frequently they managed to obtain vital information from books.
Damn, 42nd comment AnotherFireflyFan ? Well, at least I posted comment number 'a suffusion of yellow' ;).


"The Code Supremacy"

INT. Fluorescent lit cubicle farm. PAN TO

One particular cubicle. Littered about the desk are toys, jars of play-doh and a collection of old calculators. A badly singed George Lucas doll sits to the side, numerous pins sticking out of it.

Sally Love Interest: What's Sheila Bad Programmer up to ?

Dirk Good Programmer: Wait a second, OMG, she's only trying to index past the end of the array FFS. The fool, doesn't she know we go gold in 18 hours ? Hang on let me just ... yep, edit the source ... commit changes ... OK, i'm compiling ... linking ... c'mon, c'mon, Dammit, why wouldn't my tough but fair boss requisition a faster machine ? OK, done. Now if I can just execute the code in time ...

Sally Love Interest: In time for what Dirk ? Since we have 18 hours there's actually no urgency at all is there ?

Dirk Good Programmer: Oh Sally, bless your blinkered view of reality, don't you see ? It's the 200th episode of Stargate: SG-1 at 8 sharp. Jack O'Neill's back so I must see it live.

Sally Love Interest: You know it's recorded right ?

Dirk Good Programmer: err, ... it's live when they record it though, right ? Anyway, none of this is getting that code run, stand back Sally...

SLOW MOTION shot of hand reaching for 'Enter' key, maybe ramp the music up a bit here, it is the big hero moment after all. Finger moves down and ... PRESS !

CLOSE ON: blinking cursor by a normal, blessedly non-apocalypsy command prompt. Not a single error has been reported.

MUSIC REACHES TRIUMPHANT CRESCENDO.

Sally Love Interest: Oh Dirk, you code the best !

Dirk Good Programmer: We all have our gifts Sally. Here let me take off those glasses ... and let your hair down ... My God Sally, you're beautiful.

Sally and Dirk gaze into each other's eyes finally clinching in a long passionate kiss.

Credits.


Yep, i'm convinced. There should never ever be a film made about programming (now we know why there's loads of programming books but not that many programming videos ;).

(I could live with the library research because we were often shown how long and boring it was, flicking through a bunch of books to find the right reference, the computer stuff on the other hand - even fairly hardcore hacking - seemed to take approx. 2 seconds most of the time. Still, artistic licence and all that)
Saje - did you write that, or is it from somewhere? Either way LMAO! And why is it that in cinema geeky girls need to "take off their glasses and let their hair down" to become sexy? *rolls eyes*

I'm sure there are some fascinating true programming stories, though. Maybe not so cinematic. But interesting nonetheless. I just wish there was something that could give the general public a better idea of what programming actually is (i.e. not a bunch of flashing colors). Kinda like how Done The Impossible gives a good idea of what being a Browncoat is. ;-) [yay... topical]

The way computers are treated on Buffy/Angel doesn't really ruin the experience for me; the writing is so good you can look past the creative liberties they take. Well... except for I Robot You Jane. That one was just plain bad - the plot relied too heavily on false notions of how computers/the internet work.
No, I just made it up so the somewhere it's from is the deep dark recesses of my mind. Things were kind of quiet at work this afternoon (does it show ? ;).

I do know what you mean but I think it'd be pretty hard to capture the thrill of solving a problem or producing a particularly elegant approach to something on screen since it's largely intellectual. Hell, even bug hunting can be cool from a puzzle solving perspective (especially when you finally track the little sod down ;) but it's hard to convey to non-programmers.

And I also didn't really have issues with the technical details in Buffy since the show in no way hinged on it. Character and emotional resonance were always much more important to the creators and we're all better off as a result.

(as you say, if everything else is good enough then small slips are fine - especially when sometimes the 'slips' are deliberate choices for good cinematic reasons - and, IMO, if the technical issues are really obvious to the viewer then there are deeper problems with the show anyway since a good narrative should make you willing - even eager - to suspend disbelief)
Wow. Yeah, you really do have too much time on your hands. ;-) But I think what you wrote is too funny to be lost forever in a Whedonesque comments thread - perhaps you should consider saving it or posting it somewhere?

If anyone ever figures out a way to get the non-programming-inclined people to understand even things like what programs are it would be an amazing breakthrough. My grandma thinks that everything is "Microsoft" or "eBay" (even when referring to completely different programs) and I can't even get her to understand what a word processor is, let alone how programs work. :-S

To me programming is simple logic, that's what's so great about it - it makes sense. Moreso than life does, because programs actually follow logic rules where life diverges all over the place. (hmm... I wonder if that has something to do with why I love programming so much...)
Saje, that scene absolutely needs to be posted elsewhere and continued on a serial basis, perhaps on Flickr or .org? I can't wait to hear more about the adventures of Dirk Good Programmer and Sally Love Interest. Is Sheila Bad Programmer going to be a recurring villain or is there a new one every week? Inquiring minds want to know!

Thanks for all those links, quotergal. I loved the cat and dog hierarchy article - it even made sense!

I am in awe of the knowledge on this board. After reading the original article about how to 'fake' being a geek, I thought I really didn't need to fake it, but now I'm reconsidering. My knowledge of programming is very basic (yes, that was on purpose), but I am actually interested in hearing more, even though I was with Caroline on the 'huh?'
Thank you Saje :D Loved it!

This discussion reminded me of this ye olde site from the days the internet was young...
Computer cliches.

If writing/acting is good and there are no huge gaps in the story, I don't mind the computer related stupidities in film and tv. Occasionally I smirk. Sometimes I cough (like couple of days ago when I saw in an episode of Prison Break how a Bad Guy traced the location of the Good Guys from instant messaging with handy graphical tool).

If there are no redeeming qualities, I can be merciless. Re: Alias!
You are totally welcome, samatwitch -- I loved the cat-and-dog theory, too (reminded me a little bit of Isaiah Berlin's Fox and Hedgehog metaphor), and I made sure that there were some Joss-y surprises in some of the links for those that would read them.

"The other interesting thing is that Joss Whedon, either by accident or design, managed to stuff up every single relationship in the first few seasons. Buffy and Angel are cat/cat, Xander and Cordelia are cat/cat, Willow and Oz are dog/dog. The resultant painful reshuffling made the series rather interesting. He even needed to invent a new 'vampire with a soul' of the correct type, Spike.

Note also that this explains why Buffy gets on well with Giles but not with Wesley. Faith however finds Giles's leadership style unsatisfying.

As far as I can tell, in both the Buffy and Angel series, the cat/dog relationship rules allow you to predict the outcome of any given relationship perfectly." -- from the geekish logarithmic's "Cat and Dog Hierarchy" theory.

Whoops, out of the loop in a weekend stylee (which means this may never see the light of day).

I find the dog/cat stuff pretty interesting and it also has the ring of truth though I think almost everyone plays both roles depending on the situation (i'm also not sure i'd peg Xander as a cat since he doesn't seem to propose many solutions but then he's not really the analytical, evaluating dog type either).

As for Dirk and Sally, despite a solid critical reception lukewarm box-office has jeopardised their sequel chances ;-).

(glad if it provided a few chuckles though)

[ edited by Saje on 2006-08-06 23:15 ]
