Stuff that's interesting.
Topics are drawn from things I encounter in my life, so expect Software Engineering, interview skills, gender stuff, and nifty doodads.
I'm gradually moving over to the Quora platform. You can find me at http://jenee.quora.com/!
Thoughts and Things
I REALLY liked the parallel the movie drew between kids with autism struggling with sensory overload and young Superman wrestling with his powers manifesting. I'm no expert, but I'd like to think that kids with autism who see the movie will identify with Superman. Whether or not that is the case, I think it might work to de-stigmatize the symptoms of autism for the rest of the neuro-typical world, by giving us a positive context in which to understand them.
Lois Lane (Amy Adams) was particularly well done, and is probably my favorite incarnation of her. I actually believed that she was a Pulitzer-winning journalist!
That being said, despite the pleasant surprise of a competent and interesting Lois Lane, Zack Snyder is still not particularly good at writing female characters with any more depth than a 13-year-old boy's fantasy girl, as evidenced by, oh, every single other female character in the movie (other than Martha Kent, a tough-if-one-note Kansas Mom).
As a point of interest, this movie does not pass the Bechdel test, which asks "Does this movie feature two female characters, who talk to each other at least once, about something other than a man?". The closest we came to a conversation between women about something other than a man was when the female Kryptonian with the terrible dialogue told Lois Lane to put on a breathing mask, but for the life of me, I can't remember if Lois said anything in response. Regardless, even if it passes the letter of the test, it most definitely doesn't pass the spirit.
With this in mind, I would like to kindly request that Zack Snyder stay the hell away from Supergirl or Wonder Woman (and Batgirl, while we're at it). I don't want to know what his versions of those characters look like. I don't think I'd be particularly happy with the result.
I'm fine with Zack Snyder's Superman, though. His Superman may not go through a particularly dynamic character arc, but he has more depth than any other incarnation of Superman that I've seen. Whether that's due to the acting of Henry Cavill (who is WAY overdue for a starring role!), or to the script, I can't quite say. But I like him!
The most surprising performance was Russell Crowe's Jor-El. It was a meaty role, and not in a bad way. I actually enjoyed watching all his Superman-less scenes! They really fleshed out the Krypton Lore, even if there were a couple of poorly explained points (or, as some of my friends deemed them, plot holes).
In conclusion: EVERYONE GO SEE THIS MOVIE SO THAT I CAN GET A JUSTICE LEAGUE MOVIE. Thanks!
"Source Code" is the second directorial offering from Duncan Jones*, whose freshman offering was the FANTASTIC "Moon."
It is a thoroughly enjoyable movie, and everyone I've talked to agrees with me on that.
However, they often add the caveat: "But the title makes no sense. It's technobabble. Just ignore it."
I'm here to change your world, ladies: The title DOES make sense. And I'm going to explain it to you.
(There WILL BE SPOILERS ahead. Actually, it's all pretty much spoilers. If you haven't seen the movie, but plan to, don't read this. Unless you're cool with spoilers, I guess.)
The Necessary Programming Vocab
For those of you not familiar with programming terminology, get ready for a crash course! When programmers create software, they do so by writing source code (lower-case). This source code is the set of programmer-readable instructions that says how to build the program, and what it should do. The source code is the recipe, or the blueprint, for the finished application.
When the source code is finished, the computer compiles the source code into something called a binary**. This binary is a machine-readable version of the human-readable instructions found in the original source code.
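If you'd like to see both halves of this in action, Python happens to let you do them by hand. A minimal sketch (the `greet` function and the "recipe" label are just made-up examples for illustration):

```python
# Human-readable instructions -- the source code -- held here as a string.
source = """
def greet(name):
    return "Hello, " + name + "!"
"""

# compile() turns the source into bytecode: the machine-readable "binary".
binary = compile(source, "<recipe>", "exec")

# Running the binary "bakes" the result. You can run it as many times
# as you like, and it always comes out the same.
namespace = {}
exec(binary, namespace)
print(namespace["greet"]("Lois"))  # Hello, Lois!
```

(Python's bytecode isn't a native-machine binary like you'd get from a C compiler, but the source-to-compiled-artifact relationship is the same.)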
To go back to our source-code-as-a-recipe metaphor: If the source code was the recipe, then the binary is the food that results when you follow the recipe. You can follow the recipe as many times as you like, and so you can have as much food as you'd like, but it's all the same.
If the recipe metaphor doesn't float your boat, the source-code-as-a-blueprint metaphor is just as, if not more, valid: given a blueprint, a set of contractors can build a house to specifications. In fact, they can build many houses, all the same, just with one blueprint.
Now, a programmer normally uses source code to create binaries. However, one can also reverse-engineer a binary to recover something like the underlying source code. This is generally a very difficult task, but it is possible. Think of it this way: a good cook can sometimes figure out a recipe to a food that they've eaten, though it may take some trial and error. And there is an entire profession whose job it is to create blueprints for already-existing buildings!
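Python even ships with a tool for this direction, the `dis` module, which recovers a readable instruction listing from compiled bytecode. It's a toy version of the reverse-engineering described above (real decompilation of native binaries is far harder), but it shows the idea:

```python
import dis

# Compile a one-line program into bytecode (the "binary")...
binary = compile("total = 6 * 7", "<recipe>", "exec")

# ...then reverse-engineer it. dis prints a human-readable listing of
# the instructions: not the original source, but enough to work out
# what the recipe was.
dis.dis(binary)
```

The exact listing varies between Python versions, but you'll see the store into `total` spelled out as an instruction.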
How these programming concepts apply to the movie
Okay. So. Here's how the movie "Source Code" actually fits its name.
The life you're living right now? The moment that you're in? That's the binary. You know what's happening, and you can interact with it, but you don't fully grasp the underlying mechanics and cause and effect of the world around you.
The movie asks us to accept the premise that the world around us is a binary, generated by some underlying "source code" that gives the world instructions on how the course of history should go.
The shadowy government agency in the film sort of figured this out. They figured out that they were living in a binary, and some crazy-smart guy figured out how to reverse engineer the "binary" of a moment in time to get the "source code" underneath.
They don't quite realize that what they've got is "source code". They think that they're just looking back in time--that the "source code" they found is no more than an exhaustive time capsule. Instead, the source code contains all the instructions required to put a new universe in motion. After all, to bake a batch of cookies from scratch, one must first create the universe.
Once they had the reverse-engineered source code, they presumably figured out some way to tweak the time parameter in the program, to rewind it a bit. Let's just say that the derived source code requires you to supply a "time" argument, and it can be whatever point in the past that you want it to be.
When a terrible attack occurs, the scientists change the "time" value in the source code, and then--they compile a new binary.
But, instead of just getting to relive the past, each time they create a new binary out of their source code (i.e. each time they send him back in time), they're creating an entirely new universe , parallel to the "prime" universe.
Mr Gyllenhaal figures this out, and that's why he goes into that last timeline to save people. So he can be a hero in one binary of the world, and live a normal life. He exists in that life independently of the "prime" universe, because the newly-compiled universes are entirely self-sustaining.
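The whole premise above can be sketched as a toy program. Everything here is made up for the metaphor (the `run_universe` function, the `history` timeline, the event names are all mine, not anything from the movie or a real system), but it captures the key point: each "compile and run" produces an independent universe, and intervening in one leaves the prime universe untouched.

```python
# Purely illustrative: the "source code" of a world, as a function that
# replays history from a chosen start time. Every call builds a fresh,
# independent "binary" -- a parallel universe.
def run_universe(start_time, intervene=False):
    history = {0: "morning commute", 1: "bomb found", 2: "explosion"}
    timeline = dict(history)  # a fresh copy: its own self-contained universe
    if intervene and start_time <= 1:
        timeline[2] = "disaster averted"
    return timeline

prime = run_universe(start_time=0)                    # the original run
rewind = run_universe(start_time=0, intervene=True)   # a new, parallel run

print(prime[2])   # explosion
print(rewind[2])  # disaster averted
print(prime[2])   # still "explosion" -- the prime universe is unchanged
```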
So. Now you know. NOW YOU UNDERSTAND. Now you can stop complaining about the title of the movie, okay? It's not techno-babble; it's a REALLY well-thought-out, technically sound fictional metaphor.
* Fun fact: Duncan Jones went by Zowie Bowie as a kid. Yes, he's David Bowie's son. But with a name like that, you'd change it too, if you wanted to be taken seriously as a director.
**The term "binary" has multiple meanings in computer science. In this particular case, we're talking about an executable, binary file. Binary is also a counting system, and the language that computers speak. If you want to learn more about those, check out the Wikipedia disambiguation page: http://en.wikipedia.org/wiki/Binary
Apparently, there's been an explosion of coverage on Women in Geekdom and Tech. Apparently, the whole wide world of tech is catching on to the idea that there are women in geeky professions and hobbies, and that those women are, in fact, not alright with having to hide their gender online in order to avoid sexist comments.
It's about time. We've been here a while, you know.
My first indication of this trend was my male friend sending me a link to an article about women in tech, with a comment to the tune of "Woah! I just read this article, and I didn't realize there were so many women who weren't cool with how women are treated in tech. Have you read it?" In fact, I had read it, several months before, but I was surprised that he would have stumbled across it, given what I knew of his typical news sources.
My second indication that this was happening was via the Quora question Women in Technology: Why have "women in technology" issues rapidly increased in exposure/publicity in the last 12-24 months?
My third indication that this was a larger issue that'd thoroughly penetrated outside my fem-tech bubble was this blog post by John Scalzi (!!!):
Reader Request Week 2013 #9: Women and Geekdom
I really, really like him as a writer, and now, I like him even more as a person. Yay John Scalzi!
I really liked this quote from the comments, by @improbablejoe:
One of the “less invalid” complaints I hear from male geeks is that women are ruining geek culture by expecting “special treatment.” They argue that they insult everyone using sexist slurs, and are generally abusive to one another, and now here are all these women who expect to be treated differently? Why should geek culture change for them? Which, I mean… it has a certain internal logic and lack of self-awareness that is sort of impressive.
I'm a huge proponent of user-driven design, but something occurred to me recently:
If you only make things that fit existing paradigms, you'll never make anything that's out of the box.
In usability, it's easy to fall into the trap of thinking that "make the computer work for the user" means "figure out what the user does, and mold the computer perfectly to their behavior."
Sure, it'd be less frustrating for the user to not have to change their habits. But if you don't make major improvements to their productivity, what motivation do they have to switch to your product? You're not going to make huge changes in the world if all you offer is incremental improvements on existing systems.
Take writing a novel: It used to be that you'd write it more-or-less linearly on a typewriter. Now, writers have the option of starting anywhere, going back and forth, inserting phrases, removing others. But I'd imagine that, to people who were used to typewriters, moving to computers would've been frustrating.
Forget novelists, even: a big use of typewriters was filling out pre-printed forms. With a computer, you don't have an actual piece of paper, so you can't line up the lines! That's a usability nightmare! Let's just forget about word processors, okay?
It's a delicate balancing act, trying to promote both innovation and usability. As engineers, it's our job to find the sweet spot.
You're not nearly as bad as people give you grief for being, and you're on my good side because you give me money for doing searches, but I'm rather disappointed in your dictionary. I mean, yes, "compile" means "to create something by compiling things," but that's kind of a silly definition, because it USES THE WORD YOU'RE TRYING TO DEFINE.
Like many people in the academic world, I like how LaTeX makes documents look professional and clean, but I hate all the hassle that goes into using it. I mean, dagnabbit, when I'm writing a paper, I do not want to have to compile* it. In fact, I do not want there to be any possibility of syntax errors other than the kind that occur in the English language! I'm a fan of WYSIWYG: What You See Is What You Get.
I've long been a fan of the online LaTeX editor Scribtex, because it takes the double-compilation headache out of the equation, and allows you to work with collaborators. My only major complaint is that there are no line numbers on the source documents, and the error messages reference line numbers. But I liked it enough to PAY to use it to compile my thesis.
The people who built Scribtex recently built a new platform, ShareLaTeX, and I love it even more than I love Scribtex. It has side-by-side comparison, line numbers, and is overall a much better platform! I'm moving my thesis onto it today, so I can debug it more easily.
My only complaint about ShareLaTeX is that, unlike Scribtex, it doesn't have version control on the free version. You have to pay either $8/month as a student, or $15 as a regular collaborator. But, since I was paying for Scribtex anyway, it's not a huge deal-breaker for me. Plus, they let you export your project as a zip file at any time, so it's not like you couldn't do your own version control, though it'd be a bit clunky. For free though, I can deal with clunky.
Time will tell if I stick with it, but it's pretty sweet.
* To clarify for those not steeped in computer culture, I mean compile as in "translate into computer language", not as in "gather materials".
I get a bit miffed when people get mad at me for waiting at crosswalks when it's "clear", but the DON'T WALK light is on. I mean, is it really going to kill you to follow directions? I hope you follow lights while you're in your car--why wouldn't you do the same while walking? Grrrr...
So today, when I happened upon this article on Digg, I was a bit frustrated. It made vast over-generalizations about people crossing the street, shoehorning said street-crossers into pre-set molds, the way a rookie pastor makes over-simplified analogies during a sermon.
So, your "bold person" just walks into traffic, expecting people and cars to kowtow to them? And that's *good*? Uh huh. Sure.
So, the "followers" wait until lots of people are disobeying the DO NOT WALK sign, and then follow people into the street? They're not "doing it because everyone's doing it." They're doing it because the odds of them getting hit when there's a whole crowd of people are a lot smaller, and they are making a risk assessment.
And the hate for people who follow street signs just baffles me. If the signs are there, someone did some sort of transportation study that helped with traffic flow, and I'll get across eventually without messing that up. What if a car turns a corner, sees their green light, and guns it? THEN WHAT ARE YOU GOING TO DO, YOU MEASURED-RISK TAKER!
...I know it's difficult to be an excessively patient person, but DANG. If a three-year-old can wait until the sign says WALK, so can you. Are you less patient than a three-year-old?
Curious cat that I am, I decided to take a look at the new "Facebook Gifts" offering. To my surprise, I ended up really liking the platform.
There were a few pleasant surprises (and inspired design choices) that really cemented my positive feelings:
On the buyer side:
1) Diverse selection: There are a LOT of gifts here. I could find something for everyone--my parents, my hipster sister, my boyfriend, my old boss, even my outdoorsy friend! They mix in traditional gifts (box of chocolates, flowers, watches) with "fun" gifts (like a Cat-Scratching toy that looks like a DJ turntable). Gift cards; Custom and novelty candies; wine; knick-knacks; makeup; flowers; stuffed animals; pet toys; spa days; gardening detritus; tea; cookies; magazine subscriptions, OH MY!
2) Reasonable prices, at multiple price points. Perhaps the pricing is just low to generate interest, but wow, these are some pretty nicely-priced gifts! At worst, they're what I'd expect to pay at retail (like $30 for a mechanical citrus juicer or $11 for a mullet-wig headband), and at best, they're way less than I'd expect to pay (like $10 for a Brookstone multitool or $7 for a mini birthday bundt cake)! I ended up going for the smiley face cookie from Cheryl's Cookies. You pay $5, free shipping, and your friend gets a nifty cookie and a $5 gift card for more cookies. Miracle of miracles, it's even a USABLE giftcard, since there are things you can get on the Cheryl's cookies website for just $5.
3) Awesome e-cards: Okay, so they're pretty basic. But with each gift, you get a bunch of e-cards to choose from, which you can add your own message to. It allows you to put your own personal touch on the gift, and is pretty awesome :)
4) Recipient-action notification: One of my major concerns was that I'd send a Facebook Gift to someone who's never on Facebook, and it would go unnoticed. Fortunately, you get a notification when the recipient opens the gift, and when the gift gets shipped. That way, if you send the gift, and realize it hasn't been seen by the recipient, you can call them and tell them to check their Facebook.
On the receiver side:
1) Address obfuscation: I don't want to put my address on Facebook for the world to see. When someone sends me a gift, I get their cute little card and message, and then it prompts me to enter my address. This address, as far as I can tell, isn't posted anywhere publicly, and is not shared with the gift-giving party.
2) Gift substitution: I had a friend send me a cookie gift, and when I opened it, it asked me if I wanted to swap out the gift for a different gift. Whaaa? Yes, indeed. It allows you to pick a similarly-priced gift, and will send you that one instead, for no up-charge (as far as I can tell). I substituted the $7 "Grace Cake" mini-bundt for my $5 cookie, since both came with the $5 gift card.
I haven't received the gift yet, so I'm not sure about the quality, but they've got a pretty good system going here. I think I'll keep using it.
Turns out, in 1956, a certain Howard L. Chace wrote an entire book of English translated into his new language, Anglish. This new book was called "Anguish Languish", which itself is Anglish for "English Language". Chace was a French professor, and though the book itself is playful, he wrote it, in part, to show how tone is almost as important to the meaning of words as the pronunciation is. Interesting thoughts, huh?
Anyway, it's a pretty nifty read! You can read the whole text here.
I got a new phone! But I'll tell you about that later.
For now, I want to tell you about Bump.
Bump is primarily known for its intended purpose--exchanging contact information by physically bumping two phones together. I have yet to use it for its intended purpose, but I'm sure it's fine for that.
I've been using it to transfer photos off of my phone, and it's SO EASY. Like, EFFORTLESSLY EASY. I'm in awe.
You just go to bu.mp on your laptop, then open the Bump app on your phone. You go to the photos tab in the app, then select a bunch of pictures. I recommend you try 15-20 at a time. I tried 111 at a time, and that didn't work wonderfully.
Once you've selected all your photos, you just tap your computer's spacebar with your phone. Then, your photos sync to the Bump website, and from there, you can trivially download them onto your computer. AWESOME!