This is R's Typepad Profile.
Recent Activity
The logic is that a lot of work consists of one-off data processing: the sort of stuff that Linux users tend to handle with awk, grep, sed, and miscellaneous bash one-liners. Modern office software is generally fairly good at automating common, basic tasks, but anything more specific needs a more technical approach. Just knowing how to use regular expressions would make a lot of tasks much easier (see the sketch below). So it's not so much "everyone should be able to program at a professional level" as "everyone should be able to perform basic tasks on Linux", though without the OS-specific focus.
Commented May 15, 2012 on Please Don't Learn to Code at Coding Horror
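As a rough illustration of the kind of one-off job the comment above has in mind, here is a minimal Python sketch that pulls email addresses out of an exported text file. The file name and the pattern are assumptions for illustration, not anything from the comment.

    import re

    # One-off task: extract every email address from an exported text file.
    # "contacts_export.txt" and the pattern below are hypothetical.
    EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")

    with open("contacts_export.txt", encoding="utf-8") as f:
        for line in f:
            for address in EMAIL.findall(line):
                print(address)

The same job is a grep one-liner on Linux; either way, the point stands that a little regex knowledge covers a lot of everyday data munging.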
I absolutely hate the dynamic loading design. Ever since Slashdot switched to it, I've been using their /archive.pl page instead. It's a nice idea in theory, but in practice it doesn't work. I actually have less friction with a paginated page, because I always open the next 5 pages in new tabs so they can load in the background. I can't do that with the dynamic design - I have to wait a second for each chunk to load. It's also a pain if I just want to skim the results, due to the loading breaks between chunks.

Proposed solutions:
- Give the user a way to control the initial number or proportion of items loaded, so that someone who is going to read the entire thing can set it to 100% and have it all load up front.
- Actually make the pages load seamlessly by caching them beforehand. The browser shouldn't wait until you reach the bottom of the page to start downloading the next one; it should start the moment the first page is done, and fetch the 3rd the moment it starts displaying the 2nd (see the sketch below). Or better yet, download the first 10 pages along with the first, but don't display them until the user scrolls to them.

There are two separate problems here: downloading all that data takes a while, and displaying all of it makes the page unmanageable. Trying to deal with both at the same time only confuses the issue and ends up neglecting one of them.
Commented Mar 28, 2012 on The End of Pagination at Coding Horror
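The second proposed solution above is easy to sketch. Here is a minimal Python version, assuming a hypothetical paginated endpoint: page N+1 is fetched in the background while page N is displayed, so the reader never waits at a page boundary. The URL scheme and the render step are placeholders, not anything from the comment.

    from concurrent.futures import ThreadPoolExecutor
    from urllib.request import urlopen

    def fetch(page):
        # Hypothetical URL scheme, assumed for illustration.
        with urlopen(f"https://example.com/stories?page={page}") as resp:
            return resp.read()

    def render(data):
        # Stand-in for whatever actually displays a page of items.
        print(f"displaying {len(data)} bytes")

    def read_all(last_page):
        with ThreadPoolExecutor(max_workers=1) as pool:
            future = pool.submit(fetch, 1)                 # start page 1 immediately
            for page in range(1, last_page + 1):
                data = future.result()                     # blocks only if the download isn't done yet
                if page < last_page:
                    future = pool.submit(fetch, page + 1)  # prefetch the next page now
                render(data)                               # display while the next page downloads

Fetching one page ahead is the conservative version; the comment's "download the first 10 pages" variant would just mean submitting more fetches up front, trading bandwidth for fewer pauses.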
"Algorithms are for people who don't know how to buy RAM." That's all well and good until you: a) Need to run software on a minimalist system (think phone or netbook, or even a budget desktop [relevant if you're an organization trying to cut costs]) b) Need to write software that does some serious heavy lifting (database manipulation, simulations, some types of AI). c) Realise that RAM consumes 20% of an average desktop's power (this is most relevant if you're running a server) a in particular is becoming increasingly relevant - it's the reason MS had to recommend XP for netbooks until Win7 came out. Efficient code can be used in places that bulky code can't, and just cause your dev system has 24 GB of RAM doesn't mean that your users do. While realistically you can't optimise code til it's perfect, efficient code will always be valued by users for its speed, while the reverse is also true (consider how much crap Nero has gotten for being a 1+ GB CD burning package, when competing packages are <10 MB). This is all to say nothing of the emerging market that is netbooks and smartphones (tablets too) - getting more than 1 GB RAM in those isn't about to become common anytime soon, and they're usage is increasing.
R is now following The Typepad Team
Jan 21, 2011