This is Mario Figueiredo's Typepad Profile.
Mario Figueiredo
Portugal
Recent Activity
I would also like to add: Microsoft is renowned for introducing technologies, methodologies and processes that are hailed as the greatest thing on earth for developers, only to remove them 5 or 10 years later. It has become very hard to trust Microsoft when developing a long-term project. Because of this, the maintenance costs Microsoft introduces are unnecessarily greater than those of other solutions. Not to mention that you may have spent years studying and specializing in a programming language, only to see it removed a few years later. Admittedly, we don't expect much of this to happen to web technologies; Microsoft knows the impact it could have on its already poor share of servers on the internet. But it's really very difficult to entrust Microsoft technologies with any open source project one wants to last and to be as easy to manage as possible. I've been a Microsoft solutions developer all my life and frankly I'm getting tired of it. At the age of 43, seeing technologies constantly being removed from and added to the .NET platform is too much to bear. There's a deep feeling that whatever time I invest studying the new will be wasted when it becomes the old in just a few years.
Commented Mar 25, 2013 on Why Ruby? at Coding Horror
Above I mean "Doug also brings up a good point". Sorry for the typo.
On this I can finally agree, Jeff. It's a very good post too. Jeff also brings up a good point that in a way you left out. Identifying the right details to be dealt with in our software is of primary importance.
I'm not sure I agree with the idea that the eBay website is complicated to use. I'm not an eBay regular, and yet I don't find it that difficult to use. I just find it ugly. Not hard. In any case, the whole premise of the article is sketchy. It's a given that a simplified version of a website will be... simpler. What is there to tell? How do we get from there to "will apps kill websites"? The tablet and smartphone market isn't even comparable to the desktop computer market. It's just a fraction of it.
Commented Apr 24, 2012 on Will Apps Kill Websites? at Coding Horror
Good to know this thing exists, but there is absolutely no way I would ever use it. I will keep relying on strong passwords. God forbid the day comes when, to use basic services on the internet, strong passwords aren't enough and we have to start using ever more complex methods to safeguard our privacy or property. It will just mean the whole internet has failed. Now, for those who don't use strong passwords, this may be useful. But if that's the case, this just seems to be an upside-down solution, where the least knowledgeable have to jump through the biggest hoops to secure their email account. That's no way to provide a security service. It's begging not to be used. The only people I see having a real interest in this feature are those for whom Gmail is a mission-critical service through which they pass sensitive information. But then one must question the wisdom of using Gmail for mission-critical sensitive data. And on top of that, having to disclose a phone number to the service company. Technically, I think this is an interesting concept. Something an academic might smile at. It is, however, practically useless.
Commented Apr 21, 2012 on Make Your Email Hacker Proof at Coding Horror
The whole "read the source code, Luke" is fine. News at eleven. Same with the whole "comments probably suck". But the emphasis should always be on correct and informative comments and on correct and informative documentation. Always demand the source code for your stack, but demand even more comments and documentation.
Commented Apr 17, 2012 on Learn to Read the Source, Luke at Coding Horror
I'm sorry, but just no! Ebooks are simply inferior to paper books in almost all regards. They are a convenience and should be respected for that. But I have books from my childhood, and books my father left me and his father left him. My oldest book is a late-1600s edition of Os Lusíadas, beautifully made. There's absolutely no way, no way whatsoever, technology can guarantee backwards compatibility over a couple of decades, much less across generations. My ebook collection is probably as good as dead in 100 years. There's only so much I will be able to guarantee in terms of format updates to keep up with whatever new technologies replace old ones. But my oldest paper book is over three centuries old. It has been in the family for many generations.
Commented Apr 13, 2012 on Books: Bits vs. Atoms at Coding Horror
And don't forget scrypt (http://www.tarsnap.com/scrypt.html), which tends to offer a bit more protection against brute-force attacks.
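As a minimal sketch of the scrypt approach mentioned above, Python's standard library exposes it via `hashlib.scrypt` (the cost parameters below are illustrative, not a recommendation; tune them for your own hardware and threat model):

```python
import hashlib
import hmac
import os

# Illustrative scrypt parameters: n is the CPU/memory cost (a power of 2),
# r the block size, p the parallelization factor. These settings use
# roughly 16 MiB of memory per derivation.
SCRYPT_PARAMS = dict(n=2**14, r=8, p=1, maxmem=64 * 1024 * 1024, dklen=32)

def hash_password(password: str) -> tuple[bytes, bytes]:
    """Derive a key from a password with a fresh random salt."""
    salt = os.urandom(16)
    key = hashlib.scrypt(password.encode("utf-8"), salt=salt, **SCRYPT_PARAMS)
    return salt, key

def verify_password(password: str, salt: bytes, expected: bytes) -> bool:
    """Re-derive the key and compare in constant time."""
    key = hashlib.scrypt(password.encode("utf-8"), salt=salt, **SCRYPT_PARAMS)
    return hmac.compare_digest(key, expected)
```

The memory-hardness is the point: an attacker brute-forcing the hash pays the same ~16 MiB per guess, which makes cheap massively-parallel hardware far less effective than it is against plain fast hashes.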
Commented Apr 6, 2012 on Speed Hashing at Coding Horror
I've been a victim (and I find that wording correct) of hellbanning before. I found out pretty quickly, simply because I had a friend on the board, which on that very same day led me to discover that my posts weren't being seen by him. The reason I got hellbanned on this particular news site was that I was somewhat critical of the journalistic integrity of one of the staff members. I wasn't abusive in any way, nor was I motivated by some kind of trolling behavior. I was just, if you will, one of those difficult members of the community who may sometimes become inconvenient.

Now, what this technique revealed to me is that it puts a lot of power in the hands of whatever moderators/admins are in charge of a community. The fact that it is invisible to everybody allows for all types of abuse, including quieting down inconvenient voices. That one may promise never to abuse this type of ban serves very little purpose. Motivations play a significant role in twisting initially good intentions into seemingly correct actions that are no more than abuses of power. Our brain is very good at entering defensive mode and finding justifications when we are performing bad actions. Ultimately, it can be said that bad people don't necessarily feel they are doing bad things. Good people find it even easier.

For any community that prides itself on being democratic, this presents another problem. Democracy isn't simply a set of values around the principle of equality; it also comprises principles of justice. Not even giving someone the right to be warned that an action was taken against them is not democratic. It's dictatorial. I can understand that some limits may be imposed (like not giving the person a chance to defend themselves) due to time, personnel or technical constraints. But there's something fundamentally wrong about a community that includes a mechanism that punishes bad behavior without informing the punished of the decision.
I could never support hellbanning. I find it vicious and, since we are discussing this in terms of a community that is being spoken of as democratic, immoral.
Commented Jun 5, 2011 on Suspension, Ban or Hellban? at Coding Horror
Silent updates work for some types of programs and some types of updates. They cannot be a catch-all solution to software upgrading. When Jeff says "To achieve the infinite version, we software engineers have to go a lot deeper", my only thought is to advise caution. You really don't want to go there. You'll quickly understand that when you try to alter a program's existing functionality or interface and smack it in the face of unsuspecting users. It may even work for a browser (if you don't care about the 30% who will be annoyed because they actually don't like your changes but can't go back). Where it definitely won't work is for most productivity tools, like office suites, where your precious silent update can in fact disrupt users' work. Manual and automatic upgrades will remain a necessity. They're not evil. They're in fact beneficial and desirable for many types of software.
Commented May 24, 2011 on The Infinite Version at Coding Horror
I was in Canada at the time, along with a Canadian friend, trying to launch our startup. We weren't affected by the bubble in the sense of seeing our stock devalue; we were instead just starting our hunt for VCs. So our business didn't even get a chance to start, because everyone was backing out. In retrospect, we were just another "all talk, no substance" buzz-riddled business of the time, and given that we were getting ready to put no personal protection plans on the table (relying exclusively on our wages), it's good we came late and didn't get a chance to start. As for the issue itself, I like to think that bubbles serve the good purpose of restructuring the market around realistic concepts. It's in the aftermath of these events that the market becomes sane and healthy; before that, it is governed by hunches and a great dose of smart talk. Finally, I don't think that graph reveals a bubble in the works. It looks a lot like business as usual. If anything, there's a steep climb from the 2008 stock market decline back to the values that preceded it.
>> But programmers, like yourself, are uneducated. "Optimization" is what you do to an already optimal algorithm, to speed it up in a given platform/hardware.

And with this you pretty much denounce your own lack of education. Not just the fact that you choose to insult people you disagree with, but also the fact that you obviously do not really understand what you are talking about. If you choose to call code optimization "what you do to an already optimized algorithm", I feel obliged to point you to this: http://en.wikipedia.org/wiki/Code_optimization

There you will hopefully learn that choosing the appropriate algorithm is also part of the optimization procedure. But contrary to what you suggest, the choice of an appropriate algorithm is not always straightforward, since you'll often find yourself compromising on other aspects of performance: is it faster but uses more memory? Is it slower but has a smaller footprint? Since these algorithms are proven, you don't have many chances (if any) of optimizing them any further. Yet, when you are faced with these questions, a decision must be made on what to compromise.

And then there are the algorithms you design yourself for non-generic needs, where the level of optimization is decided by you as you develop. When will you stop optimizing your algorithm? When you are satisfied with the results, or when you cannot optimize any further? If the former, congratulations! You have made your way into the real world of professional programming. If the latter, unless there's a concrete reason to spend your time on that algorithm (e.g. you are coding a performance-critical application), you will hardly find anyone sympathetic, much less your boss.

And it's in this context that the quote fits in. RAM, CPU, hard drive: all can contribute to better performance in ways that your coding skills cannot, for the simple reason that the programming language's semantics limit what you can do.
It's not an invitation to write bad code or to make poor decisions; it's the reassurance that once you write good code and make good choices, the hardware will do more for your application than any extra bit of performance you can extract from your code. I hope, if you choose to reply to this, you avoid the insults. The level of emotional response you put into this topic has the opposite effect from the one you intend: it does not intimidate, and it does make you look insecure about your own thoughts.
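To make the faster-but-more-memory versus slower-but-smaller compromise above concrete, here is a small illustrative Python sketch: the same membership query answered by a hash set (roughly constant-time lookups, larger footprint) and by a sorted list with binary search (logarithmic lookups, more compact). Neither is "more optimized" in the abstract; the right choice depends on what you need to compromise on.

```python
import bisect

def make_fast(items):
    """Faster lookups, larger footprint: a hash table over the items."""
    return set(items)

def make_compact(items):
    """Smaller footprint, slower lookups: a plain sorted list."""
    return sorted(items)

def contains_fast(container, x):
    return x in container  # average O(1) hash lookup

def contains_compact(container, x):
    i = bisect.bisect_left(container, x)  # O(log n) binary search
    return i < len(container) and container[i] == x
```

Both answer exactly the same question; swapping one for the other changes only where the cost is paid.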
There's more to performance than just what a programmer can do by optimizing their code. This is particularly important for software whose performance also depends on user input, or for software that is supposed to be scalable. So the quote "Algorithms are for people who don't know how to buy RAM" deserves a little more attention than what some of you guys have been giving it. It's not stupid, it's not the worst thing you ever heard. It is, in fact, quite true! There's only so much a programmer can achieve in terms of code optimization, much of which is well documented and easily understood. There are very few secrets concerning code optimization these days, particularly in well-known and proven areas of development. Any code displaying less than optimal performance is pretty much understood these days as either a strategy (maintenance concerns, etc.), laziness on behalf of the programmer, or inexperience. Not the product of some secret knowledge available only to a few. Optimization is, for the most part, taken away from us and put in the hands of the compiler, the operating system and the hardware. Those are the real agents of performance on our modern systems. Our say in the matter (assuming, of course, good quality code) was always pretty limited. Now, with only a limited capacity for optimization, it's pretty easy to understand that hardware scalability comes into play in terms of what one can or cannot do to actually increase the performance of our software. Instead of wasting time over-optimizing our code to achieve some performance goal that may actually be unattainable, we greatly reduce costs and achieve much better results by increasing the capabilities of our hardware. And that's where this quote fits in. And whoever says RAM, says CPU, or any other relevant piece of hardware.
A few microseconds gained by some very smart code optimization technique that took weeks to achieve and introduced new code maintenance problems cannot ever replace the elegance and simplicity of a hardware upgrade. And it will never compete with one on software that is meant to scale.
@Chuck, I really admire Blekko. It's been a few years since I started complaining about the vertical shopping list that search engines have become. I always identified the lack of categorized searching as one of the main problems (precisely the ability to trim my search results based on categorized information as opposed to search terms), but I always bumped into a wall when asked how it would ever be possible for a search engine to categorize the web. The answer was obvious: it wouldn't be the search engine doing it. It would be the users. I just couldn't imagine how. Your slashtag solution is tremendously elegant. It's essentially categorized search in the hands of the user, allowing them to trust a community effort to categorize search results, but also to create their own slashtags, which can be made private if they so wish, for the maximum customization of search results possible. This is the type of innovation that defined Google back in 1998. An innovation I no longer expect from that company, which I predict will lose its dominance of the web search market sometime in the next 10 years, because it is falling prey to the exact same vices displayed by the companies it displaced back in 2000 when it became a phenomenon: Google's corporate nature slows down new developments, and its commitment to the current winning strategies clouds its vision of the future (and of the complaints that are starting to emerge in the present). Without competition, Google has been falling behind in user expectations and admiration, to the point of having become a common target of criticism. I'm not saying, however, that you guys are the solution. I'd hope you to be, because your current approach really strikes a chord with how I personally see the web search requirements of the decade that is just starting. The SEO button and the /rank slashtag are also a boon that cannot be overstated.
The fact that you folks chose the angel funding route also gives me some confidence in your project's ability to stay afloat in bad weather. As far as I'm concerned, I'll do my part by using it as much as possible, creating slashtags when needed, and essentially being part of what I hope will become a growing community. As I said, I don't trust the current solutions to be the ones that will take web search engines to new heights. They are becoming old and disconnected from their users' requirements.
Commented Jan 4, 2011 on Trouble In the House of Google at Coding Horror
@Vasuadiga Well, it really isn't the domain name that is influencing the results. Along with that domain comes a legion of SEO techniques that are actually responsible for the website's placement. There never was, and still isn't, any reason to believe the domain name factors into a website's rank; nor would it make any sense. What happens instead is that a domain name like iphone4case.com facilitates the creation of link anchor text that may be more relevant to Google's algorithms (it is believed that link anchor text is important). So with a domain like that, the owner is effectively creating a commercial name that reads "iPhone 4 Case". Contrast that with the same business had it been named mobileshell.com. When someone links to the business, the link anchor text and surrounding text could read:

- Find your iPhone cases at [u]iPhone 4 Case[/u]
- Find your iPhone cases at [u]Mobile Shell[/u]

In the first case, both the commercial name and the anchor text accurately reflect the business, whereas the more creative second option produces anchor text that doesn't. So when searching for the company name, "Mobile Shell" may produce a lot of false positives with links to the military or engineering areas, whereas "iPhone 4 Case" will not. On the other hand, when searching for the more generic term "iPhone cases", the first company is at an advantage, because there's a real chance that the vast majority of the anchor text linking to its website includes these exact terms (the plural form is largely ignored by Google).
Commented Jan 4, 2011 on Trouble In the House of Google at Coding Horror
Let's hope they do, and things get improved, Matt. There's been a growing disconnect between Google Search and its users for the past... couple of years, I'd say. To the point that previously rare statements like "Google's search engine isn't good anymore" are becoming more prevalent. Something that would have been unthinkable before. Given that this is also the period in which Google introduced the most significant new features and changes to the search engine UI since its inception, maybe it's time (and excuse my bluntness) Google realized that may not be what users actually require the most. I'm prepared to accept that we are simply a non-representative minority. But I do seem to witness a growing cry of protest. With alternative search engines taking their place in the market and offering competitive possibilities, all care is not enough. Remember how Google itself rose. And my congratulations, BTW!
Commented Jan 4, 2011 on Trouble In the House of Google at Coding Horror
@Matt Cutts I can see you took the time to read, analyze and post a comment. That's very decent of you. Unfortunately, I can also see you only addressed Jeff, ignored the comments from commenters here, and approached the matter purely as a ranking issue. Since Google Search is meant to be a service to the "user who searches" and not a service to the "user who publishes", I'm unsatisfied by your comment. But not surprised.
Commented Jan 4, 2011 on Trouble In the House of Google at Coding Horror
>> I've been dying for a "never show me results from this site again" button in Google's search results. Indeed. A content-based algorithmic search with the addition of user tools should be the way to go. I'd really like to manage my search results, and I don't mean by voting on links, as Google hinted at some time ago with its social search services. That won't solve the problems and will introduce new ones (like social engineering or regional/cultural encroachment). The current model is growing stale, and the "market" of content consumers is becoming less relevant in Google's search results. This was once the great novelty of Google and what elevated them to their present status.
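As an illustration of the kind of user tool requested above, here is a hypothetical client-side sketch in Python of a "never show me results from this site again" filter. The result list and blocklist formats are invented for the example; a real tool would persist the blocklist and hook into the results page.

```python
from urllib.parse import urlparse

def blocked(url: str, blocklist: set[str]) -> bool:
    """True if the URL's host is a blocked domain or a subdomain of one."""
    host = urlparse(url).hostname or ""
    return any(host == d or host.endswith("." + d) for d in blocklist)

def filter_results(results: list[str], blocklist: set[str]) -> list[str]:
    """Drop every result URL whose domain the user has banned."""
    return [url for url in results if not blocked(url, blocklist)]
```

The subdomain check matters: banning `contentfarm.example` should also hide `www.contentfarm.example`, but not an unrelated `notcontentfarm.example`.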
Commented Jan 3, 2011 on Trouble In the House of Google at Coding Horror
Moving away from content-based ranking feels scary to me. I'd rather things stay as they are. Or, if you will, offer social search as an optional way to enrich the algorithmic search. I feel the problem with Google, however, is not the algorithms, but the absence of essential information that can no longer be ignored; i.e. Google has to stop presenting results as a veritable shopping list and seriously consider introducing categories into its search engine. This much was attempted by Cuil, and it was my favorite feature of that otherwise failed attempt at producing an alternative to Google. Backed by intelligent algorithms of the kind Google is capable of, scrapers wouldn't be able to avoid being moved to their own category, away from normal searches.
Commented Jan 3, 2011 on Trouble In the House of Google at Coding Horror
Funny how an article about sysadmins and programmers ends up with a few reader comments calling users dumb, uneducated, and stupid. I guess it's the only thing we both seem to agree on. It's also the very thing we are both wrong about. As for the article, as a programmer myself, I definitely agree that programmers shouldn't have access to production servers. Ideally, in-house code should even be deployed to production servers like any other third-party code, i.e. built and packaged. And any in-house project maintenance should follow the same procedures as third-party code, through more or less formal downstream bug reporting. But lack of personnel, the usual high maintenance requirements of in-house code, and the ever-present short response-time requirements mean this "ideally" is rarely feasible. Programmers end up having to shortcut their way into the production servers for faster identification of causes or implementation of patches/fixes, or sysadmins end up having to script their way into a responsive and functional program while they wait for a proper fix from the developers. So I look at Jeff's post more as a warning. A gentle push. A reminder that, while not entirely possible under most circumstances, we should still always try to move toward an environment where programmers stay away from production servers and sysadmins don't mess with development/test servers. The sin is not in failing to accomplish this, but in getting used to failing.
Now that you mention it, Jeff, I feel that Phil's behavior more closely matches programming methods than marketing testing. Code, Test, Debug, Correct; Code, Test, Debug, Correct... I never thought of associating Groundhog Day with much of anything, really. But it's an interesting thought you have, although I think it stretches the boundaries of what A/B testing is a little.
@Murray Macchio, you obviously quoted an April Fools' joke a few hours behind schedule. It's just not possible to alter a program's behavior by changing pixels on the screen. ---- As for the article... "Do-It-Yourself" and "Made Easy" already put me off. Thanks, but no thanks. I understand the importance of usability testing. But if it only takes a 168-page book to explain to me how to do usability tests that are, after all, "easy" and that "I can do myself", then it becomes obvious I don't need the book. If it's so easy and I can do it myself, the only reason we don't do it is that we are either lazy or having a hard time doing usability tests on tight-schedule projects. Why buy a 168-page book to tell me something I already know?
Commented Apr 3, 2010 on Usability On The Cheap and Easy at Coding Horror
Mario Figueiredo is now following The Typepad Team
Apr 2, 2010