89 gmail invites available!

Posted by Fri, 27 Jul 2007 21:10:00 GMT

While everyone else is trading their Pownce and Skitch invites, I wanted to let everyone know that I still have 89 Gmail invites available.

Post a comment on my blog and I’ll hook you up!

Happy Friday! ;-)

update: only 12 left!

YSlow and Rails performance: Getting UJS and AssetPackager to play nice

Posted by Fri, 27 Jul 2007 18:40:00 GMT

Yesterday, I started to dig deeper into YSlow and decided to pick an application that we recently launched for a client. The performance grade that I saw at first was an F, which wasn’t surprising to me because we knew that there was going to be some fine tuning in the near future.

There is a lot of JavaScript in this application, and we’ve split it across several files to keep it maintainable. However, in production, we really don’t need to send the client (browser) 19 different JS files. We’ve been using mod_deflate to compress these files, but that doesn’t solve the problem of opening several connections to download all the necessary JavaScript. The same is true for our CSS files.

At RailsConf, DHH announced that an upcoming version of Rails would bundle all the stylesheet and javascript files into one file and compress it. We’re running on 1.2.x for this application and decided to look at the AssetPackager plugin as a good solution to this problem.

I installed the plugin via piston and ran the following task, which is provided by AssetPackager.

rake asset:packager:create_yml

This created config/assets_packager.yml. I then updated our Capistrano configuration to call the rake task after the code is updated on the server during a deploy.


desc "all of our other tasks/commands to run after updating the code" 
task :after_update_code do
  #
  # all of our other tasks/commands
  #
  run "cd #{release_path} && rake RAILS_ENV=production asset:packager:build_all" 
end

The first thing that I noticed was that the generated yml file makes no assumptions about the order in which the JavaScript libraries should be loaded. So, immediately, line 1 of our compressed JavaScript file was throwing an error because the code referenced a library that hadn’t been defined yet (it appeared later in the file). So, when you do this, you’ll need to organize the yml file to load things in the order they are needed. This was also a good opportunity for us to say, “oh, we’re not using that one anymore. Let’s remove it.”


--- 
javascripts: 
- base: 
  - prototype  
  - effects
  - scriptaculous
  - controls
  - dragdrop
  - application
  - slider
  - pngfix
  - nav
  - lowpro
  - lightbox
  - folder
  - builder

Great, so we re-deployed and everything at first glance seemed fine… or so we thought!

We used the unobtrusive javascript plugin for this project and it seems that we couldn’t just compress every single file. Each page has a behaviors javascript file and since everything was being compressed into one file (and cached), RJS calls quickly broke throughout the site. OH NO!

So, I opted to merge all of the other javascript files and use the standard way of including unobtrusive javascript in the application layout.


<%= javascript_include_merged :base %>
<%= javascript_include_tag :unobtrusive %>

We also removed lowpro from the list of javascript files to compress since the UJS plugin currently includes it when we call <%= javascript_include_tag :unobtrusive %>. I plan to look into modifying this so it’ll only include the page-specific behaviors and not load the lowpro javascript file (so we can compress that as well).

Once this was re-deployed, we saw that the RJS issues were resolved and everything felt like it was loading more quickly. But, let’s look at YSlow again for step 1 in improving the performance of the application.

side note: the following grading was also after making some other adjustments that were suggested by YSlow, which I’ll discuss in another blog post soon.

So, where we once had a grade of F, we now have a D… which is due to the client having us add four external javascript files for Mint, Google Analytics, etc. We’re only loading 3 javascript files for the application itself, when we were originally loading many more.

Obviously, there is more tuning to be done, but we went from a grade of 43 to 74 in about three hours spent reading the YSlow documentation, adding asset_packager, and making various tweaks to our web servers (as suggested by YSlow).

Until next time…

Rails Code Audit Tips - Filtered Parameter Logging

Posted by Tue, 17 Jul 2007 03:50:00 GMT

It’s been a month since I posted Audit Your Rails Development Team, and now I find myself sitting in a hotel room in Mankato, Minnesota with Graeme after a long day of walking through the documents that we delivered to our client after conducting a Rails Code Audit and Review. Our client felt that it would be a great idea to have us visit with six of their employees and walk through the various topics that we brought up in our process. We’ve been doing several of these audits recently and thought that it would be a good idea to begin sharing some of the problems that we’ve discovered across projects.

As much as we like finding lots of things that we’d recommend improving in Rails applications, we also want to make sure that as many projects as possible avoid some of these common oversights. So, expect to see more posts related to things that we find through our Code Audit and Review process.

Today, I’d like to point out a potential security problem that is often overlooked by developers and system administrators.

Log files.

Does your application request any of the following information from your users?

  • Social security number
  • Credit card data (number, expiration date, etc.)
  • Passwords

BY DEFAULT, all of this data is being written to your production log file. Even if you’re encrypting this data in your database, request parameters (get/post) are all written to your production logs without any encryption. Log files are also notorious for having insecure file permissions, so if you’re on a shared host, other accounts on the server might be able to view them. Regardless of how secure you think your server is, this isn’t data that you want sitting around.

Lucky for you, Ruby on Rails has an easy solution to this problem! All that you need to do is use the filter_parameter_logging method in your controller(s). We generally add something like the following to our application controller.


  filter_parameter_logging :social_security_number, :password, :credit_card_number, 'some-other-param' 

This will replace the values of those parameters with [FILTERED] in your logs, which solves this problem rather nicely.
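To illustrate the effect, here’s a minimal sketch of what this kind of filtering does in plain Ruby. The helper and key list below are hypothetical, just for illustration; Rails’ actual implementation also handles nested parameters and regular-expression matching.

```ruby
# Hypothetical sketch of parameter filtering -- not the actual Rails
# implementation. Keys in FILTERED_KEYS have their values replaced
# before anything is written to the log.
FILTERED_KEYS = %w[password credit_card_number social_security_number]

def filter_params(params)
  params.map do |key, value|
    [key, FILTERED_KEYS.include?(key.to_s) ? "[FILTERED]" : value]
  end.to_h
end

filtered = filter_params("login" => "robby", "password" => "secret")
# filtered["password"] is now "[FILTERED]", while "login" is untouched
```

The point is that the sensitive value never reaches the log line, even though it still arrives in the request and can be encrypted on its way to the database.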

So, we’d recommend that if your application is storing any sensitive data, you take advantage of this simple solution. Be sure to read more about filter_parameter_logging for various usage examples before you implement it.

Stay tuned for more tips and tricks. If you’re interested in contracting us for our Rails Code Audit and Review service, give us a call.

Tomorrow, Graeme and I will be meeting for another day with our clients and then we fly home to Portland, Oregon in the evening. We survived our first tornado warnings, which was exciting as we don’t get those on the west coast. ;-)

RubyURL 2.0 on the horizon

Posted by Tue, 17 Jul 2007 03:23:00 GMT

RubyURL was a project that I built about 2 1/2 years ago as a late-night attempt to see what I could build and deploy with Ruby on Rails in a night. It’s nearing 50,000 unique website links, has a Ruby gem that you can use with it, and has rbot plugins.

I’ve rewritten it about three times in the past six months, to try out some new approaches, but haven’t deployed with a new version as I’ve been waiting for someone to help me with a new design. Chris has offered to help out and once we integrate his new design with it, we’ll be launching it.

Not everything is great in RubyURL land, though. It appears that it’s become an easy target for comment spammers, who abuse the site to generate rubyurls and paste those links into their spam comments. Several pissed-off bloggers, forum administrators, and system administrators have emailed me to complain that I’m spamming their sites. Sadly, even with a basic disclaimer on the site, they still like to blame me for their spam. It’s gotten common enough that I’ve written a template email response that explains how the site works and that I’m not accountable for people posting links to my URL redirect tool.

You can see that it’s popping up around the net via a google search.

So, I’ve been trying to think of ways to make it easier for people to flag URLs as abusing the site. I haven’t come up with any elegant solution that doesn’t force the good users of the site through extra steps to create a basic RubyURL.

The ideal (and current) workflow:

  • User navigates to http://rubyurl.com
  • User pastes in long url into text box/area
  • User submits form
  • User is provided with new (shortened) rubyurl
  • User copies the rubyurl and does what they want with it (generally… pastes into IM, IRC, Email, etc.)
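The core of that workflow really is tiny, which is part of why the site was a one-night build. Generating the shortened token could be sketched like this (the alphabet and length here are illustrative, not RubyURL’s actual scheme):

```ruby
# Illustrative short-token generator -- not RubyURL's actual scheme.
# Lowercase letters and digits keep the token easy to paste into IM/IRC.
ALPHABET = ('a'..'z').to_a + ('0'..'9').to_a

def generate_token(length = 6)
  Array.new(length) { ALPHABET[rand(ALPHABET.size)] }.join
end

token = generate_token
short_url = "http://rubyurl.com/#{token}"
```

In a real app you’d also check the token against existing records and retry on collision, then store the token alongside the long URL for the redirect.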

Some people have suggested using a user system to do this, but I really don’t like that as a solution.

Another idea, which I built… and later removed from my new version, involved having the original url load in a frame, and then providing a way for users to flag it as ‘spam’, ‘nsfw’, or ‘dead’. Then, we could warn the user that the following URL had been flagged before: are you sure you want to continue? I didn’t like this solution, as it felt very obtrusive to have a rubyurl frame at the top of the browser window.

One person suggested a captcha to try and verify that the user is human, but there are problems with this.

  • I really dislike captchas. ;-)
  • This doesn’t prevent spammers from using the ShortURL gem, which does everything via an API.

In regards to the API, this could be enhanced by requiring everyone to register an email address to get an API key, but that only addresses the API abusers.
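If I went that route, the check itself would be simple. Here’s a hypothetical sketch (none of these names exist in RubyURL today; in a Rails controller this would live behind a before_filter, and VALID_KEYS stands in for a lookup against registered accounts):

```ruby
# Hypothetical API key check -- not part of RubyURL today.
# VALID_KEYS stands in for a database lookup of registered keys.
VALID_KEYS = ["abc123"]

def authorized?(api_key)
  !api_key.nil? && VALID_KEYS.include?(api_key)
end
```

Requests without a valid key would get rejected before a rubyurl is ever created, and a key tied to an email address gives you someone to contact (or revoke) when abuse shows up.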

I’m starting to brainstorm some solutions that specifically help the requests made through the web. I haven’t checked the logs enough yet to verify it, but I have a strong suspicion that much of the abuse is happening through a web-based bot, not through ShortURL… because Ruby developers are nicer than that. (I hope…)

So, I am curious… dear readers of my blog. How might you solve this problem without disrupting the user experience? Or, should I just stick with what I’ve got going and find a better way to respond to pissed off bloggers who think I’m spamming them?

Discuss…

Airplane customers prefer bland food...?

Posted by Mon, 02 Jul 2007 22:52:00 GMT

Earlier today, I was booking flights for Graeme and myself, as we’re heading to Minnesota soon to visit one of our clients. While purchasing the tickets on Travelocity, I was glad to see that they remembered I’m a vegetarian. However, I almost made the mistake of submitting the form with the following meal preference for Graeme.

I found it amusing that this was the default in the drop down.