Tweet Button URL Double Encoding Fix

If you’re adding a Tweet button to a dynamic web page using the goodies page at

https://twitter.com/about/resources/buttons#tweet

then don’t rely on the “Use the page URL” option to supply the tweet URL. widgets.js will incorrectly re-encode your URL if its query string contains any % escape characters, and the link in your tweet will actually be broken.  Most people never hit this problem (a Tweet button typically sits on a static page with no query string, and hence no % characters), but if you do need to put a Tweet button on a dynamic page whose URL contains an encoded query string, you’ll need to explicitly set the data-url attribute on the link tag to the current page URL.  Don’t leave it to Twitter to determine the current page URL – it will incorrectly re-encode the URL string.

There are a number of reports of this bug on the Twitter forums, but Twitter doesn’t seem to acknowledge it, and it apparently hasn’t been fixed in years.  When in doubt, just use the data-url attribute and save yourself some hassle.
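To see what goes wrong, here’s a minimal sketch of double encoding, using Python 3’s urllib.parse and a made-up URL (this only illustrates the effect; it isn’t widgets.js’s actual code):

```python
from urllib.parse import quote

# A hypothetical dynamic page URL whose query string already contains % escapes:
page_url = "http://example.com/view?item=caf%C3%A9"

# Percent-encoding the full URL a second time (roughly what widgets.js does)
# mangles the existing escapes: every "%" becomes "%25", so "caf%C3%A9"
# turns into "caf%25C3%25A9", and the link no longer points to the original page.
reencoded = quote(page_url, safe=":/?=&")
print(reencoded)  # -> http://example.com/view?item=caf%25C3%25A9
```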

Django view shortcuts: render_to_response vs. render

If you’re using Django 1.3+, and find your code littered with

from django.shortcuts import render_to_response
from django.template import RequestContext

def my_view(request):
    ...
    return render_to_response('my_template.html', my_data,
                              context_instance=RequestContext(request))

you can replace that with a call to the render() shortcut, introduced in Django 1.3:


from django.shortcuts import render

def my_view(request):
    ...
    return render(request, 'my_template.html', my_data)

You should generally prefer render() over render_to_response() – with render_to_response(), if you leave off the context_instance param, Django won’t invoke any of the default template context processors.
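The difference is easy to see in a toy sketch. This is not Django’s actual implementation – just a schematic of the two shortcuts’ behavior, with a stand-in for the default context processors:

```python
def default_context_processors(request):
    # Stand-in for Django's default context processors (auth, static, etc.)
    return {"user": "anonymous", "STATIC_URL": "/static/"}

def render_to_response(template_name, data, context_instance=None):
    # Processor output is only merged in if you remembered context_instance
    context = dict(context_instance or {})
    context.update(data)
    return context

def render(request, template_name, data):
    # render() always builds a RequestContext, so processors always run
    context = default_context_processors(request)
    context.update(data)
    return context

print(render_to_response("t.html", {"title": "hi"}))
# -> {'title': 'hi'}  (context processors silently skipped)
print(render(None, "t.html", {"title": "hi"}))
# -> {'user': 'anonymous', 'STATIC_URL': '/static/', 'title': 'hi'}
```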

See the docs for more details.

Slow Twitter API calls on Ubuntu 12.04 + Parallels VM

While working on some Python scripts on Ubuntu running in a Parallels VM, I noticed that calls to the Twitter REST API were incredibly slow.  The same problem occurred when using curl instead of Python. Oddly, it didn’t occur when I accessed the URL from a browser (Chrome). Even more oddly, it only occurred when accessing the Twitter REST API, not arbitrary websites.

The issue can be avoided by switching the VM from Shared networking to Bridged networking.  The problem and workaround are consistently reproducible for me.

Dev setup:

  • MacBook Pro
  • Parallels Desktop 7.0
  • Ubuntu 12.04
  • Python 2.7.3

Script:

test_twitter_api.py:
import httplib

conn = httplib.HTTPConnection("api.twitter.com")
conn.request("GET", "/1/friends/ids.json?screen_name=twitter")
print("status: %d" % conn.getresponse().status)

Shared Networking in Parallels VM:

$ time python test_twitter_api.py
status: 200

real 0m10.135s
user 0m0.028s
sys 0m0.004s

Bridged Networking in Parallels VM:

$ time python test_twitter_api.py
status: 200

real 0m0.167s
user 0m0.028s
sys 0m0.012s

The difference is dramatic – the call to conn.request() drops from 10 seconds to a fraction of a second. The same results hold when using curl to access the API:

curl --get 'http://api.twitter.com/1/friends/ids.json' \
     --data 'cursor=-1&screen_name=twitter' --verbose

Again, the issue only occurs when accessing the Twitter REST API – it doesn’t occur when fetching other URLs, or when you access the same URL from a browser.
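If you’d rather measure the latency from inside Python instead of with the shell’s time builtin, a small (hypothetical) helper like this works; wrap the conn.request() call from the script above in it:

```python
import time

def timed(fn, *args, **kwargs):
    """Call fn and return (result, elapsed seconds)."""
    start = time.time()
    result = fn(*args, **kwargs)
    return result, time.time() - start

# Usage with the earlier script (Python 2 httplib, shown in comments):
#   conn = httplib.HTTPConnection("api.twitter.com")
#   _, elapsed = timed(conn.request, "GET", "/1/friends/ids.json?screen_name=twitter")
#   print("request took %.3fs" % elapsed)

# Demonstration with a local call (no network needed):
result, elapsed = timed(sum, [1, 2, 3])
print("sum=%d took %.6fs" % (result, elapsed))
```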

So if you’re doing software development on a Parallels VM, make sure to enable Bridged Networking! (Click on the little web icon in the bottom right of your VM window, or go to Virtual Machine → Configure → Hardware → Network)

Upgrading to Ubuntu 12.04 on a VM

Here are my experiences upgrading from Ubuntu 11.04 to 12.04 on virtual machines (VirtualBox and Parallels) on a MacBook Pro.

VirtualBox

My initial setup was Ubuntu 11.04 running on VirtualBox 4.1.14 (the free/OSS VM software from Oracle/Sun).  Upgrading to 12.04 requires first upgrading to 11.10, so after snapshotting my VM, I attempted the 11.10 upgrade using Ubuntu’s GUI Update Manager.  Although the 11.10 packages all downloaded and the upgrade nearly completed, the Update Manager went into a strange loop, kept popping up error windows, and the UI became unusable, forcing me to restart the VM.  On bootup, the VM froze at the Ubuntu splash screen, so I restored my pre-upgrade snapshot and retried the upgrade process several times, each attempt failing identically.  Some investigating suggested that the postgresql packages were causing the problem, so I used the Synaptic package manager to find and uninstall all postgresql packages.  I then tried the 11.10 upgrade again — this time it seemed to complete successfully, but on restart the VM froze again at the Ubuntu splash screen.  At that point I didn’t want to waste any more time, and decided to try different VM software.

Parallels

The two alternatives to VirtualBox are, of course, VMWare Fusion and Parallels.  After reading some reviews, I decided to go with Parallels (I’d had a pretty good experience with an older version several years ago).  Although VirtualBox is open source (+1), it’s currently owned by Oracle (-1, for their patent trolling), so the thought of switching away didn’t bother me much.  I was able to import my VirtualBox VM into Parallels fairly easily (using the ‘Add Existing Virtual Machine’ feature and selecting the appropriate .vbox file).  I then upgraded the Ubuntu install on the VM (now running on Parallels) to 11.10 without incident.

I then upgraded to 12.04.  After the reboot to complete the 12.04 upgrade process, there were two problems:

  1. The error message “an error occurred while mounting psf” (meaning broken Parallels Shared Folders)
  2. The mouse not working in the VM

Fixing these problems required reinstalling Parallels Tools (without using the mouse!).  Luckily some keyboard shortcuts came in handy: after rebooting into the fresh 12.04 upgrade, press Ctrl-Alt-T to open a terminal window.  Then cd /media/Parallels\ Tools/ and run sudo ./install --install. Finally, run sudo shutdown -r now, and you’ll reboot into a fully working Ubuntu 12.04 setup.

Javascript

The original promise of Java in the 90’s was to provide a ubiquitous write-once, run-anywhere platform for applications. With the rise of network-centric computing, where applications were increasingly delivered over the web rather than in shrink-wrapped boxes, the need for such a platform was growing tremendously. For numerous reasons, Java failed to live up to its original promise (although it has thrived in other, server-side roles). But fast-forward to 2011, and we ironically see that Javascript (long considered a toy language for gratuitous visual bells and whistles) nearly perfectly realizes Java’s original promise.

Because of its sheer ubiquity, immense resources have been thrown at Javascript interpreters over the past decade, resulting in extremely efficient JIT-based implementations (e.g., Google’s V8, which powers Chrome and node.js), to the point where Javascript is now among the fastest dynamic languages around. Because of this ubiquity and highly optimized performance, it is increasingly being used as the underlying platform for complex applications, even those not hand-written in Javascript.

It has been referred to as the “Assembly Language of the Web”, because it serves as the target output for various compilers (e.g., Google’s GWT, CoffeeScript, Pyjamas). More radically, Emscripten provides an LLVM-to-Javascript compiler. Since LLVM bitcode is in turn an intermediate representation (IR) that can be generated with gcc from a variety of languages such as C/C++, Emscripten effectively allows C++ code to be compiled to Javascript! Taken a step further: the interpreters for many interpreted languages (such as CPython for Python) are themselves written in C, which can be compiled to LLVM, which can be compiled to Javascript – so languages like Python can be run on a pure Javascript-based interpreter. It will be very interesting to see how this flexibility influences the future of languages for client-side computation. But Javascript seems firmly entrenched as the base layer for making it happen.

Adding drop shadows to photos in Photoshop Elements

The steps below are for Photoshop Elements 8 (for Mac), although I assume they are largely similar in other versions of Photoshop.  They also assume that, in the end, you want a JPEG for a web page.

  1. Load your photo in Photoshop
  2. In the Layers pane, double-click on the single layer corresponding to your photo.  When the dialog pops up, just click Ok to name your layer Layer 0.
  3. In the Layers pane (bottom right pane), in the bottom toolbar, click on the little circle icon (‘create new fill or adjustment layer’)
  4. Pick solid color.
  5. Enter the hex color of the background of the webpage that you will eventually put your photo on.
  6. In the Layers pane, drag your photo layer (Layer 0) to the top, so that it appears above your new Color Fill layer. If this doesn’t work, make sure you completed Step 2.
  7. Go to Image->Resize->Canvas Size in the top menu.
  8. In New Size, choose pixels.  Add 100 pixels to width and height.  Click the top-left arrow under Anchor.  Click Ok.
  9. In the Layers panel, select your photo layer (Layer 0)
  10. In the Effects panel, choose Layer Styles (the little overlapping windows icon).  Then from the dropdown menu to the right, select Drop Shadows.  Select the drop shadow effect you want (probably low or high).  Click Apply.
  11. Your drop shadow should appear.  If you want to adjust the effect, double click on the little fx icon in your layer’s entry in the Layers panel to open the settings dialog for that effect.
  12. If you mess up, right click on your layer’s entry and Clear Layer Style.
  13. Save your psd file.
  14. Go to File->Save for Web, pick your preferred jpeg quality and image size, apply, and click Ok.

Note: if you want a PNG, then instead of creating the Color Fill layer in Steps 3–5, you would add a normal layer, and make sure in Step 14 you save the PNG with transparency enabled.