What Have We Learned From the Google Web Accelerator?

At the beginning of May, Google launched their Web Accelerator beta. A couple of weeks later, the beta was suspended with the message "Thank you for your interest in Google Web Accelerator. We have currently reached our maximum capacity of users and are actively working to increase the number of users we can support." The Web Accelerator had been receiving a lot of criticism around the web because of some of the techniques it used to speed up browsing, like pre-fetching. In theory, pre-fetching URLs is not a bad thing, and according to the RFCs, Google was doing the right thing. In practice, however, it caused all kinds of problems, since the Accelerator was pre-fetching links in admin applications that would do things like delete or otherwise alter records.

The problem is that, technically, we (meaning web developers) should not be using GET requests to alter data in any way. GETs should be used only to request documents and other resources, and should be entirely safe to pre-fetch. We should be using POSTs for requests that affect data, which Google's Web Accelerator knows to stay away from. I've known this for some time, but I have generally disregarded the rule whenever it was convenient to do so. In fact, a couple of weeks ago, I made a post entitled Why Distinguish Between GETs and POSTs? in which I provided a technique to combine FORM and URL ColdFusion variables into a single scope so that other ColdFusion code doesn't have to distinguish between the two types of requests. The post got a lot of good comments about why that may not be such a good idea in a post-Web Accelerator world, and I now believe the technique should only be used by people who really know what they are doing, and even then, sparingly.
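
For reference, the combining technique looks roughly like this. This is just a minimal sketch, not the exact code from that post, and the struct name (request.attributes) and the recordID key are placeholders of my own:

    <!--- Somewhere early in the request (Application.cfm, for example):
          merge URL and FORM into one struct so downstream code doesn't
          care whether the request arrived as a GET or a POST. --->
    <cfset request.attributes = structNew()>
    <cfset structAppend(request.attributes, url, true)>
    <cfset structAppend(request.attributes, form, true)>

    <!--- Later code reads request.attributes.recordID without knowing
          (or caring) how the value was submitted. --->

The convenience is obvious, and that's exactly the danger: once the rest of your code can't tell a GET from a POST, nothing stops a destructive action from being triggered by a plain link.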

Sometimes it takes events like the launch of the Web Accelerator to nudge the web forward. I’m not sure Google or anybody else is going to be able to get away with extensive pre-fetching anytime in the near future, but now that they have raised our awareness, hopefully we will work toward creating an environment where pre-fetching is safe. I’ve always known that using GETs when I should be using POSTs wasn’t kosher, but there were never any ramifications until now. Apparently, even Google is guilty of taking a shortcut here and there. I logged into Google Alerts this morning and found that I can delete and edit search terms by clicking on a simple link that performs a GET request. Good thing I didn’t have the Google Web Accelerator installed.
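
Just to show what the safer alternative looks like, here is a rough sketch of how a delete like that could be exposed as a POST instead. The file name, field name, and value are made up for illustration:

    <!--- Instead of a plain link like deleteAlert.cfm?alertID=42 (a GET that
          a pre-fetcher could follow), expose the delete as a small form: --->
    <form action="deleteAlert.cfm" method="post">
        <input type="hidden" name="alertID" value="42">
        <input type="submit" value="Delete"
               onclick="return confirm('Delete this alert?');">
    </form>

    <!--- deleteAlert.cfm: only act on an actual POST. --->
    <cfif cgi.request_method EQ "POST" AND structKeyExists(form, "alertID")>
        <!--- perform the delete here --->
    </cfif>

The JavaScript confirmation is still just a convenience for people; it's the POST (plus the server-side check of the request method) that keeps pre-fetchers and other robots from tripping the delete.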

7 Responses to What Have We Learned From the Google Web Accelerator?

  1. Bryan Ledford says:

    Actually, on Google Alerts, the delete link displays a JavaScript confirmation prompt, and the edit link displays a form which uses a POST and a submit button to perform the actual update. Both techniques are acceptable and encouraged in the RFCs. –Bryan

  2. PaulC says:

    I don’t think I’d want to use any web application with a pre-fetching system, best practices or not. What disturbs me is that somehow today’s broadband just wasn’t fast enough for someone, so a system like this was devised. Just another bottleneck or hacker target; I personally see no need.

  3. Raymond Camden says:

    I don’t quite get why POST should be the only thing you use for changing data. To me, POST and GET are simply two ways to make a request. Kind of like driving a car or flying a plane. Both can get you to a destination. Both ways of travelling have their own rules, benefits, etc. But the destination shouldn’t really matter. Shoot – what about systems that log your actions? In theory, those systems have data changed on every request. Should that all be POST traffic?

  4. Christian Cantrell says:

    Bryan, I’ll have to check out the RFCs again. The delete link in Google Alerts does prompt a JavaScript confirmation, but the URL in the anchor tag deletes the item using a GET (which I tested). Unless the Web Accelerator looks for onClick event handlers and explicitly ignores those links (which it might, but I read otherwise), it’s a problem.

  5. Ray: According to the fundamental tenets of web architecture, GET and POST *aren’t* simply two ways to make a request. They have very specific purposes… they’re basically the getter and setter of HTTP. Now technically, they’re supposed to be more like the SQL SELECT and INSERT of HTTP, with PUT doing UPDATE duties and DELETE, well, deleting. But early HTML kind of screwed that up by only supporting GET and POST, so here we are. But if you think about it in those terms, it should make a lot of sense. “Hypertext injection” can be just as dangerous as SQL injection, and properly utilizing HTTP’s verbs is the equivalent of leveraging cfqueryparam in your queries. To put it in basic terms, a GET should never perform an action that will have negative consequences if requested randomly and repeatedly.

  6. Roger: Now that is an interesting way of looking at it – and it does make more sense. Thanks.

  7. mark kruger says:

    So according to Bryan, is it an OK standard to use a JavaScript confirmation coupled with a GET for a delete? That disturbs me a little. What’s to prevent a pre-fetcher from following the link? I’d like to know what’s up on that one.