
JavaScript and the end of progressive enhancement

May 4th, 2011

Progressive enhancement is the Right Way™ to do things in web development. It works like this:

  1. Write the HTML for your content, 100% semantically (i.e. only the necessary tags to explain the meaning of the content).

  2. Style the HTML using CSS. You may need to add hooks to your HTML in the form of classes and ids for the CSS to target.

  3. Add JavaScript enhancements to the interface, but only enhancements.

Parallel to this, the functionality of the site/app is progressively enhanced:

  1. All navigation must happen via links and form submissions – don’t use JavaScript for navigation.

  2. Add JavaScript enhancements to navigation, overriding the links and form submissions, for example to avoid page reloads. Every single link and form element should still work when JavaScript is disabled.
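
As a rough sketch of step 2 (assuming jQuery; the URL, the js-async class and the #content region are made up for illustration), the markup stays a plain link that triggers a full page load, and the script simply intercepts it:

    // <a href="/articles/42" class="js-async">Read more</a>
    // <div id="content">...</div>

    $('a.js-async').click(function (event) {
      event.preventDefault();                            // only happens when JS is running
      $('#content').load(this.href + ' #content > *');   // same URL, partial reload
    });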

But of course you knew that, because that’s how you roll. I’m not sure this is the way forward though. To explain why I’ll need to start with a bit of (over-simplified) history.

A history lesson

Static pages

Originally, most sites on the web were a collection of static HTML pages saved on a server. The bit of the URL after the domain gave the location of the file on the server. This works fine as long as you’re working with static information, and don’t mind each file having to repeat the header, footer, and any other shared HTML.

Server-side dynamism

In order to make the web more useful, techniques were developed to enable the HTML to be generated on the fly, when a request was received. For a long time, this was how the web worked, and a lot of it still does.

Client-side dynamism

As JavaScript has become more prevalent on the client, developers have started using Ajax to streamline interfaces, replacing full page loads with partial reloads. This works either by sending HTML snippets to replace a section of the page (keeping the burden of HTML generation on the server) or by sending raw data as XML or JSON and letting the browser construct the HTML or DOM structure.
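
Sketched out, the two styles look roughly like this (jQuery assumed; the URLs, element ids and field names are invented for illustration):

    // (a) the server renders an HTML snippet and the client just inserts it
    $.get('/comments/latest', function (html) {
      $('#comments').html(html);
    });

    // (b) the server returns raw JSON and the client builds the DOM itself
    $.getJSON('/comments/latest.json', function (comments) {
      var items = $.map(comments, function (c) {
        return '<li>' + c.author + ': ' + c.text + '</li>';
      });
      $('#comments').html('<ul>' + items.join('') + '</ul>');
    });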

Shifting to the client

As JavaScript engines increase in speed, the Ajax bottleneck becomes transfer time. With a fast JavaScript engine it's quicker to send the data to the client in the lightest form possible and let the client construct the HTML, rather than constructing the HTML on the server to save the browser some work.

This raises an issue – we now need rendering code on the client and the server, doing the same thing. This breaks the DRY principle and leads to a maintenance nightmare. Remember, all of the functionality needs to work without JavaScript first, and only be enhanced by JavaScript.

No page reloads

If we’re trying to avoid page reloads, why not take that to its logical conclusion? All the server needs to do is spit out JSON – the client handles all of the rendering. Even if the entire page needs to change, it might be faster to load the new data asynchronously, then render a new template using this new data.
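
In code, the idea is roughly this (a sketch only – the /api/articles URL and the data shape are assumptions): the server never sends HTML, just data, and the client owns every template.

    function renderArticle(article) {
      return '<article><h1>' + article.title + '</h1>' +
             '<div>' + article.body + '</div></article>';
    }

    function showArticle(id) {
      $.getJSON('/api/articles/' + id, function (data) {
        // swap the whole view without a page reload
        document.getElementById('main').innerHTML = renderArticle(data);
      });
    }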

Working this way, a huge amount of functionality would need to be written twice: once on the server and once on the client. This isn't going to work for most people, so we're left with two options – either abandon the "no reload" approach or abandon progressive enhancement.

The end of progressive enhancement

So, where does this leave things? It depends on how strongly you're tied to progressive enhancement. Up until now, progressive enhancement was a good way to build websites and web apps – it enforced a clean separation between content, layout, behaviour and navigation. But it could be holding us back now, stopping the web from moving towards more natural interfaces that aren't over-burdened by its history as something very different from what it is today.

There are still good reasons to keep using progressive enhancement, but it may be time to accept that JavaScript is an essential technology on today’s web, and stop trying to make everything work in its absence.

Or maybe not

I’m completely torn on this issue. I wrote this post as a way of putting one side of the argument across in the hope of generating some kind of discussion. I don’t know what the solution is right now, but I know it’s worth talking about. So let me know what you think!

Appendix: Tools for this brave new world

Backbone.js is a JavaScript MVC framework, perfectly suited for developing applications that live on the client. You’ll want to use some kind of templating solution as well – I use jQuery.tmpl (I’ve written a presentation on the topic if you’re interested in learning more) but there are lots of others as well.
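
As a flavour of how these fit together, here's a minimal Backbone + jQuery.tmpl sketch (the URL, field names and template id are my own assumptions, not a recipe):

    var Article = Backbone.Model.extend({
      url: '/api/articles/42'            // the server only has to return JSON
    });

    var ArticleView = Backbone.View.extend({
      initialize: function () {
        _.bindAll(this, 'render');
        this.model.bind('change', this.render);   // re-render whenever the data changes
      },
      render: function () {
        // '#article-tmpl' is a <script type="text/x-jquery-tmpl"> template
        $(this.el).empty().append($('#article-tmpl').tmpl(this.model.toJSON()));
        return this;
      }
    });

    var article = new Article();
    new ArticleView({ model: article, el: document.getElementById('main') });
    article.fetch();   // fetches the JSON; the 'change' event then triggers render()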

Sammy.js (suggested by justin TNT) looks like another good client-side framework, definitely influenced by Sinatra.

If anybody has any suggestions for other suitable libraries/frameworks I’ll gladly add them in here.


Edit: I’ve now written a follow-up to this post: The end of progressive enhancement revisited.

  1. Jason Persampieri
    May 5th, 2011 at 00:02 | #1

    If you are only talking about HTML->CSS->JS progressive enhancement, then I almost agree.

    A large enough majority of users will be able to use the fully featured site. Unfortunately, that changes significantly if you care about accessibility (which we all really should). Screen readers are notoriously bad at handling very dynamic pages. In the first case, you are excluding a small percentage of users who (let’s be honest) you likely won’t miss. In the second case, you are potentially excluding a very large percentage of users who can be active, contributing members.

    A related discussion, as @slicknet presented recently, is that there’s a whole other type of progressive enhancement, generally dealing with features of those same HTML/CSS/JS stacks. For example, it’s perfectly acceptable these days to only offer rounded corners to browsers that support the CSS “border-radius” rule. As long as things still work for the other users, you’re cool. However, if you have a page/application that requires localStorage… well, it’s all about the percentages, isn’t it?

    Along those lines, I’ve taken up the “pushState or bust” flag. For all those apps that have to send a skeleton file down, just to read the hash tag so it can then request the correct content from the server, I say “Fi!”. If you can navigate within your application using pushState, great… if not, just request the entire page anew.
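
    Something like this, roughly (jQuery assumed; the .internal class, the #content region and the fragment parameter are made up):

        function loadInto(container, url) {
          $.get(url, { fragment: 1 }, function (html) {  // ask the server for just the fragment
            $(container).html(html);
          });
        }

        if (window.history && window.history.pushState) {
          $('a.internal').click(function (event) {
            event.preventDefault();
            loadInto('#content', this.href);
            history.pushState(null, '', this.href);      // real URL, no hash-bang
          });
          $(window).bind('popstate', function () {
            loadInto('#content', location.pathname);     // back/forward support
          });
        }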

  2. Nox
    May 5th, 2011 at 06:51 | #2

    What's stopping you from using the same array of data you use to generate the JSON in a template system that generates the page server-side?

  3. May 5th, 2011 at 08:35 | #3

    Moving more and more JS code to the client will win in my view, and the server's role will be just to provide data. The problem for this to really take off is that applications are still in the hands of the data owner, and he decides how to show it to the user, and what users can do with that data. This paradigm will die, and the user will have, in the end, total control of the data: he will be in charge of how to control it, how he sees it, how to manipulate it, how to mix it with other data from other servers. I am looking at the multitude of browser extensions, at apps like shiftspace.org, greasemonkey scripts… and I think this is the future. It is much easier to integrate a third-party app with a JSON/Ajax/big fat client that "invites" hacks than with a server-side generated web page as most of them are now. The user wants to choose his data providers, app provider and interface provider, and control all of his web experience. For this to happen, client-side JS will do more and more of what server-side code used to do.

  4. May 5th, 2011 at 09:15 | #4

    OK, a few thoughts…

    I think many people are abandoning progressive enhancement of JavaScript, or never adopted it in the first place and just said "this app needs JavaScript".

    Accessibility concerns aside, it may be an OK strategy for desktops, but I think life is going to get more interesting as more devices become internet-enabled.

    Depending on how big the HTML snippets being returned via AJAX are, the size difference between HTML and JSON may well be trivial compared to the latency of actually making the request.

    Depending on what's being generated for the AJAX requests (e.g. whether they are cacheable for multiple visitors), doing the bulk of the work on the server may result in a better experience for visitors.

    For me there’s no simple yes or no but a decision to be made based on the application / audience etc.

    Of course, even once everyone has JavaScript, we still have progressive enhancement of other technologies – CSS3, Canvas, etc.

    Andy

  5. Morgan Cheng
    May 5th, 2011 at 10:48 | #5

    End of Progressive Enhancement?

    This sounds like advocacy of hash-bang URLs, which have been criticized a lot since lifehacker.com adopted them.

    Personally, I believe that a "web application" can take the rich-client route, but a "web site" should still stick to progressive enhancement.

  6. May 5th, 2011 at 11:21 | #6

    Rendering to HTML on the server is almost never expensive, and if your HTML is really written “100% semantically” it’s not going to add a lot of data, either (especially when it’s gzipped).

    Transfer time is mostly an issue of latency, especially if you need to make multiple requests. If bandwidth really is your problem, though, and (big fat) HTML is just too much, you’re probably not going to be happy with (still rather portly) JSON either.

    The real solution to almost every one of these "problems" is support for page diffs in HTTP. Here's how things should work:

    1. You click on a link, sending a request to the webserver.

    2. The web-server sees your referer and figures out whether you support page-diffs (much like it figures out whether you support gzipped transfer.)

    3. If you don’t support it you get a regular page-load. If you do, the webserver sends you a page-diff.

    4. Your browser patches the page you’re looking at with the diff, changes your URL to the one you requested, updates your history etc etc.

    The whole point of all of these fashionable abuses is to make moving between similar pages easier, and people want that functionality so badly they’re willing to code it up in Javascript, tolerate ridiculous “History APIs” and half-break their websites to get it.

    If it’s worth doing, it’s goddamn worth doing it right.

  7. May 5th, 2011 at 11:32 | #7

    @Matthew Steel That’s a really interesting idea. The only problem I can see is that the majority of browsers support these “abuses” right now, whereas your page diff idea would require new browser versions (as far as I can tell) which is going to take time.

    Although, I suppose you could partially implement this scheme in JavaScript with a bit of work.
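
    A very rough approximation, purely client-side (no real diffing, and the region ids are made up), might look like this:

        function loadDiffed(url, regions) {
          $.get(url, function (html) {
            var incoming = $('<div>').html(html);        // parse the new page
            $.each(regions, function (i, id) {
              var fresh = incoming.find('#' + id).html();
              if (fresh !== $('#' + id).html()) {        // only patch what changed
                $('#' + id).html(fresh);
              }
            });
            history.pushState(null, '', url);            // keep the address bar honest
          });
        }

        // loadDiffed('/articles/42', ['content', 'sidebar']);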

  8. justin TNT
    May 5th, 2011 at 12:04 | #8

    I think you're over-reacting: it's only a small section of functionality that you're seeing reproduced in the browser and the server. I'm a bit confused that this alone leads you to question progressive enhancement. One way to make it neater would be to find a server-side implementation of your templating engine and do as Nox suggests. I think the tidiest solution, though, is when your server side is in JavaScript. With node.js on the server, I use the same function calls both on the server and in the browser to fuse JSON onto my boilerplate. Yeah, Backbone is cool. It's really well suited to web apps with lotsa state info that isn't clearly related to the DOM, but for building page-oriented websites I think Sammy is more suitable.
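
    For example, something along these lines runs in both environments (the module boilerplate and field names are just a sketch):

        // render.js – loaded by node.js via require() and by the browser via <script>
        (function (exports) {
          exports.article = function (data) {
            return '<article><h1>' + data.title + '</h1>' +
                   '<div>' + data.body + '</div></article>';
          };
        })(typeof exports !== 'undefined' ? exports : (window.templates = {}));

        // on the server: res.end(layout(require('./render').article(article)));
        // in the browser: $('#main').html(templates.article(json));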

  9. May 5th, 2011 at 15:00 | #9

    Accessibility is surely better provided when clients can use alternative, independently developed JS applications with your data feed. But even if we stop writing plain HTML, we'll need standard markup and interaction schemas, so JS app developers don't need to figure out how each website names article text in its JSON or which request to send for "next page" or "buy this" actions.

  10. Jane
    May 5th, 2011 at 17:01 | #10

    “If we’re trying to avoid page reloads, why not take that to its logical conclusion? All the server needs to do is spit out JSON – the client handles all of the rendering.”

    There is more than one logical conclusion!

    I would have said the logical conclusion is that you need to write the server in JS, and put JS rendering capabilities in the server, so you can still generate HTML on the server without breaking DRY. You can still do progressive enhancement, or you can generate new page loads — and you can pick one of these at runtime, and stay DRY!

    I’m sure there are cases where it is, but in none of my apps does the bottleneck happen to be transferring HTML or adding it to the DOM — it’s latency, either in my database or in the internet. So sending raw JSON isn’t terribly interesting to me.

    These days I’ve gotten so fed up with all the JS crap I see, I browse with JS turned off. There’s a few sites that don’t work right, but nothing that I need. There’s also many sites that work a lot better! I see JS as taking the same path that Java did once: originally hyped as a client-side language, but people discover that it’s actually pretty good on the server.

    Another possible conclusion to this: all browsers (including IE, since version 6!) have shipped with XSL engines for a while now, so if you want to send a single authoritative view of some data, and have the client render it into HTML, you can send raw XML, and it'll work just fine even for those of us with JS turned off. You probably have (or could easily install) an XSL engine on your server. You can stay DRY, do rendering on the client, and work regardless of client-side JS! The only downside is needing to use XSL. :-)
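
    For instance (a sketch, with made-up file names): serve article.xml with an <?xml-stylesheet type="text/xsl" href="article.xsl"?> processing instruction at the top and the browser renders it with no script at all, while pages that do have JS available can reuse the same stylesheet via XSLTProcessor:

        function fetchXml(url) {
          var xhr = new XMLHttpRequest();
          xhr.open('GET', url, false);        // synchronous only to keep the sketch short
          xhr.send(null);
          return xhr.responseXML;
        }

        // Reuse the same article.xsl from script (Firefox/Chrome/Safari/Opera;
        // IE needs its own MSXML path instead of XSLTProcessor).
        var processor = new XSLTProcessor();
        processor.importStylesheet(fetchXml('/article.xsl'));
        var fragment = processor.transformToFragment(fetchXml('/article.xml'), document);
        document.getElementById('main').appendChild(fragment);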

  11. May 5th, 2011 at 17:53 | #11

    Nice article!

    Two thoughts: 1) One downside of client-heavy rendering is that almost inevitably the JavaScript gets bloated, ugly and buggy. JavaScript is my passion, but I'll be the first to admit its primary purpose (in web pages) should be event handling. We have HTML/CSS for rendering. Using templates is a compromise – but even these can get nasty.

    2) With a well-factored markup layer you can stay DRY even with client-side dynamism. It's possible to have Ajax partial reloads access a subset of the same markup that loaded the full page on startup.

  12. May 5th, 2011 at 21:21 | #12

    I feel your sentiment, and I do sometimes think along the lines that you do, however, the biggest problem is that Javascript needs to “run”. And it stops “running” waaaaay too often – and no – I’m not talking about the [supposed] 2% who have it disabled completely. I’m talking about it stopping to work. It may be because of a runtime error. It may be because of an unhandled XHR timeout. It may be because it’s slow to load. It may be because you’re on mobile (I use Android and I suffer from that), you navigated off, and then you clicked back. It may sometimes even happen because you hit “Stop” in the browser. Fact is – JS stops working. And when it does – you need the user action to be completed, which in turn usually reloads the page and “restarts” JS.

    But the stuff I just said above is not about Progressive Enhancement, I guess. It is about QA – how well prepared are you against random failures? And how hard is you[r team] trying to ensure that things work even when they fail? And how much money do you need to invest into developing and testing this and how much money will you get out of the increased customer satisfaction?

  13. May 5th, 2011 at 21:40 | #13

    I'm absolutely with you on this one, and have been meaning to write a similar post for months. Progressive enhancement is not an axiomatic standard, and we should question when it is no longer the best way to think about certain aspects of the web.

    Shipping the entire UI to an application in an HTML document every time a user performs an action (i.e. clicks a link) is just architecturally ridiculous. AJAX was a toe in the water to address this. But running fully-fledged MVC apps on the client-side is now not only feasible, but, with mobile in particular, almost a default architecture for anything more than the simplest brochure-like site. Many mobile web apps don’t even have a server.

    The assumptions that you can make about the runtime environment in contemporary browsers are way beyond what existed when progressive enhancement was the rage. We know that all of today's smart phone platform browsers, for example, are capable of running rich, disconnectable, thick client apps. Why on earth use HTML – or even the DOM – to store the data these apps require on the highly unlikely basis that someone might be using Lynx or somesuch to visit your site?

    (Along similar lines there’s a whole argument about SEO and crawlability of progressively enhanced markup. This is a weak point, I think, and more a reflection of search engines’ inability to index app functionality as well as they can keywords in a document: hence human curated app stores in the meantime)

    Actually what we’re maybe observing here is a clash of civilisations: the documentistas vs the applicationistas; web-as-a-medium vs web-as-a-technology-stack; designers vs programmers. It’s no wonder that progressive enhancement is a powerful tenet for the former of each pair to rally around… but not necessarily the most architecturally important consideration of the latter.

    (Crikey… epic comment. Maybe there’s a blog post in there for me after all)

  14. Michael Warkentin
    May 5th, 2011 at 22:00 | #14

    Sproutcore (http://www.sproutcore.com/) takes this concept pretty far, moving pretty much everything to the client, and using the server as a data store.

  15. Martin
    May 6th, 2011 at 01:46 | #15

    Very good post!

    I am currently looking at building a web app using a server that just spits out JSON and using JavaScriptMVC http://www.javascriptmvc.com to construct the webpages.

  16. May 6th, 2011 at 07:45 | #16

    This is really a hard topic, because both approaches have their merits. For example, on several occasions, I have had to resort to using command line browsers during the installation stage of an OS to search for fixes for a problem I’ve encountered. If everything was using JavaScript, then I would be stuck until I could find another (working) computer to work off. Also, some people disable JavaScript by default for security purposes, which can be sensible albeit a bit paranoid. Still, JavaScript does come with a host of potential security problems, and many developers are not aware of them.

    On the other hand, neither of these situations is very common, and one could argue that it would be better to improve JS security and perhaps even implement JavaScript in the command-line browsers (or at least one of them) to enable developers to utilize the power and architectural benefits that AJAX (and related technologies) provide. In fact, this might be the way to go, but no matter the approach, some people will always feel cheated and left behind.

  17. May 6th, 2011 at 09:24 | #17

    While I agree JavaScript is a really big part of applications nowadays, I'm not very fond of this idea. I love to write a heavy MVC JavaScript layer, but I still always ensure the feature exists in the server-side application.

    Why? Because I control the server. I know that if a server-side feature works for me, it will work for everyone. Even with heavy JavaScript enhancements, if something goes wrong, the server-side feature will serve as a fallback (this is something to think about while designing the JS architecture, though).

    We simply can't rely on the fact that every single client will run our code as expected. Personally, I like the idea that even in that case, the application will degrade into something still usable.

  18. May 6th, 2011 at 09:48 | #18

    Progressive enhancement and no-reload pages can work together; however, it's very difficult to achieve with the most commonly used web frameworks today. But imagine a framework that uses JavaScript for both backend and frontend – with such an architecture it can be done neatly.

  19. Joeri Sebrechts
    May 6th, 2011 at 12:48 | #19

    You have to follow this line of thinking to its conclusion. If you bring the rendering to the client, you don’t have to think in terms of json arrays and templates, but you can think in terms of datasets, components and events. See for example the Ext JS component model, which abstracts away the DOM. This in turn opens the door for RAD environments similar to what we know on the desktop, where you configure data stores coupled to web services, drop a few components on a form, and link the stores to components. Developing in this way scales more easily. If you load the components via an on-demand infrastructure, you can scale to apps with a million lines of javascript code, all in a single page, without it becoming a maintenance nightmare.

    The server layer in turn becomes a collection of web services. This lets you use the exact same server-side code for both your own app and your 3rd-party apps (or B2B interfaces). Advantages: DRY, easier to secure (fewer entry points). And then there's the question of going offline. If all your rendering is client-side, going offline is much easier. This makes mobile development easier, because you don't have to rely on internet connectivity at all times.

    So, if you do decide that javascript is mandatory, and follow it to its logical conclusion, it opens possibilities that don’t exist in a progressive enhancement landscape. For my apps this made sense, so I’ve been going down this road for 3 years now. It’s been a very positive experience. YMMV.

  20. Alexander Romanenko
    May 8th, 2011 at 21:28 | #20

    I will pull a complete "dare" on this one and say that HTTP has reached its limits, or is about to. It was never intended for two-way communication, and these AJAX/Comet hacks are (to a certain extent) being overstretched into unmanageable and fragile solutions. We are forced to break up a well-working presentation vs. business logic architecture that works so well on a desktop or desktop intranet server infrastructure.

    I hope that the WebSocket security issues are going to be worked out soon enough, and that we'll have an opportunity to write truly MVC-compliant software where updates are driven through the Observer pattern between server and browser. There will be no network overhead from recreating the connection all the time. Maybe servers will not have to worry about generating JSON or HTML – maybe they'll just send an optimized binary blob that JS can understand natively (like AMF in ActionScript).

  21. May 8th, 2011 at 23:09 | #21

    I think devs are too hung up on making sure sites work for the minuscule percentage of people with JS turned off. Most web users can't even properly define what a browser is, let alone turn their JS off.

    My sites have been heavily JS-based for years and maybe five people have complained, tops. Making sure your site works in double the scenarios is too expensive for your business. Not enough bang for the buck.

  22. May 8th, 2011 at 23:32 | #22

    I absolutely agree with the problem of writing functionality multiple times, once on the server and again on the client. It's crazy to have to implement things like templating, validation and URL routing twice… My approach has been to write a framework that re-uses this code at both ends. If you're interested, take a look at http://kansojs.org

  23. May 9th, 2011 at 01:00 | #23

    @olivier el mekki I totally agree with you. We use Ext JS more as the view, but the server (CodeIgniter) is the controlling boss. I don't like it when the client becomes smart.

  24. May 9th, 2011 at 12:06 | #24

    Then of course you have the search engines to take into consideration… ;)

  25. Nicolas
    May 9th, 2011 at 12:44 | #25

    @Jane You should wonder why nobody is using JS on the server side (yes, I know Rhino, I know node.js too).

    The main thing is that the current JavaScript implementation is a horror and a mess. It misses a real standard high-level API (like what we have in Java or C#), and it lacks structure like packages, classes or interfaces. Dynamically typed languages are good for fast prototyping but are not the preferred path for enterprise programming, where type information is invaluable.

    Thus nobody wants to use JavaScript for real work. And most advanced web UIs use big and complex frameworks to bypass all these limitations. Basically, either you are Google or Microsoft and you can do Google Docs, Gmail, etc… or you don't do it. That's what JavaScript brings us: you need a billion-dollar company to make a decent client UI…

  26. May 9th, 2011 at 13:34 | #26

    KnockoutJS eases a lot of the pain of writing RIA apps in JavaScript; its data-binding framework is marvelous! Check it out at http://www.knockoutjs.com
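
    A tiny taste of the data-binding (the markup and field names are just an example):

        // <p data-bind="text: greeting"></p>
        // <input data-bind="value: name, valueUpdate: 'keyup'" />

        var viewModel = {
          name: ko.observable('world')
        };
        viewModel.greeting = ko.dependentObservable(function () {
          return 'Hello, ' + viewModel.name() + '!';   // re-evaluates whenever name changes
        });
        ko.applyBindings(viewModel);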

  27. May 9th, 2011 at 13:46 | #27

    Progressive enhancement should still have a role because of the separation of layers – HTML, CSS, JS: you can have several different JS layers for the UX on various devices (mobile/pads/desktop), and different CSS as well.

  28. Jens
    May 11th, 2011 at 15:49 | #28

    I don't think there's a need to abandon progressive enhancement. The theory and method just need to be updated – and they are, as we speak, using test- and capability-based progressive enhancement with a kind of "X-ray vision", as described in this book: http://www.filamentgroup.com/dwpe/.

    It is important to note that a user lacking full JS support probably doesn't need a lot of CSS and enhancement junk anyway; someone with such low support for web techniques needs something quicker and simpler. And that's where the two-layered "Basic" vs "Enhanced" progressive enhancement comes in (more on that in the DWPE link above).

    Also, don't forget that progressive enhancement is tied to accessibility, and you need to describe your application with WAI-ARIA (http://www.w3.org/WAI/intro/aria.php) if you are at all interested in accessibility. Some of the components of jQuery UI do that, for example.

    With a two-layered "Basic" and "Enhanced" versioned site, combined with media queries and WAI-ARIA, you get a highly usable and accessible site. And yes, we need to learn new things, but the variety of devices we're using to access the web is increasing, and we need to deal with that – by evolving the hard way.
