The primary requirement of any programmer is an unlimited amount of fast memory. However, memory speed is directly proportional to memory cost, while our goal is to achieve higher speed at lower cost; the memory hierarchy was introduced to reconcile these demands. At the top of this hierarchy sits the cache, a small but very fast and costly memory. Because the cache is small, it can hold only a portion of the data at any given time. When new data must be placed in a cache that is already full, existing data has to be evicted according to some replacement strategy. LRU (Least Recently Used) is one such strategy. In this paper we present a new extension to LRU that takes into account the size of the block to be replaced.
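For reference, the following is a minimal sketch of a standard LRU cache, not the size-aware extension proposed in this paper; the class name, capacity parameter, and example keys are illustrative assumptions.

```python
from collections import OrderedDict

class LRUCache:
    """Minimal LRU cache: evicts the least recently used entry when full."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.entries = OrderedDict()  # keys ordered from least to most recently used

    def get(self, key):
        if key not in self.entries:
            return None  # cache miss
        # A hit makes this key the most recently used entry.
        self.entries.move_to_end(key)
        return self.entries[key]

    def put(self, key, value):
        if key in self.entries:
            self.entries.move_to_end(key)
        self.entries[key] = value
        if len(self.entries) > self.capacity:
            # Cache is full: evict the least recently used entry.
            self.entries.popitem(last=False)

# Example: with capacity 2, inserting a third block evicts the least recently used one.
cache = LRUCache(2)
cache.put("A", 1)
cache.put("B", 2)
cache.get("A")       # "A" becomes most recently used
cache.put("C", 3)    # evicts "B", the least recently used block
print(list(cache.entries))  # ['A', 'C']
```

In plain LRU, eviction is driven purely by recency of access; the extension described in this paper additionally considers the size of the candidate block when choosing a victim.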