The short answer on this one is that, no, for the sake of accuracy, you probably should not charge a full year of depreciation when you owned the asset for only part of the year.
A good question is, how accurate should you be?
To be right on the nose, of course you would count how many days you used the asset in the year of purchase and allocate depreciation expense accordingly. Thus, if my organization uses a calendar year for its fiscal year (i.e., January to December) and I bought a new computer on May 22, I owned the computer for 223 of the year's 365 days (counting from the day after purchase through December 31).
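The day-count approach can be sketched in a few lines of Python. The $1,200 cost, the five-year useful life, and the straight-line method are all made-up assumptions for illustration; only the May 22 purchase date comes from the example above.

```python
from datetime import date

# Hypothetical asset: $1,200 cost, 5-year useful life, straight-line.
cost = 1200.00
useful_life_years = 5
annual_depreciation = cost / useful_life_years  # 240.00 per full year

# Calendar fiscal year; purchased May 22 (the year itself is arbitrary,
# chosen as a non-leap year). The count runs from the day after
# purchase through year-end.
purchase_date = date(2023, 5, 22)
fiscal_year_end = date(2023, 12, 31)
days_owned = (fiscal_year_end - purchase_date).days  # 223
days_in_year = 365

first_year_expense = annual_depreciation * days_owned / days_in_year
print(f"Days owned: {days_owned}")
print(f"First-year depreciation: {first_year_expense:.2f}")
```

On these figures, the first-year expense works out to 240.00 × 223/365 ≈ 146.63.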
Now, remember, useful life is an estimate: you don’t know how long you will actually own your new purchase. So, it’s generally not considered necessary to be quite that particular about measuring depreciation expense.
One common method would be to go by the month of purchase. So, if I bought the computer during May, I would take 8 months of depreciation expense – that is, 8/12 of the full year’s depreciation cost.
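The month-of-purchase arithmetic is simple enough to show directly. Again, the $1,200 cost and five-year straight-line depreciation are made-up assumptions; the May purchase and the 8/12 fraction are from the example.

```python
# Month-of-purchase proration (hypothetical figures: $1,200 asset,
# 5-year straight-line depreciation, bought in May).
cost = 1200.00
useful_life_years = 5
annual_depreciation = cost / useful_life_years  # 240.00

purchase_month = 5                      # May
months_owned = 12 - purchase_month + 1  # May through December = 8
first_year_expense = annual_depreciation * months_owned / 12
print(f"{months_owned} months -> {first_year_expense:.2f}")
```

Here the purchase month itself counts as a full month owned, which is the usual convention under this method.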
Another common method is the “half-year rule.” Under this method, for every asset you buy, you take 6 months of depreciation in the year of purchase. The thinking is that if you do this consistently over a period of years, your total depreciation cost will more or less even itself out. You may under-depreciate some purchases, but you’ll over-depreciate others, and at the end of the day your depreciation expense for any given year will fall within a reasonable range – plus there’s less finicky bookkeeping to do.
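To see the “evens itself out” idea in numbers, here is a sketch comparing the half-year rule against month-of-purchase proration at the two extremes. The $240 annual depreciation figure is a made-up assumption.

```python
# Half-year rule: 6/12 of the annual cost in the year of purchase,
# regardless of when during the year the asset was bought.
annual_depreciation = 240.00  # made-up full-year figure
half_year_expense = annual_depreciation * 6 / 12   # 120.00

# For comparison, month-of-purchase proration at the extremes:
january_expense = annual_depreciation * 12 / 12    # 240.00
december_expense = annual_depreciation * 1 / 12    # 20.00

# The half-year rule under-depreciates the January purchase and
# over-depreciates the December one; across many purchases the
# differences tend to offset.
print(half_year_expense, january_expense, december_expense)
```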
Remember, the final fiscal year of depreciation must make up whatever portion of the useful life is left over. So, if I’m depreciating a computer over five years and I take eight months of depreciation expense in the year of purchase, I need a four-month stub at the tail end to complete the full five years. Thus, six fiscal years would be affected:
- Year 1 – 8 months’ depreciation expense
- Year 2 – a full year of depreciation expense
- Year 3 – ditto
- Year 4 – ditto
- Year 5 – ditto
- Year 6 – 4 months’ depreciation expense
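The six-year schedule above can be generated mechanically. As before, the $1,200 cost and straight-line method are made-up assumptions; the eight-month first year and five-year life are from the example.

```python
# Six-fiscal-year schedule for a 5-year asset bought in May, using
# month-of-purchase proration (made-up $1,200 cost, straight-line).
cost = 1200.00
useful_life_months = 5 * 12                       # 60 months
monthly_depreciation = cost / useful_life_months  # 20.00 per month

first_year_months = 8  # May through December
schedule = [first_year_months * monthly_depreciation]  # Year 1: 160.00
schedule += [12 * monthly_depreciation] * 4            # Years 2-5: 240.00 each
# Year 6 picks up whatever months remain of the 60:
schedule += [(useful_life_months - first_year_months - 48) * monthly_depreciation]

for year, expense in enumerate(schedule, start=1):
    print(f"Year {year}: {expense:.2f}")
print(f"Total: {sum(schedule):.2f}")  # should equal the original cost
```

Note that the schedule sums back to the full $1,200 cost, which is a handy sanity check for any proration method you choose.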