Here’s a quick tip if you happen to use remote desktop.
If you’ve got a monitor with a particularly high resolution (2560 x 1440, in the case of my monitor at work), a modern desktop or laptop should have little problem driving the pixels needed to make it work properly. However, if you remote desktop into servers, which often don’t have a monitor connected, you might run into problems maintaining colour depth at your monitor’s native resolution.
This is usually because servers set aside only a modest amount of system memory for video memory. Very few servers have discrete video cards in them, instead opting for low-cost, modest integrated solutions such as those available from Intel. These integrated GPUs can either achieve a particular resolution at maximum colour depth (say 1600 x 900 @ 32-bit colour) or sacrifice colour depth for a higher resolution (maybe 1920 x 1080 @ 16-bit colour).
Essentially, with a limited amount of memory set aside for video, you will need to trade colour depth against resolution. You can do some quick sums to work out how much memory you need by multiplying the horizontal resolution by the vertical resolution by the colour depth in bits, then converting to bytes (there’s a short script after the examples below if you’d rather not do it by hand).
For instance:
- 2560 x 1440 @ 32-bit colour:
  - 2560 x 1440 x 32 = 117,964,800 bits
  - 117,964,800 / 8 = 14,745,600 bytes
  - 14,745,600 / 1024 / 1024 = 14.0625 MB
- 2560 x 1440 @ 256 colours (8-bit colour):
  - 2560 x 1440 x 8 = 29,491,200 bits
  - 29,491,200 / 8 = 3,686,400 bytes
  - 3,686,400 / 1024 / 1024 = 3.515625 MB
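If you’d rather let the computer do the sums, here’s a minimal Python sketch that reproduces the figures above (the `framebuffer_bytes` helper is just my own name for the calculation, not anything built in):

```python
def framebuffer_bytes(width, height, bits_per_pixel):
    """Bytes needed for one frame at the given resolution and colour depth."""
    return width * height * bits_per_pixel // 8

# Reproduce the two worked examples above.
for bpp in (32, 8):
    size = framebuffer_bytes(2560, 1440, bpp)
    print(f"2560 x 1440 @ {bpp}-bit: {size:,} bytes = {size / 1024 / 1024} MB")

# 2560 x 1440 @ 32-bit: 14,745,600 bytes = 14.0625 MB
# 2560 x 1440 @ 8-bit: 3,686,400 bytes = 3.515625 MB
```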
In the early days, video memory was expensive to manufacture in vast quantities, but these days RAM may as well grow on trees: systems usually ship with gigabytes to share between system and video usage. With an integrated GPU you should at least have the option of increasing the amount of system RAM set aside as video memory, and as long as you can spare 32 MB to 64 MB (enough room for a few video frames to be rendered behind the currently displayed frame) you should be fine in most instances.
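As a quick sanity check on that figure (the 64 MB budget is just the ballpark suggested above, not a hard requirement), the same arithmetic shows how many full frames it holds at my monitor’s native resolution:

```python
frame_bytes = 2560 * 1440 * 32 // 8   # one frame at native resolution, 32-bit colour
budget_bytes = 64 * 1024 * 1024       # the 64 MB ballpark from above
print(budget_bytes // frame_bytes)    # 4 -- a few frames behind the one on screen
```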