There is a computational photography technique in which a subject is placed within a sphere or dome of lights. The subject is then photographed with each light switched on one at a time, yielding as many photographs as there are lights. These photographs can then be composited together and modified so as to simulate any possible lighting condition. This is achieved by taking a "light probe" of an environment and mapping it onto the dome of lights, such that each light matches the color and intensity of the corresponding region of the probe. Or more directly: each photograph, corresponding to a particular light in the dome, is modified to appear as though that light had the color and intensity of its corresponding region of the probe. This modification and mapping is done for each photo in the set, they are all summed together into a single "exposure", and you end up with virtual lighting.
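The summing step above is just a weighted sum of the one-light-at-a-time photographs, with the weights sampled from the light probe. Here is a minimal sketch of that relighting math in plain Python; the tiny nested-list "images" and the function name `relight` are purely illustrative, not part of any real imaging library:

```python
def relight(basis_images, probe_weights):
    """Weighted sum of one-light-at-a-time photographs.

    basis_images: one image per dome light, each a list of rows of
    (r, g, b) floats, captured with only that light switched on.
    probe_weights: one (r, g, b) weight per image, sampled from the
    light probe at the corresponding light's position on the dome.
    Returns the relit "exposure": sum_i weight_i * image_i, per channel.
    """
    h, w = len(basis_images[0]), len(basis_images[0][0])
    out = [[(0.0, 0.0, 0.0) for _ in range(w)] for _ in range(h)]
    for img, (wr, wg, wb) in zip(basis_images, probe_weights):
        for y in range(h):
            for x in range(w):
                r, g, b = out[y][x]
                pr, pg, pb = img[y][x]
                # Tint and scale this light's photo, then accumulate.
                out[y][x] = (r + wr * pr, g + wg * pg, b + wb * pb)
    return out
```

Because the physics of light is linear (superposition), this simple sum really does reproduce what the scene would have looked like under the probe's lighting, up to clipping and exposure differences.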
Isn't it an interesting situation, then, if you take this virtual lighting concept and combine it with the fact that modern Macintoshes are equipped with a camera that is easily accessed from software, plus facilities for taking screenshots of the current scene on the computer's screen? These two could be combined into a sort of light probe, such that a photographed object could appear on screen and look as though it were lit from behind by the other objects on screen, and from the front by the room surrounding the computer.
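One way to picture this two-sided probe: each dome light faces either the screen or the room, so its weight is sampled from either the screenshot or the camera frame. The sketch below assumes the capture itself (screenshot, iSight frame) has already happened by some platform-specific means and the images arrive as plain nested lists of (r, g, b) pixels; the names `sample_pixel` and `probe_weights` are illustrative, not a real API:

```python
def sample_pixel(image, u, v):
    """Color at normalized coordinates (u, v) in [0, 1)."""
    h, w = len(image), len(image[0])
    y = min(int(v * h), h - 1)
    x = min(int(u * w), w - 1)
    return image[y][x]

def probe_weights(lights, screenshot, camera_frame):
    """Map each dome light to a color from the appropriate image.

    lights: list of (side, u, v) entries, where side is 'screen' for
    lights facing the display and 'room' for lights facing the user,
    and (u, v) is where that light projects into the chosen image.
    """
    weights = []
    for side, u, v in lights:
        image = screenshot if side == 'screen' else camera_frame
        weights.append(sample_pixel(image, u, v))
    return weights
```

The resulting list of weights is exactly what the relighting sum needs: one color per basis photograph. In practice each light would average a region of the image rather than read a single pixel, but the mapping is the same.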
This illusion could be perfected by measuring the brightness of a typical computer screen, combined with exposure information from the iSight. These values would be fairly standard across the Macintosh models equipped with built-in iSights.
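Those two measurements matter because the screenshot and the camera frame arrive in different brightness units, and the probe halves need a common scale before they can be summed. A rough sketch of that normalization, assuming a simple exposure model; the constant and the exposure parameters here are illustrative guesses, not measured values for any actual display or iSight:

```python
# Assumed peak brightness of a typical LCD, in cd/m^2 (illustrative).
SCREEN_NITS = 320.0

def normalize_camera(pixel, shutter_s, gain):
    """Camera pixel -> relative radiance, undoing the exposure.

    Assumes a linear response: recorded value ~ radiance * shutter * gain,
    so dividing by (shutter * gain) recovers relative radiance.
    """
    scale = 1.0 / (shutter_s * gain)
    return tuple(c * scale for c in pixel)

def normalize_screen(pixel):
    """Screenshot pixel in [0, 1] -> the same relative-radiance scale,
    using the assumed peak screen brightness."""
    return tuple(c * SCREEN_NITS for c in pixel)
```

Real camera responses are not perfectly linear, so a proper version would also undo the gamma curve, but the point stands: with known screen brightness and known camera exposure, the two halves of the probe can be put on one scale.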
Thus a dynamically lit widget could be produced, quite probably with the help of Quartz Composer's easy access to the iSight and its easy integration into widgets.